WO2019037606A1 - Surgical navigation system and method based on AR technology (基于AR技术的手术导航系统及方法) - Google Patents

Surgical navigation system and method based on AR technology

Info

Publication number: WO2019037606A1
Authority: WIPO (PCT)
Prior art keywords: positioning, image, dynamic, real, patient
Application number: PCT/CN2018/099849
Other languages: English (en); French (fr)
Inventor: 刘洋
Original assignee: 刘洋; 上海霖晏医疗科技有限公司
Application filed by 刘洋 and 上海霖晏医疗科技有限公司.

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 — Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 — Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 — Tracking techniques
    • A61B 2034/2065 — Tracking using image or pattern recognition
    • A61B 2034/2068 — Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis, using pointers, e.g. pointers having reference marks for determining coordinates of body points

Definitions

  • The invention relates to a surgical navigation system and method based on augmented reality (AR) technology.
  • The working principle of existing navigation technology is as follows: signal devices are installed near the surgical site of the patient and on the surgical instrument. The device usually uses infrared rays as the transmission source and a charge-coupled device (CCD) camera as the receiver; the emitted signal is used to track the position of the patient's bone and the position and motion track of the surgical instrument, and this information is presented to the doctor through a display. The surgical site of the patient is fluoroscoped during the operation, and the fluoroscopic image is combined with the obtained image of the patient's bone position and the position of the surgical instrument to obtain a navigation image for the surgeon to perform the operation.
  • Because the navigation system described above displays the positioning and guiding information on a system screen, the doctor performs the surgical operation by observing the image on the navigation screen, so the navigation information is separated from the surgical scene.
  • To observe the position of the surgical instrument relative to the patient's anatomy, the doctor has to switch back and forth between the patient's surgical site and the navigation screen. This distracts the doctor's attention from the surgical site, interferes with the surgical process, and increases navigation and positioning errors.
  • The doctor mainly uses the tomographic images displayed on the navigation screen, superimposed with surgical-instrument information, for intraoperative positioning. These images do not reflect the patient's current position well, so the doctor cannot intuitively understand the spatial positional relationship between the surgical instruments and the actual patient anatomy in the real surgical scene, and the functions of the surgical navigation system are not fully utilized.
  • The technical problem to be solved by the present invention is to overcome the defect in the prior art that navigation information in augmented reality (AR) glasses cannot be accurately superimposed on the actual surgical scene, and to provide a surgical navigation system and method based on AR technology.
  • A surgical navigation system based on AR technology comprises a scanning device, a processor, a dynamic positioning acquisition device and AR glasses;
  • The scanning device is configured to scan a patient before surgery to obtain scanned image data and send them to the processor;
  • The processor is configured to receive the scanned image data and generate a three-dimensional simulated image of the patient;
  • The dynamic positioning acquisition device is configured to acquire a positioning image and first real-time dynamic data of the patient in the dynamic positioning acquisition device coordinate system during surgery and send them to the processor, and is further configured to acquire second real-time dynamic data of the AR glasses in the dynamic positioning acquisition device coordinate system and send them to the processor;
  • The processor is further configured to statically register the three-dimensional simulated image in the dynamic positioning acquisition device coordinate system according to the positioning image, and to dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data;
  • The processor acquires coordinate data of the patient in the AR glasses coordinate system according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generates a three-dimensional navigation image that coincides in real time with the patient in the actual environment and sends it to the AR glasses;
  • Static registration acquires the coordinates of the actual patient in the dynamic positioning acquisition device coordinate system; dynamic registration acquires the dynamic coordinates of the patient in that coordinate system.
  • The AR glasses are used to receive and display the three-dimensional navigation image.
  • The dynamic positioning acquisition device includes an optical dynamic tracking device and a positioning imaging device, and the coordinates of the positioning imaging device in the dynamic positioning acquisition device coordinate system are preset coordinates;
  • The optical dynamic tracking device is configured to acquire the first real-time dynamic data and the second real-time dynamic data;
  • The positioning imaging device is configured to acquire the positioning image;
  • The processor is further configured to generate a coordinate parameter according to the preset coordinates and the positioning image, and to perform the static registration according to the coordinate parameter.
  • The dynamic positioning acquisition device further includes a plurality of marking components, each comprising a positioning component and a dynamic tracking component; the positioning component and the dynamic tracking component are attached to the body within a predetermined range from the surgical site of the patient, and the dynamic tracking component is further disposed on the AR glasses;
  • The scanning device is configured to acquire scanned image data including an image of the positioning component before surgery;
  • The positioning imaging device is configured to acquire a positioning image including a positioning component image and a dynamic tracking component image during surgery;
  • The optical dynamic tracking device is configured to identify the dynamic tracking component during surgery to acquire the first real-time dynamic data and the second real-time dynamic data.
  • The front side pattern and the reverse side pattern of the positioning component are the same;
  • The front side pattern is a coating of barium sulfate, and the scanning device is configured to acquire scanned image data including the coating mark before the operation; the scanning device comprises a computed tomography (CT) device, and barium sulfate can be developed by the X-rays of the CT equipment;
  • The reverse pattern is printed onto the patient's skin to form a watermark pattern when the positioning component is attached, and the positioning imaging apparatus is configured to acquire a positioning image including the watermark pattern during surgery. After the patient completes the CT scan, the positioning component is removed, and the watermark pattern of the reverse pattern can remain on the skin for a week or so.
  • The dynamic tracking component is further disposed on the surgical instrument;
  • The positioning imaging device is configured to acquire a positioning image including a positioning component image, a dynamic tracking component image and a surgical instrument image during surgery;
  • The optical dynamic tracking device is further configured to identify the dynamic tracking component during surgery to acquire third real-time dynamic data of the surgical instrument;
  • The processor is configured to dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data and the third real-time dynamic data.
  • A surgical navigation method based on AR technology is implemented using the surgical navigation system of any of the above preferred embodiments, and includes:
  • The scanning device scans the patient prior to surgery to acquire scanned image data and sends them to the processor; the processor receives the scanned image data and generates a three-dimensional simulated image of the patient;
  • The dynamic positioning acquisition device acquires a positioning image and first real-time dynamic data of the patient in the dynamic positioning acquisition device coordinate system during the operation and sends them to the processor, and also acquires second real-time dynamic data of the AR glasses in the dynamic positioning acquisition device coordinate system and sends them to the processor;
  • The processor statically registers the three-dimensional simulated image in the dynamic positioning acquisition device coordinate system according to the positioning image, and dynamically registers the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data;
  • The processor acquires coordinate data of the patient in the AR glasses coordinate system according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generates a three-dimensional navigation image that coincides in real time with the patient in the actual environment and sends it to the AR glasses;
  • The AR glasses receive and display the three-dimensional navigation image.
  • The dynamic positioning acquisition device includes an optical dynamic tracking device and a positioning imaging device, and the coordinates of the positioning imaging device in the dynamic positioning acquisition device coordinate system are preset coordinates;
  • The optical dynamic tracking device acquires the first real-time dynamic data and the second real-time dynamic data, and the positioning imaging device acquires the positioning image;
  • The processor generates a coordinate parameter according to the preset coordinates and the positioning image, and performs the static registration according to the coordinate parameter.
  • The dynamic positioning acquisition device further includes a plurality of marking components, each comprising a positioning component and a dynamic tracking component; the positioning component and the dynamic tracking component are attached to the body within a predetermined range from the surgical site of the patient, and the dynamic tracking component is further disposed on the AR glasses;
  • The scanning device acquires scanned image data including an image of the positioning component before surgery;
  • The positioning imaging device acquires a positioning image including a positioning component image and a dynamic tracking component image during surgery;
  • The optical dynamic tracking device identifies the dynamic tracking component during surgery to acquire the first real-time dynamic data and the second real-time dynamic data.
  • The front side pattern and the reverse side pattern of the positioning component are the same;
  • The front side pattern is a coating of barium sulfate, and the scanning device scans the scanned image data including the coating mark before the operation;
  • The reverse pattern is printed onto the patient's skin to form a watermark pattern when the positioning component is attached, and the positioning imaging apparatus acquires a positioning image including the watermark pattern during surgery.
  • The dynamic tracking component is further disposed on the surgical instrument;
  • The positioning imaging device acquires a positioning image including a positioning component image, a dynamic tracking component image and a surgical instrument image during surgery;
  • The optical dynamic tracking device identifies the dynamic tracking component during surgery to acquire third real-time dynamic data of the surgical instrument;
  • The processor dynamically registers the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data and the third real-time dynamic data.
  • The positive effect of the present invention is that the surgical navigation system acquires a three-dimensional simulated image of the patient through the scanning device, superimposes the real-time dynamic data of the patient and the surgical instrument onto the three-dimensional simulated image in real time by means of the dynamic positioning acquisition device, and then combines the real-time dynamic data of the AR glasses, so that the images presented in the AR glasses are accurately superimposed on the patient in the real environment and the spatial position of the surgical instruments relative to the patient's anatomy is presented more intuitively, better assisting the doctor in performing the operation.
  • FIG. 1 is a structural block diagram of a surgical navigation system based on AR technology according to Embodiment 1 of the present invention.
  • FIG. 2 is a structural block diagram of a surgical navigation system based on AR technology according to Embodiment 2 of the present invention.
  • FIG. 3 is a structural block diagram of a surgical navigation system based on AR technology according to Embodiment 3 of the present invention.
  • FIG. 4 is a structural block diagram of a surgical navigation system based on AR technology according to Embodiment 5 of the present invention.
  • FIG. 5 is a flowchart of a surgical navigation method based on AR technology according to Embodiment 6 of the present invention.
  • FIG. 6 is a flowchart of generating a three-dimensional navigation image in a surgical navigation method based on AR technology according to Embodiment 6 of the present invention.
  • FIG. 7 is a flowchart of a surgical navigation method based on AR technology according to Embodiment 7 of the present invention.
  • FIG. 8 is a flowchart of generating a three-dimensional navigation image in a surgical navigation method based on AR technology according to Embodiment 7 of the present invention.
  • FIG. 9 is a flowchart of a surgical navigation method based on AR technology according to Embodiment 8 of the present invention.
  • FIG. 10 is a flowchart of a surgical navigation method based on AR technology according to Embodiment 9 of the present invention.
  • FIG. 11 is a flowchart of a surgical navigation method based on AR technology according to Embodiment 10 of the present invention.
  • FIG. 12 is a flowchart of generating a three-dimensional navigation image in a surgical navigation method based on AR technology according to Embodiment 10 of the present invention.
  • A surgical navigation system based on AR technology comprises a scanning device 1, a processor 2, a dynamic positioning acquisition device 3 and AR glasses 4, wherein the scanning device 1, the dynamic positioning acquisition device 3 and the AR glasses 4 are each in communication with the processor 2.
  • The scanning device 1 is configured to scan the patient 5 before the operation to acquire the scanned image data and send them to the processor 2.
  • The scanning device 1 in this embodiment may be a CT device; the patient generates multiple tomographic images after being scanned by the CT device, and these tomographic images constitute the above-described scanned image data.
  • The processor 2 is configured to receive the scanned image data and generate a three-dimensional simulated image of the patient 5 according to the scanned image data. It should be noted that the generated three-dimensional simulated image is simulated at a one-to-one scale with the patient 5, and the bones, blood vessels, muscle tissue and the like of the patient 5 can be observed through the three-dimensional simulated image.
  • The dynamic positioning acquisition device 3 can acquire an initial static positioning image and first real-time dynamic data of the patient 5 in the dynamic positioning acquisition device coordinate system during the operation and send them to the processor 2. The processor 2 processes the positioning image with a machine vision algorithm, acquires the target image from the positioning image, and calculates the coordinate parameter of the target image in the dynamic positioning acquisition device coordinate system.
  • The dynamic positioning acquisition device 3 is further configured to acquire second real-time dynamic data of the AR glasses 4 in the dynamic positioning acquisition device coordinate system and send them to the processor 2.
  • Specifically, a marking component that can be tracked by the dynamic positioning acquisition device 3 can be disposed on the AR glasses, and the dynamic positioning acquisition device 3 obtains the second real-time dynamic data of the AR glasses 4 in the dynamic positioning acquisition device coordinate system by tracking the marking component.
  • The processor 2 is further configured to statically register the three-dimensional simulated image in the dynamic positioning acquisition device coordinate system according to the coordinate parameter of the patient 5, and to dynamically register the three-dimensional simulated image with the patient according to the first real-time dynamic data. It should be explained that the static registration places the three-dimensional simulated image stored in the processor 2 into the dynamic positioning acquisition device coordinate system, so that the coordinates of the three-dimensional simulated image are the same as those of the patient 5 in the actual environment; the dynamic registration is based on the acquired first real-time dynamic data, so that the three-dimensional simulated image and the patient 5 in the real environment move synchronously.
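One common way to implement static registration of this kind is a least-squares rigid fit between marker positions extracted from the CT image and the same markers' positions measured in the dynamic positioning acquisition device coordinate system. The sketch below uses the Kabsch/SVD algorithm; the function names and the use of NumPy are illustrative assumptions, not the patent's specified implementation.

```python
import numpy as np

def rigid_fit(model_pts, tracker_pts):
    """Least-squares rigid transform (Kabsch/SVD) mapping matched marker
    points from the 3-D simulated image's frame onto the same points in
    the dynamic positioning acquisition device (tracker) frame.

    model_pts, tracker_pts: (N, 3) arrays of matched marker positions.
    Returns (R, t) such that tracker_pts ~= model_pts @ R.T + t.
    """
    model_pts = np.asarray(model_pts, dtype=float)
    tracker_pts = np.asarray(tracker_pts, dtype=float)
    cm = model_pts.mean(axis=0)                    # centroids
    ct = tracker_pts.mean(axis=0)
    H = (model_pts - cm).T @ (tracker_pts - ct)    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct - R @ cm
    return R, t
```

Dynamic registration would then amount to re-running (or incrementally updating) this fit as the first real-time dynamic data arrive, so the three-dimensional simulated image follows the patient's motion.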
  • The present embodiment can align the virtual imaging in the AR glasses 4 with the patient 5 in the real environment in real time, as follows:
  • The processor 2 acquires the coordinate data of the patient 5 in the AR coordinate system according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generates a three-dimensional navigation image that coincides with the patient in real time and sends it to the AR glasses 4; the AR glasses 4 receive and display the three-dimensional navigation image.
  • Specifically, the AR glasses 4 are taken as the viewpoint, the AR coordinate system is constructed with the viewpoint as the coordinate origin, the coordinate data of the patient 5 in the AR coordinate system are calculated in real time, and the registered three-dimensional simulated image is moved to the position represented by the coordinate data. In the actual situation, the coordinates of the patient 5 in the dynamic positioning acquisition device coordinate system can be acquired, and the coordinates of the AR glasses 4 in that coordinate system can also be acquired. It is not difficult to understand that the processor 2 subtracts the coordinates of the AR glasses 4 in the dynamic positioning acquisition device coordinate system from the coordinates of the patient 5 in the same coordinate system to obtain the coordinates of the patient 5 in the AR coordinate system; the processor 2 then places the registered three-dimensional simulated image at the corresponding coordinates of the AR coordinate system to generate a three-dimensional navigation image, which coincides with the patient 5 in the actual environment in real time.
  • In summary, the processor 2 indirectly calculates the coordinates of the patient 5 in the AR coordinate system based on the coordinates of the AR glasses 4 and the patient 5 in the dynamic positioning acquisition device coordinate system, and places the registered three-dimensional simulated image in the AR coordinate system, ensuring that the virtual image seen by the doctor in the AR glasses 4 coincides with the patient in the real environment in real time. The image presented in the AR glasses 4 is thus accurately superimposed on the patient 5 in the real environment, so that the doctor can observe the patient's body and the virtual imaging simultaneously, which is conducive to the operation.
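The coordinate handover from the tracker frame to the AR coordinate system can be sketched as below. The plain subtraction is what the passage describes; the optional rotation argument is an added assumption covering the general case where the glasses' axes are not aligned with the tracker's.

```python
import numpy as np

def patient_in_ar_frame(patient_tracker_xyz, glasses_tracker_xyz,
                        glasses_rotation=None):
    """Coordinates of the patient in the AR glasses coordinate system.

    As described above, the processor subtracts the AR glasses' coordinates
    in the dynamic positioning acquisition device (tracker) frame from the
    patient's coordinates in the same frame.  glasses_rotation, if given,
    is a 3x3 matrix whose columns are the glasses' axes expressed in the
    tracker frame; its transpose maps the difference into glasses axes.
    """
    delta = (np.asarray(patient_tracker_xyz, dtype=float)
             - np.asarray(glasses_tracker_xyz, dtype=float))
    if glasses_rotation is None:
        return delta  # axes assumed aligned: pure translation, as in the text
    return np.asarray(glasses_rotation, dtype=float).T @ delta
```

In a full head-tracked system the rotation term is essential, since the glasses turn with the doctor's head; the passage's subtraction covers the translational part of that pose change.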
  • The dynamic positioning acquisition device 3 includes an optical dynamic tracking device 32 and a positioning imaging device 31, for example an NDI optical dynamic tracking device and a binocular camera. The optical dynamic tracking device 32 is configured to acquire the first real-time dynamic data and the second real-time dynamic data, and the positioning imaging device 31 is configured to acquire the positioning image.
  • The processor 2 is further configured to generate a coordinate parameter according to the preset coordinates and the positioning image, where the coordinate parameter represents the position of the patient in the dynamic positioning acquisition device coordinate system.
  • The processor 2 statically registers the three-dimensional simulated image in the dynamic positioning acquisition device coordinate system according to the coordinate parameter.
  • The coordinates of the positioning imaging device 31 in the dynamic positioning acquisition device coordinate system are preset coordinates; that is, the position of the positioning imaging device 31 in that coordinate system is known. This embodiment is specifically described taking an NDI optical dynamic tracking device and a binocular camera as an example.
  • The binocular camera is located within the effective tracking range of the NDI optical dynamic tracking device, and the NDI optical dynamic tracking device can acquire the coordinate position of the binocular camera.
  • In a specific implementation, a positioning mark that can be tracked by the NDI optical dynamic tracking device is installed on the binocular camera; the positioning mark can be an active mark or a passive mark, which is not specifically limited in this embodiment.
  • The NDI optical dynamic tracking device acquires the coordinates of the binocular camera in the dynamic positioning acquisition device coordinate system by tracking the infrared light emitted or reflected by the positioning mark, and transmits these coordinates to the processor 2.
  • The optical dynamic tracking device 32 also has a positioning and ranging function, but its precision is lower than that of a dedicated positioning imaging device. Therefore, to ensure the accuracy of the dynamic and static data acquired during the operation, two devices are selected in this embodiment: the optical dynamic tracking device obtains the first real-time dynamic data of the patient 5 and the second real-time dynamic data of the AR glasses 4, while the positioning imaging device acquires the positioning image of the patient 5. Since the coordinates of the positioning imaging device in the dynamic positioning acquisition device coordinate system are preset coordinates, the static registration can be completed by combining them with the positioning image.
  • The AR-based surgical navigation system of the present embodiment is further improved on the basis of Embodiment 2.
  • The dynamic positioning acquisition device 3 further includes a plurality of marking components, each including a positioning component 61 and a dynamic tracking component 62; the positioning component 61 and the dynamic tracking component 62 are attached to the body within a predetermined range from the surgical site of the patient, and the dynamic tracking component 62 is further disposed on the AR glasses 4;
  • The scanning device 1 is configured to acquire scanned image data including an image of the positioning component before surgery;
  • The positioning imaging device 31 is configured to acquire a positioning image including a positioning component image and a dynamic tracking component image during surgery;
  • The optical dynamic tracking device 32 is configured to identify the dynamic tracking component during the operation to obtain the first real-time dynamic data and the second real-time dynamic data.
  • The positioning component can be recognized by the positioning imaging device, and the dynamic tracking component can be recognized by the optical dynamic tracking device.
  • The AR-based surgical navigation system of the present embodiment is further improved on the basis of Embodiment 3.
  • The main improvement is that the front side pattern and the reverse side pattern of the positioning component are the same; the front side pattern is a coating of barium sulfate, and the scanning device obtains scanned image data containing the coating mark by scanning before surgery. The reverse pattern is printed onto the patient's skin to form a watermark pattern when the positioning component is attached, and the positioning imaging apparatus is used to acquire the positioning image containing the watermark pattern during the operation.
  • In practice, the patient scan and the operation are not performed simultaneously; there is often a gap between them of, for example, three days, five days or one week, during which daily activities performed by the patient, such as bathing and sleeping, may cause the positioning component to shift. The positioning image of the positioning component acquired by the positioning imaging device during the operation would then be inaccurate, affecting the positioning of the three-dimensional simulated image in the dynamic positioning acquisition device coordinate system.
  • For this reason, the reverse pattern is printed onto the patient's skin to form a watermark pattern when the positioning component is attached, and the watermark pattern can be kept on the patient's body surface for a period of time to avoid such offset.
  • Specifically, the positioning component is in the form of a film whose front and back surfaces are provided with the same pattern; the pattern may be a grid, concentric circles, equidistant points, etc., and the positioning component is provided with one or more special points that serve as positioning references during the operation. The front side of the positioning component is coated with barium sulfate, which can be developed by the X-rays of the CT device and thus recognized by it.
  • The AR-based surgical navigation system of the present embodiment is further improved on the basis of Embodiment 3. Specifically, as shown in FIG. 4, the dynamic tracking component is further disposed on the surgical instrument 7;
  • The positioning imaging device 31 is configured to acquire a positioning image including a positioning component image, a dynamic tracking component image and a surgical instrument image during surgery;
  • The optical dynamic tracking device 32 is further configured to identify the dynamic tracking component 62 during surgery to acquire third real-time dynamic data of the surgical instrument 7;
  • The processor 2 is configured to dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data and the third real-time dynamic data.
  • The surgical navigation system also includes an alarm device communicatively coupled to the processor.
  • The three-dimensional navigation image is marked with reminder parts; a reminder part may be an important blood vessel or nerve, etc.
  • The processor calculates the minimum distance between the surgical instrument and the reminder part in real time. When this minimum distance is less than or equal to the preset value, the processor controls the alarm device to issue a warning signal, reminding the doctor to operate carefully because the current operation risk is high; the safety factor of the operation is improved through the above alarm device.
  • In this embodiment, the dynamic data of the surgical instrument are acquired in real time during the operation and incorporated into the three-dimensional simulated image, so that the spatial position of the surgical instrument relative to the patient's anatomy is presented to the doctor more intuitively, better assisting the doctor in the operation.
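The real-time distance check driving the alarm device can be sketched as follows. The function names and the representation of a reminder part as a set of sample points are illustrative assumptions; the embodiment only specifies a signal once the minimum distance falls to or below a preset value.

```python
import numpy as np

def min_distance(instrument_tip, reminder_pts):
    """Minimum Euclidean distance (e.g. in mm) from the tracked instrument
    tip to any point marked as a reminder part (vessel, nerve, ...)."""
    diffs = np.asarray(reminder_pts, dtype=float) - np.asarray(instrument_tip, dtype=float)
    return float(np.linalg.norm(diffs, axis=1).min())

def alarm_signal(instrument_tip, reminder_pts, preset_mm):
    """Return 'remind' once the minimum distance is <= the preset value,
    else None; the processor would forward this to the alarm device."""
    if min_distance(instrument_tip, reminder_pts) <= preset_mm:
        return "remind"
    return None
```

In an actual system this check would run on every tracker update, using the third real-time dynamic data for the instrument tip and the registered model for the reminder parts.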
  • A surgical navigation method based on AR technology is implemented using the surgical navigation system of Embodiment 1 and includes the following steps:
  • Step 101: The scanning device scans the patient before the operation to acquire the scanned image data and sends them to the processor.
  • Step 102: The processor receives the scanned image data and generates a three-dimensional simulated image of the patient.
  • Step 103: The dynamic positioning acquisition device acquires the positioning image and the first real-time dynamic data of the patient in the dynamic positioning acquisition device coordinate system during the operation and sends them to the processor, and acquires the second real-time dynamic data of the AR glasses in the same coordinate system and sends them to the processor.
  • Step 104: The processor generates a three-dimensional navigation image according to the received three-dimensional simulated image, the positioning image, the first real-time dynamic data and the second real-time dynamic data, and sends the three-dimensional navigation image to the AR glasses.
  • Step 1041: Statically register the three-dimensional simulated image in the dynamic positioning acquisition device coordinate system according to the positioning image.
  • Step 1042: Dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data.
  • Step 1043: Acquire the coordinate data of the patient in the AR glasses coordinate system according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generate a three-dimensional navigation image that coincides in real time with the patient in the actual environment.
  • Step 105: The AR glasses receive and display the three-dimensional navigation image.
  • the three-dimensional simulated image of the patient is acquired by the scanning device, and the real-time dynamic data of the patient is superimposed into the three-dimensional simulated image in real time by the dynamic positioning acquiring device, and then combined with the real-time dynamic data of the AR glasses, so that the AR glasses are presented in the AR glasses.
  • the image is accurately superimposed on the patient in the real environment, and it is better to assist the doctor in the operation.
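Steps 101-105 can be summarized as a per-frame loop. All object names and method signatures below (`scanner.scan()`, `processor.static_register()`, and so on) are hypothetical stand-ins for the devices the text describes, not an API from the source; static registration is performed once, on the first positioning image, while dynamic registration and re-expression in the AR-glasses frame repeat every frame.

```python
def run_navigation(scanner, tracker, glasses, processor):
    """Sketch of the navigation method: preoperative scan and modeling,
    then a tracking/registration/render loop while the glasses are active."""
    scan_data = scanner.scan()                     # Step 101 (preoperative)
    model = processor.build_3d_model(scan_data)    # Step 102
    registered = None
    while glasses.active:
        # Step 103: positioning image plus patient/glasses dynamic data
        loc_img, patient_pose, glasses_pose = tracker.acquire()
        if registered is None:                     # Step 1041: static registration
            registered = processor.static_register(model, loc_img)
        # Step 1042: follow the patient's motion
        moving = processor.dynamic_register(registered, patient_pose)
        # Step 1043: re-express in the AR-glasses coordinate system
        nav = processor.to_ar_frame(moving, patient_pose, glasses_pose)
        glasses.render(nav)                        # Step 105
```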
  • the dynamic positioning acquisition device includes an optical dynamic tracking device and a positioning imaging device, and the coordinates of the positioning imaging device in the coordinate system of the dynamic positioning acquisition device are a preset coordinate;
  • Step 103 is replaced with step 103a:
  • Step 103a: during the operation, the positioning imaging device acquires the positioning image of the patient in the coordinate system of the dynamic positioning acquisition device and sends it to the processor; the optical dynamic tracking device acquires the intraoperative first real-time dynamic data of the patient in that coordinate system and the second real-time dynamic data of the AR glasses and sends them to the processor.
  • Step 1041 is replaced with step 1041a:
  • Step 1041a: the processor generates a coordinate parameter from the preset coordinate and the positioning image and performs the static registration according to the coordinate parameter.
  • the optical dynamic tracking device and the positioning imaging device are used in combination: the optical dynamic tracking device acquires the first real-time dynamic data of the patient and the second real-time dynamic data of the AR glasses, while the positioning imaging device acquires the positioning image; since the coordinates of the positioning imaging device in the coordinate system of the dynamic positioning acquisition device are a preset coordinate, static registration can be completed by combining this with the positioning image.
  • the dynamic positioning acquisition device further includes several marking components, each comprising a positioning component and a dynamic tracking component;
  • the positioning component and the dynamic tracking component are attached to the body within a preset range from the patient's surgical site, and a dynamic tracking component is also provided on the AR glasses;
  • Step 101 is replaced with step 101a:
  • Step 101a: the scanning device scans the patient before the operation to acquire scan image data containing the image of the positioning component and sends it to the processor;
  • Step 103a is replaced with step 103b:
  • Step 103b: during the operation, the positioning imaging device acquires the positioning image, containing the positioning component image and the dynamic tracking component image, in the coordinate system of the dynamic positioning acquisition device and sends it to the processor; the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire the intraoperative first real-time dynamic data of the patient in that coordinate system and the second real-time dynamic data of the AR glasses and sends them to the processor.
  • the positioning component can be recognized by the positioning imaging device, and the dynamic tracking component can be recognized by the optical dynamic tracking device.
  • the AR-based surgical navigation method of this embodiment is a further improvement on Embodiment 8, wherein the front surface pattern and the reverse surface pattern of the positioning component are the same; the front pattern is a coating of barium sulfate, and the reverse pattern is printed onto the patient's skin when the positioning component is attached, forming a watermark pattern;
  • Step 101a is replaced with step 101b:
  • Step 101b: the scanning device scans the patient before the operation to acquire scan image data containing the barium-sulfate coating marks and sends it to the processor;
  • Step 103b is replaced with step 103c:
  • Step 103c: during the operation, the positioning imaging device acquires the positioning image, containing the watermark pattern and the dynamic tracking component image, in the coordinate system of the dynamic positioning acquisition device and sends it to the processor; the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire the intraoperative first real-time dynamic data of the patient in that coordinate system and the second real-time dynamic data of the AR glasses and sends them to the processor.
  • the positioning component is in the form of a film whose front and back surfaces carry the same pattern; the pattern may be a grid, concentric circles, equidistant points, etc., and the positioning component is provided with one or more special points that serve as positioning references during the operation.
  • the front side of the positioning component is coated with barium sulfate.
  • barium sulfate is radiopaque, so it is imaged by the X-rays of the CT device and can therefore be recognized by it.
  • because surgery usually cannot take place immediately after the CT scan, the positioning component would otherwise need to remain attached to the patient for a long time; in that case it is easily displaced and inconvenient. Printing the reverse pattern leaves a watermark pattern on the patient's skin, which can be retained for about a week.
  • the AR-based surgical navigation method of this embodiment is a further improvement on Embodiment 8, wherein a dynamic tracking component is also provided on the surgical instrument;
  • Step 103b is replaced with step 103d:
  • Step 103d: during the operation, the positioning imaging device acquires the positioning image, containing the positioning component image, the dynamic tracking component image, and the surgical instrument image, in the coordinate system of the dynamic positioning acquisition device and sends it to the processor; the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire the intraoperative first real-time dynamic data of the patient in that coordinate system, the second real-time dynamic data of the AR glasses, and the third real-time dynamic data of the surgical instrument, and sends them to the processor.
  • Step 1042 is replaced with step 1042b:
  • Step 1042b: dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data and the third real-time dynamic data.
  • the dynamic data of the surgical instrument is acquired in real time during the operation and incorporated into the three-dimensional simulated image, so that the spatial position of the surgical instrument relative to the patient's anatomy is presented to the doctor more intuitively, better assisting the doctor in the operation.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Robotics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Analysis (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

A surgical navigation system and method based on AR technology. The surgical navigation system comprises a scanning device (1), a processor (2), a dynamic positioning acquisition device (3), and AR glasses (4). The scanning device (1) scans the patient (5) before the operation to acquire scan image data and sends it to the processor (2); the processor (2) receives the scan image data and generates a three-dimensional simulated image of the patient (5); during the operation, the dynamic positioning acquisition device (3) acquires a positioning image and first real-time dynamic data of the patient (5) in the coordinate system of the dynamic positioning acquisition device (3), and also acquires second real-time dynamic data of the AR glasses (4) in that coordinate system, sending all of these to the processor (2); the processor (2) generates a three-dimensional navigation image from the three-dimensional simulated image, the positioning image, the first real-time dynamic data, and the second real-time dynamic data, and sends it to the AR glasses (4); the AR glasses (4) receive and display the three-dimensional navigation image. The system can dynamically present the surgical procedure in the AR glasses (4) and precisely superimpose it on the patient (5).

Description

Surgical navigation system and method based on AR technology
This application claims priority to Chinese patent application No. 201710719544.7, entitled "Surgical navigation system and method based on AR technology", filed with the Chinese Patent Office on August 21, 2017, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to a surgical navigation system and method based on AR technology.
Background Art
In the prior art, computer-assisted navigation is widely used so that the doctor can clearly understand the position of surgical instruments relative to the patient's anatomy. It works as follows: devices capable of emitting signals are mounted near the patient's surgical site and on the surgical instruments, typically using infrared light as the emission source and a charge-coupled device (CCD) camera as the receiver. The emitted signals are used to track the position of the patient's bones and the position and motion trajectory of the surgical instruments, and this information is shown to the doctor on a display. During the operation, X-ray fluoroscopy of the surgical site is performed, and the fluoroscopic images are fused with the acquired images of the bone and instrument positions to produce the navigation images the doctor uses for the operation.
Because such navigation systems display the positioning and guidance information on a system screen, the doctor performs the operation by watching the images on the navigation screen, which separates the navigation information from the surgical scene. Intraoperatively, to observe the position of the instruments relative to the patient's anatomy, the doctor has to switch his or her field of view back and forth between the surgical site and the navigation screen. This makes it hard to concentrate on the surgical site, interferes with the progress of the operation, and increases navigation positioning error.
Moreover, the navigation information is not easy to interpret. The doctor mainly relies on tomographic images overlaid with instrument information shown on the navigation screen for intraoperative localization, but these images do not reflect the patient's current posture well. The doctor therefore cannot intuitively understand the spatial relationship between the instruments and the actual patient anatomy in the real surgical scene, so the surgical navigation system cannot realize its full potential.
Summary of the Invention
The technical problem to be solved by the present invention is to overcome the defect in the prior art that the navigation information in augmented reality (AR) glasses cannot be precisely and dynamically superimposed on the actual surgical scene, and to provide a surgical navigation system and method based on AR technology.
The present invention solves the above technical problem through the following technical solutions:
A surgical navigation system based on AR technology, comprising a scanning device, a processor, a dynamic positioning acquisition device, and AR glasses;
the scanning device is configured to scan the patient before the operation to acquire scan image data and send it to the processor;
the processor is configured to receive the scan image data and generate a three-dimensional simulated image of the patient;
the dynamic positioning acquisition device is configured to acquire, during the operation, a positioning image and first real-time dynamic data of the patient in the coordinate system of the dynamic positioning acquisition device and send them to the processor, and is further configured to acquire second real-time dynamic data of the AR glasses in that coordinate system and send it to the processor;
the processor is further configured to statically register the three-dimensional simulated image in the coordinate system of the dynamic positioning acquisition device according to the positioning image, and to dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data;
the processor obtains coordinate data of the patient in the AR glasses coordinate system from the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generates a three-dimensional navigation image that coincides in real time with the patient in the actual environment and sends it to the AR glasses. Static registration means obtaining the coordinates of the actual patient in the coordinate system of the dynamic positioning acquisition device; dynamic registration means obtaining the patient's dynamic coordinates in that coordinate system.
The AR glasses are configured to receive and display the three-dimensional navigation image.
Preferably, the dynamic positioning acquisition device includes an optical dynamic tracking device and a positioning imaging device, and the coordinates of the positioning imaging device in the coordinate system of the dynamic positioning acquisition device are a preset coordinate;
the optical dynamic tracking device is configured to acquire the first real-time dynamic data and the second real-time dynamic data;
the positioning imaging device is configured to acquire the positioning image;
the processor is further configured to generate a coordinate parameter from the preset coordinate and the positioning image and to perform the static registration according to the coordinate parameter.
Preferably, the dynamic positioning acquisition device further includes several marking components, each comprising a positioning component and a dynamic tracking component; the positioning component and the dynamic tracking component are attached to the body within a preset range from the patient's surgical site, and a dynamic tracking component is also provided on the AR glasses;
the scanning device is configured to acquire, before the operation, scan image data containing an image of the positioning component;
the positioning imaging device is configured to acquire, during the operation, a positioning image containing the positioning component image and the dynamic tracking component image;
the optical dynamic tracking device is configured to recognize the dynamic tracking component during the operation to acquire the first real-time dynamic data and the second real-time dynamic data.
Preferably, the front surface pattern and the reverse surface pattern of the positioning component are the same;
the front pattern is a coating of barium sulfate; the scanning device is configured to scan before the operation to acquire scan image data containing the coating marks; the scanning device includes a computed tomography (CT) device, and barium sulfate can be imaged by the X-rays of the CT device;
the reverse pattern is printed on the patient's skin when the positioning component is attached, forming a watermark pattern; the positioning imaging device is configured to acquire, during the operation, a positioning image containing the watermark pattern. After the patient has completed the CT scan, the positioning marker is removed, and the watermark pattern from the reverse side can remain on the skin for about a week.
Preferably, a dynamic tracking component is also provided on the surgical instrument;
the positioning imaging device is configured to acquire, during the operation, a positioning image containing the positioning component image, the dynamic tracking component image, and the surgical instrument image;
the optical dynamic tracking device is further configured to recognize the dynamic tracking component during the operation to acquire third real-time dynamic data of the surgical instrument;
the processor is configured to dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data and the third real-time dynamic data.
A surgical navigation method based on AR technology, implemented using a surgical navigation system combining any of the above preferred features, comprising:
the scanning device scans the patient before the operation to acquire scan image data and sends it to the processor; the processor receives the scan image data and generates a three-dimensional simulated image of the patient;
during the operation, the dynamic positioning acquisition device acquires the positioning image and the first real-time dynamic data of the patient in the coordinate system of the dynamic positioning acquisition device and sends them to the processor, and also acquires the second real-time dynamic data of the AR glasses in that coordinate system and sends it to the processor;
the processor statically registers the three-dimensional simulated image in the coordinate system of the dynamic positioning acquisition device according to the positioning image, and dynamically registers the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data;
the processor obtains the coordinate data of the patient in the AR glasses coordinate system from the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generates a three-dimensional navigation image that coincides in real time with the patient in the actual environment and sends it to the AR glasses;
the AR glasses receive and display the three-dimensional navigation image.
Preferably, the dynamic positioning acquisition device includes an optical dynamic tracking device and a positioning imaging device, and the coordinates of the positioning imaging device in the coordinate system of the dynamic positioning acquisition device are a preset coordinate;
the optical dynamic tracking device acquires the first real-time dynamic data and the second real-time dynamic data, and the positioning imaging device acquires the positioning image;
the processor generates a coordinate parameter from the preset coordinate and the positioning image and performs the static registration according to the coordinate parameter.
Preferably, the dynamic positioning acquisition device further includes several marking components, each comprising a positioning component and a dynamic tracking component; the positioning component and the dynamic tracking component are attached to the body within a preset range from the patient's surgical site, and a dynamic tracking component is also provided on the AR glasses;
the scanning device acquires, before the operation, scan image data containing an image of the positioning component;
the positioning imaging device acquires, during the operation, a positioning image containing the positioning component image and the dynamic tracking component image;
the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire the first real-time dynamic data and the second real-time dynamic data.
Preferably, the front surface pattern and the reverse surface pattern of the positioning component are the same;
the front pattern is a coating of barium sulfate, and the scanning device scans before the operation to acquire scan image data containing the coating marks;
the reverse pattern is printed on the patient's skin when the positioning component is attached, forming a watermark pattern, and the positioning imaging device acquires, during the operation, a positioning image containing the watermark pattern.
Preferably, a dynamic tracking component is also provided on the surgical instrument;
the positioning imaging device acquires, during the operation, a positioning image containing the positioning component image, the dynamic tracking component image, and the surgical instrument image;
the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire third real-time dynamic data of the surgical instrument;
the processor dynamically registers the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data and the third real-time dynamic data.
The positive and progressive effect of the present invention is as follows: the surgical navigation system acquires a three-dimensional simulated image of the patient with the scanning device, relies on the dynamic positioning acquisition device to superimpose the real-time dynamic data of the patient and the surgical instruments into the three-dimensional simulated image in real time, and then combines this with the real-time dynamic data of the AR glasses, so that the image presented in the AR glasses is accurately superimposed on the patient in the real environment while the spatial position of the surgical instruments relative to the patient's anatomy is presented more intuitively, better assisting the doctor in the operation.
Brief Description of the Drawings
Fig. 1 is a structural block diagram of the surgical navigation system based on AR technology according to Embodiment 1 of the present invention.
Fig. 2 is a structural block diagram of the surgical navigation system based on AR technology according to Embodiment 2 of the present invention.
Fig. 3 is a structural block diagram of the surgical navigation system based on AR technology according to Embodiment 3 of the present invention.
Fig. 4 is a structural block diagram of the surgical navigation system based on AR technology according to Embodiment 5 of the present invention.
Fig. 5 is a flowchart of the surgical navigation method based on AR technology according to Embodiment 6 of the present invention.
Fig. 6 is a flowchart of generating the three-dimensional navigation image in the surgical navigation method based on AR technology according to Embodiment 6 of the present invention.
Fig. 7 is a flowchart of the surgical navigation method based on AR technology according to Embodiment 7 of the present invention.
Fig. 8 is a flowchart of generating the three-dimensional navigation image in the surgical navigation method based on AR technology according to Embodiment 7 of the present invention.
Fig. 9 is a flowchart of the surgical navigation method based on AR technology according to Embodiment 8 of the present invention.
Fig. 10 is a flowchart of the surgical navigation method based on AR technology according to Embodiment 9 of the present invention.
Fig. 11 is a flowchart of the surgical navigation method based on AR technology according to Embodiment 10 of the present invention.
Fig. 12 is a flowchart of generating the three-dimensional navigation image in the surgical navigation method based on AR technology according to Embodiment 10 of the present invention.
Detailed Description of the Embodiments
The present invention is further illustrated below by way of embodiments, but is not thereby limited to the scope of the described embodiments. Experimental methods for which specific conditions are not indicated in the following embodiments are chosen according to conventional methods and conditions, or according to the product instructions.
Embodiment 1
A surgical navigation system based on AR technology, as shown in Fig. 1, includes a scanning device 1, a processor 2, a dynamic positioning acquisition device 3, and AR glasses 4, where the scanning device 1, the dynamic positioning acquisition device 3, and the AR glasses 4 are each communicatively connected to the processor 2.
The scanning device 1 scans the patient 5 before the operation to acquire scan image data and sends it to the processor 2. In this embodiment the scanning device 1 may be a CT device; scanning the patient with the CT device produces multiple tomographic images, which constitute the scan image data mentioned above.
The processor 2 receives the scan image data and generates a three-dimensional simulated image of the patient 5 from it. It should be noted that the generated three-dimensional simulated image is a one-to-one scale simulation of the patient 5, and the patient's bones, blood vessels, muscle tissue, and so on can be observed through it.
In addition, when the patient 5 is within the effective measurement range of the dynamic positioning acquisition device 3, the device can acquire, during the operation, an initial static positioning image and first real-time dynamic data of the patient 5 in the coordinate system of the dynamic positioning acquisition device and send this information to the processor 2. The processor 2 processes the positioning image with machine-vision algorithms, extracts the target image from the positioning image, and computes the coordinate parameters of the target image in the coordinate system of the dynamic positioning acquisition device.
At the same time, the dynamic positioning acquisition device 3 also acquires the second real-time dynamic data of the AR glasses 4 in its coordinate system and sends it to the processor 2. Specifically, a marking component trackable by the dynamic positioning acquisition device 3 can be mounted on the AR glasses; by tracking this marking component, the device obtains the second real-time dynamic data of the AR glasses 4 in its coordinate system.
The processor 2 also statically registers the three-dimensional simulated image in the coordinate system of the dynamic positioning acquisition device according to the coordinate parameters of the patient 5, and dynamically registers the statically registered three-dimensional simulated image with the patient 5 according to the first real-time dynamic data. To explain: static registration means placing the three-dimensional simulated image stored in the processor 2 into the coordinate system of the dynamic positioning acquisition device so that the coordinates of the simulated image coincide with those of the patient 5 in the actual environment; dynamic registration means making the three-dimensional simulated image move in synchrony with the patient 5 in the real environment according to the acquired first real-time dynamic data.
Considering that the doctor wearing the AR glasses 4 inevitably has to move during the operation, and to make intraoperative observation convenient, this embodiment keeps the virtual image in the AR glasses 4 coincident in real time with the patient 5 in the real environment, as follows:
The processor 2 obtains the coordinate data of the patient 5 in the AR coordinate system from the second and first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generates a three-dimensional navigation image that coincides in real time with the patient in the actual environment and sends it to the AR glasses 4, which receive and display it.
In other words, the AR glasses 4 are taken as the viewpoint, an AR coordinate system is constructed with that viewpoint as the origin, the coordinate data of the patient 5 in the AR coordinate system is computed in real time, and the registered three-dimensional simulated image is moved to the position represented by those coordinates. In practice, the coordinates of the patient 5 in the coordinate system of the dynamic positioning acquisition device can be obtained, and so can the coordinates of the AR glasses 4 in that coordinate system. It is easy to see that by having the processor 2 subtract the coordinates of the AR glasses 4 in the coordinate system of the dynamic positioning acquisition device from the coordinates of the patient 5 in that same coordinate system, the coordinates of the patient 5 in the AR coordinate system are obtained. The processor 2 then places the registered three-dimensional simulated image at the corresponding coordinates in the AR coordinate system to generate the three-dimensional navigation image, which coincides in real time with the patient 5 in the actual environment.
During the operation, even if the doctor wearing the AR glasses 4 moves, the processor 2 continuously computes, from the coordinates of the AR glasses 4 and of the patient 5 in the coordinate system of the dynamic positioning acquisition device, the coordinates of the patient 5 in the AR coordinate system, and places the registered three-dimensional simulated image into the AR coordinate system. This ensures that the virtual image the doctor sees in the AR glasses 4 coincides in real time with the patient in the real environment, so that the image presented in the AR glasses 4 is accurately superimposed on the patient 5, making it easier for the doctor to observe both the physical patient and the virtual image and facilitating the operation.
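The coordinate reasoning above (patient coordinates in the AR frame obtained from tracker-frame coordinates of the patient and the glasses) can be sketched with homogeneous transforms. This is a minimal illustration under assumed conventions: poses are 4x4 matrices expressing each object in the tracker ("dynamic positioning acquisition device") frame, and the text's coordinate subtraction is the translation-only special case of the general relation `T_patient_ar = inv(T_glasses_tracker) @ T_patient_tracker`.

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def patient_in_ar(T_patient_tracker, T_glasses_tracker):
    """Re-express the patient pose in the AR-glasses frame.

    With identity rotations this reduces to subtracting the glasses'
    position from the patient's position, as described in the text."""
    return np.linalg.inv(T_glasses_tracker) @ T_patient_tracker
```

Recomputing this every tracking frame is what keeps the virtual model coincident with the patient even as the doctor's head moves.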
Embodiment 2
The AR-glasses-based surgical navigation system of this embodiment is a further improvement on Embodiment 1. As shown in Fig. 2, the dynamic positioning acquisition device 3 includes an optical dynamic tracking device 32 and a positioning imaging device 31, for example an NDI optical dynamic tracking device and a binocular (stereo) camera. The optical dynamic tracking device 32 acquires the first and second real-time dynamic data; the positioning imaging device 31 acquires the positioning image. The processor 2 also generates a coordinate parameter from the preset coordinate and the positioning image; this parameter represents the patient's position in the coordinate system of the dynamic positioning acquisition device. The processor 2 statically registers the three-dimensional simulated image in that coordinate system according to the coordinate parameter.
It should be noted that the coordinates of the positioning imaging device 31 in the coordinate system of the dynamic positioning acquisition device are a preset coordinate, i.e. the position of the positioning imaging device 31 in that coordinate system is known. This embodiment takes the NDI optical dynamic tracking device and a binocular camera as a concrete example.
The binocular camera is located within the effective tracking range of the NDI optical dynamic tracking device, which can obtain the camera's coordinate position. Concretely, a positioning marker trackable by the NDI device can be mounted on the binocular camera; the marker may be active or passive, which this embodiment does not specifically limit. By tracking the infrared light emitted or reflected by the marker, the NDI device obtains the coordinates of the binocular camera in the coordinate system of the dynamic positioning acquisition device and sends them to the processor 2.
It is worth noting that the optical dynamic tracking device 32 also has positioning and ranging capability, but its accuracy is lower than that of a dedicated positioning imaging device. Therefore, to ensure the accuracy of the dynamic and static data acquired during the operation, this embodiment uses both devices: the optical dynamic tracking device acquires the first real-time dynamic data of the patient 5 and the second real-time dynamic data of the AR glasses 4, while the positioning imaging device acquires the positioning image of the patient 5. Since the coordinates of the positioning imaging device in the coordinate system of the dynamic positioning acquisition device are a preset coordinate, static registration can be completed by combining this with the positioning image.
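The static registration of this embodiment (a camera whose pose in the tracker frame is preset, plus a positioning image giving the patient's pose in the camera frame) amounts to chaining two transforms. A minimal sketch, assuming 4x4 homogeneous pose matrices; the function names are illustrative, not from the source:

```python
import numpy as np

def compose(T_ab, T_bc):
    """Chain homogeneous transforms: expresses frame c via b in frame a."""
    return T_ab @ T_bc

def static_register(T_cam_in_tracker, T_patient_in_cam):
    """Static registration per Embodiment 2: the preset camera pose in the
    tracker frame, composed with the patient pose recovered from the
    positioning image, yields the patient in the tracker frame."""
    return compose(T_cam_in_tracker, T_patient_in_cam)
```

The result is the coordinate parameter at which the three-dimensional simulated image is placed in the tracker frame.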
Embodiment 3
The AR-glasses-based surgical navigation system of this embodiment is a further improvement on Embodiment 2. As shown in Fig. 3, the dynamic positioning acquisition device 3 further includes several marking components, each comprising a positioning component 61 and a dynamic tracking component 62; the positioning component 61 and the dynamic tracking component 62 are attached to the body within a preset range from the patient's surgical site, and a dynamic tracking component 62 is also provided on the AR glasses 4;
the scanning device 1 acquires, before the operation, scan image data containing an image of the positioning component;
the positioning imaging device 31 acquires, during the operation, a positioning image containing the positioning component image and the dynamic tracking component image;
the optical dynamic tracking device 32 recognizes the dynamic tracking component during the operation to acquire the first and second real-time dynamic data.
In this embodiment, positioning components and dynamic tracking components are provided on the patient and on the AR glasses; the positioning component can be recognized by the positioning imaging device, and the dynamic tracking component by the optical dynamic tracking device.
Embodiment 4
The AR-glasses-based surgical navigation system of this embodiment is a further improvement on Embodiment 3. The main improvements are: the front surface pattern and the reverse surface pattern of the positioning component are the same; the front pattern is a coating of barium sulfate, and the scanning device scans before the operation to acquire scan image data containing the coating marks; the reverse pattern is printed on the patient's skin when the positioning component is attached, forming a watermark pattern, and the positioning imaging device acquires, during the operation, a positioning image containing the watermark pattern.
To explain: in practice the scan and the operation are not performed at the same time; there is usually an interval of three days, five days, or a week, during which the patient's daily activities, such as bathing and sleeping, may displace the positioning component. The positioning image containing the positioning component acquired by the positioning imaging device at surgery would then be inaccurate, affecting the localization of the three-dimensional simulated image in the coordinate system of the dynamic positioning acquisition device. Therefore, the reverse pattern is printed on the patient's skin when the positioning component is attached, forming a watermark pattern that persists on the body surface for some time and avoids such displacement.
In this embodiment, the positioning component is a film whose front and back carry the same pattern; the pattern may be a grid, concentric circles, equidistant points, etc. One or more special points on the positioning component serve as positioning references during the operation. The front of the positioning component carries a barium sulfate coating; barium sulfate is imaged by the X-rays of the CT device and can therefore be recognized by it.
Embodiment 5
The AR-glasses-based surgical navigation system of this embodiment is a further improvement on Embodiment 3. As shown in Fig. 4, a dynamic tracking component is also provided on the surgical instrument 7;
the positioning imaging device 31 acquires, during the operation, a positioning image containing the positioning component image, the dynamic tracking component image, and the surgical instrument image;
the optical dynamic tracking device 32 also recognizes the dynamic tracking component 62 during the operation to acquire third real-time dynamic data of the surgical instrument 7;
the processor 2 dynamically registers the statically registered three-dimensional simulated image with the patient according to the first and third real-time dynamic data.
It is worth mentioning that the surgical navigation system also includes an alarm device communicatively connected to the processor. Alert regions, which may be important blood vessels or nerves, are marked on the three-dimensional navigation image. The processor computes in real time the minimum distance between the surgical instrument and the alert region; when that distance is less than or equal to a preset value, the processor controls the alarm device to issue a reminder signal prompting the doctor to operate carefully, and when it is less than or equal to a preset danger value, the processor controls the alarm device to issue a warning signal reminding the doctor that the current operation carries a high risk. The alarm device thus improves the safety factor of the operation.
In this embodiment, by providing a marking component on the surgical instrument, the instrument's dynamic data is acquired in real time during the operation and incorporated into the three-dimensional simulated image, so that the spatial position of the instrument relative to the patient's anatomy is presented to the doctor more intuitively, better assisting the doctor in the operation.
Embodiment 6
As shown in Figs. 5-6, a surgical navigation method based on AR technology, characterized in that the method is implemented using the surgical navigation system of Embodiment 1 and includes the following steps:
Step 101: the scanning device scans the patient before the operation to acquire scan image data and sends it to the processor;
Step 102: the processor receives the scan image data and generates a three-dimensional simulated image of the patient;
Step 103: during the operation, the dynamic positioning acquisition device acquires the positioning image and the first real-time dynamic data of the patient in the coordinate system of the dynamic positioning acquisition device and sends them to the processor, and acquires the second real-time dynamic data of the AR glasses in that coordinate system and sends it to the processor;
Step 104: the processor generates a three-dimensional navigation image from the received three-dimensional simulated image, the positioning image, the first real-time dynamic data, and the second real-time dynamic data, and sends it to the AR glasses;
specifically, this includes the following sub-steps:
Step 1041: statically register the three-dimensional simulated image in the coordinate system of the dynamic positioning acquisition device according to the positioning image;
Step 1042: dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data;
Step 1043: obtain the coordinate data of the patient in the AR glasses coordinate system from the second and first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generate a three-dimensional navigation image that coincides in real time with the patient in the actual environment.
Step 105: the AR glasses receive and display the three-dimensional navigation image.
In this embodiment, the three-dimensional simulated image of the patient is acquired with the scanning device, the patient's real-time dynamic data is superimposed into it in real time by the dynamic positioning acquisition device, and this is combined with the real-time dynamic data of the AR glasses, so that the image presented in the AR glasses is accurately superimposed on the patient in the real environment, better assisting the doctor in the operation.
Embodiment 7
As shown in Figs. 7-8, the AR-glasses-based surgical navigation method of this embodiment is a further improvement on Embodiment 6. The dynamic positioning acquisition device includes an optical dynamic tracking device and a positioning imaging device, and the coordinates of the positioning imaging device in the coordinate system of the dynamic positioning acquisition device are a preset coordinate;
Step 103 is replaced with step 103a:
Step 103a: during the operation, the positioning imaging device acquires the positioning image of the patient in the coordinate system of the dynamic positioning acquisition device and sends it to the processor; the optical dynamic tracking device acquires the intraoperative first real-time dynamic data of the patient in that coordinate system and the second real-time dynamic data of the AR glasses and sends them to the processor.
Step 1041 is replaced with step 1041a:
Step 1041a: the processor generates a coordinate parameter from the preset coordinate and the positioning image and performs the static registration according to the coordinate parameter.
In this embodiment, to ensure the accuracy of the dynamic and static data acquired during the operation, the optical dynamic tracking device and the positioning imaging device are used in combination: the optical dynamic tracking device acquires the first real-time dynamic data of the patient and the second real-time dynamic data of the AR glasses, while the positioning imaging device acquires the positioning image of the patient. Since the coordinates of the positioning imaging device in the coordinate system of the dynamic positioning acquisition device are a preset coordinate, static registration can be completed by combining this with the positioning image.
Embodiment 8
As shown in Fig. 9, the AR-glasses-based surgical navigation method of this embodiment is a further improvement on Embodiment 7. The dynamic positioning acquisition device further includes several marking components, each comprising a positioning component and a dynamic tracking component; the positioning component and the dynamic tracking component are attached to the body within a preset range from the patient's surgical site, and a dynamic tracking component is also provided on the AR glasses;
Step 101 is replaced with step 101a:
Step 101a: the scanning device scans the patient before the operation to acquire scan image data containing the image of the positioning component and sends it to the processor;
Step 103a is replaced with step 103b:
Step 103b: during the operation, the positioning imaging device acquires the positioning image, containing the positioning component image and the dynamic tracking component image, in the coordinate system of the dynamic positioning acquisition device and sends it to the processor; the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire the intraoperative first real-time dynamic data of the patient in that coordinate system and the second real-time dynamic data of the AR glasses and sends them to the processor.
In this embodiment, positioning components and dynamic tracking components are provided on the patient and on the AR glasses; the positioning component can be recognized by the positioning imaging device, and the dynamic tracking component by the optical dynamic tracking device.
Embodiment 9
As shown in Fig. 10, the AR-glasses-based surgical navigation method of this embodiment is a further improvement on Embodiment 8. The front surface pattern and the reverse surface pattern of the positioning component are the same; the front pattern is a coating of barium sulfate, and the reverse pattern is printed on the patient's skin when the positioning component is attached, forming a watermark pattern;
Step 101a is replaced with step 101b:
Step 101b: the scanning device scans the patient before the operation to acquire scan image data containing the coating marks and sends it to the processor;
Step 103b is replaced with step 103c:
Step 103c: during the operation, the positioning imaging device acquires the positioning image, containing the watermark pattern and the dynamic tracking component image, in the coordinate system of the dynamic positioning acquisition device and sends it to the processor; the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire the intraoperative first real-time dynamic data of the patient in that coordinate system and the second real-time dynamic data of the AR glasses and sends them to the processor.
In this embodiment, the positioning component is a film whose front and back carry the same pattern; the pattern may be a grid, concentric circles, equidistant points, etc. One or more special points on the positioning component serve as positioning references during the operation. The front of the positioning component carries a barium sulfate coating, which is imaged by the X-rays of the CT device and can therefore be recognized by it. In addition, since the patient generally cannot undergo surgery immediately after the CT scan, the positioning component would otherwise have to remain attached to the patient for a long time; in that case the positioning marker is easily displaced and inconvenient. Printing the reverse pattern leaves a watermark pattern on the patient's skin, which can be retained for about a week.
Embodiment 10
As shown in Figs. 11-12, the AR-glasses-based surgical navigation method of this embodiment is a further improvement on Embodiment 8. A dynamic tracking component is also provided on the surgical instrument;
Step 103b is replaced with step 103d:
Step 103d: during the operation, the positioning imaging device acquires the positioning image, containing the positioning component image, the dynamic tracking component image, and the surgical instrument image, in the coordinate system of the dynamic positioning acquisition device and sends it to the processor; the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire the intraoperative first real-time dynamic data of the patient in that coordinate system, the second real-time dynamic data of the AR glasses, and the third real-time dynamic data of the surgical instrument, and sends them to the processor.
Step 1042 is replaced with step 1042b:
Step 1042b: dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data and the third real-time dynamic data.
In this embodiment, by providing a marking component on the surgical instrument, the instrument's dynamic data is acquired in real time during the operation and incorporated into the three-dimensional simulated image, so that the spatial position of the instrument relative to the patient's anatomy is presented to the doctor more intuitively, better assisting the doctor in the operation.
Although specific embodiments of the present invention have been described above, those skilled in the art should understand that they are merely illustrative, and that the scope of protection of the present invention is defined by the appended claims. Those skilled in the art can make various changes or modifications to these embodiments without departing from the principle and essence of the present invention, but all such changes and modifications fall within the scope of protection of the present invention.

Claims (10)

  1. A surgical navigation system based on AR technology, characterized in that the surgical navigation system comprises a scanning device, a processor, a dynamic positioning acquisition device, and AR glasses;
    the scanning device is configured to scan the patient before the operation to acquire scan image data and send it to the processor;
    the processor is configured to receive the scan image data and generate a three-dimensional simulated image of the patient;
    the dynamic positioning acquisition device is configured to acquire, during the operation, a positioning image and first real-time dynamic data of the patient in the coordinate system of the dynamic positioning acquisition device and send them to the processor, and is further configured to acquire second real-time dynamic data of the AR glasses in the coordinate system of the dynamic positioning acquisition device and send it to the processor;
    the processor is further configured to statically register the three-dimensional simulated image in the coordinate system of the dynamic positioning acquisition device according to the positioning image, and to dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data;
    the processor obtains coordinate data of the patient in the AR glasses coordinate system from the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generates a three-dimensional navigation image that coincides in real time with the patient in the actual environment and sends it to the AR glasses;
    the AR glasses are configured to receive and display the three-dimensional navigation image.
  2. The surgical navigation system of claim 1, wherein the dynamic positioning acquisition device comprises an optical dynamic tracking device and a positioning imaging device, and the coordinates of the positioning imaging device in the coordinate system of the dynamic positioning acquisition device are a preset coordinate;
    the optical dynamic tracking device is configured to acquire the first real-time dynamic data and the second real-time dynamic data;
    the positioning imaging device is configured to acquire the positioning image;
    the processor is further configured to generate a coordinate parameter from the preset coordinate and the positioning image and to perform the static registration according to the coordinate parameter.
  3. The surgical navigation system of claim 2, wherein the dynamic positioning acquisition device further comprises several marking components, each marking component comprising a positioning component and a dynamic tracking component; the positioning component and the dynamic tracking component are attached to the body within a preset range from the patient's surgical site, and a dynamic tracking component is also provided on the AR glasses;
    the scanning device is configured to acquire, before the operation, scan image data containing an image of the positioning component;
    the positioning imaging device is configured to acquire, during the operation, a positioning image containing the positioning component image and the dynamic tracking component image;
    the optical dynamic tracking device is configured to recognize the dynamic tracking component during the operation to acquire the first real-time dynamic data and the second real-time dynamic data.
  4. The surgical navigation system of claim 3, wherein the front surface pattern and the reverse surface pattern of the positioning component are the same;
    the front pattern is a coating of barium sulfate, and the scanning device is configured to scan before the operation to acquire scan image data containing the coating marks;
    the reverse pattern is printed on the patient's skin when the positioning component is attached, forming a watermark pattern, and the positioning imaging device is configured to acquire, during the operation, a positioning image containing the watermark pattern.
  5. The surgical navigation system of claim 3, wherein a dynamic tracking component is also provided on the surgical instrument;
    the positioning imaging device is configured to acquire, during the operation, a positioning image containing the positioning component image, the dynamic tracking component image, and the surgical instrument image;
    the optical dynamic tracking device is further configured to recognize the dynamic tracking component during the operation to acquire third real-time dynamic data of the surgical instrument;
    the processor is configured to dynamically register the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data and the third real-time dynamic data.
  6. A surgical navigation method based on AR technology, characterized in that the surgical navigation method is implemented using the surgical navigation system of claim 1 and comprises:
    the scanning device scans the patient before the operation to acquire scan image data and sends it to the processor; the processor receives the scan image data and generates a three-dimensional simulated image of the patient;
    during the operation, the dynamic positioning acquisition device acquires the positioning image and the first real-time dynamic data of the patient in the coordinate system of the dynamic positioning acquisition device and sends them to the processor, and also acquires the second real-time dynamic data of the AR glasses in that coordinate system and sends it to the processor;
    the processor statically registers the three-dimensional simulated image in the coordinate system of the dynamic positioning acquisition device according to the positioning image, and dynamically registers the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data;
    the processor obtains the coordinate data of the patient in the AR glasses coordinate system from the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulated image, generates a three-dimensional navigation image that coincides in real time with the patient in the actual environment and sends it to the AR glasses;
    the AR glasses receive and display the three-dimensional navigation image.
  7. The surgical navigation method of claim 6, wherein the dynamic positioning acquisition device comprises an optical dynamic tracking device and a positioning imaging device, and the coordinates of the positioning imaging device in the coordinate system of the dynamic positioning acquisition device are a preset coordinate;
    the optical dynamic tracking device acquires the first real-time dynamic data and the second real-time dynamic data, and the positioning imaging device acquires the positioning image;
    the processor generates a coordinate parameter from the preset coordinate and the positioning image and performs the static registration according to the coordinate parameter.
  8. The surgical navigation method of claim 7, wherein the dynamic positioning acquisition device further comprises several marking components, each comprising a positioning component and a dynamic tracking component; the positioning component and the dynamic tracking component are attached to the body within a preset range from the patient's surgical site, and a dynamic tracking component is also provided on the AR glasses;
    the scanning device acquires, before the operation, scan image data containing an image of the positioning component;
    the positioning imaging device acquires, during the operation, a positioning image containing the positioning component image and the dynamic tracking component image;
    the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire the first real-time dynamic data and the second real-time dynamic data.
  9. The surgical navigation method of claim 8, wherein the front surface pattern and the reverse surface pattern of the positioning component are the same;
    the front pattern is a coating of barium sulfate, and the scanning device scans before the operation to acquire scan image data containing the coating marks;
    the reverse pattern is printed on the patient's skin when the positioning component is attached, forming a watermark pattern, and the positioning imaging device acquires, during the operation, a positioning image containing the watermark pattern.
  10. The surgical navigation method of claim 8, wherein a dynamic tracking component is also provided on the surgical instrument;
    the positioning imaging device acquires, during the operation, a positioning image containing the positioning component image, the dynamic tracking component image, and the surgical instrument image;
    the optical dynamic tracking device recognizes the dynamic tracking component during the operation to acquire third real-time dynamic data of the surgical instrument;
    the processor dynamically registers the statically registered three-dimensional simulated image with the patient according to the first real-time dynamic data and the third real-time dynamic data.
PCT/CN2018/099849 2017-08-21 2018-08-10 基于ar技术的手术导航系统及方法 WO2019037606A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710719544.7 2017-08-21
CN201710719544.7A CN107374729B (zh) 2017-08-21 2017-08-21 基于ar技术的手术导航系统及方法

Publications (1)

Publication Number Publication Date
WO2019037606A1 true WO2019037606A1 (zh) 2019-02-28

Family

ID=60353855

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/099849 WO2019037606A1 (zh) 2017-08-21 2018-08-10 基于ar技术的手术导航系统及方法

Country Status (2)

Country Link
CN (1) CN107374729B (zh)
WO (1) WO2019037606A1 (zh)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107374729B (zh) * 2017-08-21 2021-02-23 刘洋 基于ar技术的手术导航系统及方法
TWI741196B (zh) * 2018-06-26 2021-10-01 華宇藥品股份有限公司 整合擴增實境之手術導航方法及系統
CN108657074A (zh) * 2018-06-29 2018-10-16 姜鹏飞 一种移动车辆内用户视觉增强的方法及装置
CN109106448A (zh) * 2018-08-30 2019-01-01 上海霖晏医疗科技有限公司 一种手术导航方法和装置
CN109717957B (zh) * 2018-12-27 2021-05-11 北京维卓致远医疗科技发展有限责任公司 基于混合现实的控制系统
CN111374784B (zh) * 2018-12-29 2022-07-15 海信视像科技股份有限公司 一种增强现实ar定位系统及方法
CN109758231A (zh) * 2019-03-05 2019-05-17 钟文昭 基于混合现实的胸腔内手术导航方法及系统
CN109730771A (zh) * 2019-03-19 2019-05-10 安徽紫薇帝星数字科技有限公司 一种基于ar技术的手术导航系统
CN110215284B (zh) * 2019-06-06 2021-04-02 上海木木聚枞机器人科技有限公司 一种可视化系统和方法
TWI741536B (zh) * 2020-03-20 2021-10-01 台灣骨王生技股份有限公司 基於混合實境的手術導航影像成像方法
CN111588999B (zh) * 2020-05-25 2022-07-08 李硕 手术导向模型、头戴可穿戴设备辅助手术导航系统
CN111973273A (zh) * 2020-08-31 2020-11-24 上海交通大学医学院附属第九人民医院 基于ar技术的手术导航系统、方法、设备和介质
CN112155727A (zh) * 2020-08-31 2021-01-01 上海市第一人民医院 基于三维模型的手术导航系统、方法、设备和介质
CN112190331A (zh) * 2020-10-15 2021-01-08 北京爱康宜诚医疗器材有限公司 手术导航信息的确定方法、装置及系统、电子装置
CN113081273B (zh) * 2021-03-24 2023-07-28 上海微创医疗机器人(集团)股份有限公司 打孔辅助系统及手术机器人系统
CN114587658A (zh) * 2022-02-06 2022-06-07 上海诠视传感技术有限公司 Ar眼镜识别口腔种植手机在空间坐标系中位置的方法及系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070211927A1 (en) * 2006-03-09 2007-09-13 General Electric Company Methods and systems for registration of surgical navigation data and image data
CN103211655A (zh) * 2013-04-11 2013-07-24 深圳先进技术研究院 一种骨科手术导航系统及导航方法
CN104434313A (zh) * 2013-09-23 2015-03-25 中国科学院深圳先进技术研究院 一种腹部外科手术导航方法及系统
CN105852970A (zh) * 2016-04-29 2016-08-17 北京柏惠维康科技有限公司 神经外科机器人导航定位系统及方法
CN107374729A (zh) * 2017-08-21 2017-11-24 上海霖晏医疗科技有限公司 基于ar技术的手术导航系统及方法

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI517828B (zh) * 2012-06-27 2016-01-21 國立交通大學 影像追蹤系統及其影像追蹤方法
CN106859767A (zh) * 2017-03-29 2017-06-20 上海霖晏网络科技有限公司 一种手术导航方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070211927A1 (en) * 2006-03-09 2007-09-13 General Electric Company Methods and systems for registration of surgical navigation data and image data
CN103211655A (zh) * 2013-04-11 2013-07-24 深圳先进技术研究院 一种骨科手术导航系统及导航方法
CN104434313A (zh) * 2013-09-23 2015-03-25 中国科学院深圳先进技术研究院 一种腹部外科手术导航方法及系统
CN105852970A (zh) * 2016-04-29 2016-08-17 北京柏惠维康科技有限公司 神经外科机器人导航定位系统及方法
CN107374729A (zh) * 2017-08-21 2017-11-24 上海霖晏医疗科技有限公司 基于ar技术的手术导航系统及方法

Also Published As

Publication number Publication date
CN107374729B (zh) 2021-02-23
CN107374729A (zh) 2017-11-24

Similar Documents

Publication Publication Date Title
WO2019037606A1 (zh) 基于ar技术的手术导航系统及方法
CN107440797B (zh) 用于手术导航的注册配准系统及方法
US11426241B2 (en) Device for intraoperative image-controlled navigation during surgical procedures in the region of the spinal column and in the adjacent regions of the thorax, pelvis or head
US7010095B2 (en) Apparatus for determining a coordinate transformation
US20200315734A1 (en) Surgical Enhanced Visualization System and Method of Use
WO2011122032A1 (ja) 内視鏡観察を支援するシステムおよび方法、並びに、装置およびプログラム
US20030114741A1 (en) Projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images
JP5569711B2 (ja) 手術支援システム
WO2007115825A1 (en) Registration-free augmentation device and method
US20210052329A1 (en) Monitoring of moving objects in an operation room
CN112155727A (zh) 基于三维模型的手术导航系统、方法、设备和介质
CN109674533B (zh) 基于便携式彩超设备的手术导航系统及方法
JP2008018015A (ja) 医用ディスプレイ装置及び医用ディスプレイシステム
WO2019128961A1 (zh) 用于手术导航的配准系统及方法
WO2014050019A1 (ja) 仮想内視鏡画像生成装置および方法並びにプログラム
CN109730771A (zh) 一种基于ar技术的手术导航系统
CN110720985A (zh) 一种多模式引导的手术导航方法和系统
KR20190058190A (ko) 증강현실 기술을 이용한 척추 수술 네비게이션 시스템 및 방법
CN111973273A (zh) 基于ar技术的手术导航系统、方法、设备和介质
Zhang et al. 3D augmented reality based orthopaedic interventions
WO2015091226A1 (en) Laparoscopic view extended with x-ray vision
Leventon A registration, tracking, and visualization system for image-guided surgery
CN111053598A (zh) 一种基于投影仪的增强现实系统平台
Wengert et al. Endoscopic navigation for minimally invasive suturing
KR20230059730A (ko) 이미지 등록을 위한 시스템 및 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18848388

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18848388

Country of ref document: EP

Kind code of ref document: A1