CN115317129A - AR navigation system and method for hip arthroscopy operation - Google Patents


Info

Publication number
CN115317129A
CN115317129A
Authority
CN
China
Prior art keywords
image
virtual
module
dimensional model
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210886790.2A
Other languages
Chinese (zh)
Inventor
陈疾忤
宋春凤
黄洪波
刘洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai First Peoples Hospital
Original Assignee
Shanghai First Peoples Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai First Peoples Hospital filed Critical Shanghai First Peoples Hospital
Priority to CN202210886790.2A
Publication of CN115317129A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2055: Optical tracking systems
    • A61B 2034/2065: Tracking using image or pattern recognition

Abstract

The invention discloses an AR navigation system and method for hip arthroscopic surgery. The AR navigation system includes: a three-dimensional modeling module, which acquires a patient's preoperative CT scan data and images through a first scanning device, marks the femoral head sphere on the preoperative CT images using a sphere-fitting technique based on computer image analysis so as to delineate the cam deformity protruding from the sphere at the femoral head-neck junction, and at the same time generates a virtual three-dimensional model and sends it to a planning module; the planning module, which receives the virtual three-dimensional model, identifies osteophytes, plans the surgical resection range of the cam deformity and the surgical procedure on the virtual three-dimensional model, and sends the planned procedure to a registration module; and the registration module, which identifies feature points in X-ray images taken intraoperatively by a second scanning device and registers them with the virtual three-dimensional model containing the planned procedure in the same coordinate system to form a three-dimensional navigation image.

Description

AR navigation system and method for hip arthroscopic surgery
Technical Field
The invention relates to the field of augmented reality (AR), and in particular to an AR navigation system and method for hip arthroscopic surgery.
Background
The technical difficulties of current hip arthroscopic surgery are concentrated in three areas. First, the hip joint portals are difficult to establish. After traction has opened the hip joint space to 8-10 mm, the surgeon must first insert a puncture needle into the joint to establish the first lateral viewing portal of the hip arthroscope; because the hip joint lies deep, the needle cannot be placed accurately by simple body-surface landmark positioning as with the other large joints of the body. Moreover, the tissues around the hip joint are rich in blood vessels and nerves, and an inexperienced surgeon may cause injury through an erroneous puncture. Establishing the working portal for hip arthroscopy is even more difficult: after the arthroscope has been introduced into the joint through the first portal, the deep joint space makes it very hard to pass a second puncture needle accurately into the arthroscopic field of view from another body-surface position, and this is often the rate-limiting step of the whole procedure. Second, intraoperative localization is difficult. Because the hip is a ball-and-socket joint, the annular acetabular rim and the spherical femoral head often lack obvious landmarks under the arthroscope, making accurate spatial localization of anatomical structures difficult during surgery. Third, the surgical extent is difficult to control precisely. For example, the most common procedure, arthroscopic treatment of femoroacetabular impingement, requires the corresponding osteophytes to be burred away under the arthroscope, so the extent of the osteophytes is usually determined clinically by analysis of three-dimensional CT images; however, the osteophyte boundaries seen on three-dimensional CT cannot be accurately mapped onto the arthroscopic field of view, even with open surgery.
Incomplete or excessive resection of osteophytes is the most important factor in poor outcomes after femoroacetabular impingement surgery and a common operative deviation in such procedures. Many surgeons therefore repeatedly confirm the resection with X-ray C-arm fluoroscopy in various positions. How to guide the arthroscopic operation in real time from the results of three-dimensional CT imaging analysis is thus a difficult point of clinical application. Difficult intraoperative positioning, repeated intraoperative fluoroscopy, and similar factors prolong the operation, expose both surgeon and patient to radiation, and add to the difficulty of the procedure.
Disclosure of Invention
The invention aims to provide a system and a method capable of guiding arthroscopic surgical operation in real time according to the result of three-dimensional CT (computed tomography) imaging analysis.
In order to achieve the above object, the present invention provides an AR navigation system for hip arthroscopy, comprising:
the three-dimensional modeling module is used for acquiring preoperative hip joint CT scanning data of a patient through a first scanning device and generating a hip joint virtual three-dimensional model; marking the spherical surface of the femoral head according to a spherical fitting technology, and delineating the cam deformity protruding out of the spherical surface at the femoral head and neck;
the planning module is used for receiving the virtual three-dimensional model, identifying osteophytes, forming a planning operation process on the virtual three-dimensional model and sending the planning operation process to the registration module;
the registration module is used for identifying the characteristic points of the X-ray image through the X-ray image shot by the second scanning device in the operation, and registering the characteristic points with the virtual three-dimensional model containing the planning operation process in the same coordinate system to form a three-dimensional navigation image;
and the AR glasses receive and display the three-dimensional navigation image.
Optionally, the spherical fitting of the femoral head from the CT scan data by a Gaussian (difference-of-Gaussian) algorithm comprises:
converting the image into a single-channel grayscale image; selecting a scale-space factor σ and smoothing the image with Gaussian operators at different scale-space factors k·σ, where k ranges from 1.0 to 5.0;
defining the DoG operator:
g1(x, y) = G_σ1(x, y) ∗ I(x, y)
g2(x, y) = G_σ2(x, y) ∗ I(x, y)
DoG = G_σ1 − G_σ2
the difference image after Gaussian smoothing is:
g1(x, y) − g2(x, y) = (G_σ1 − G_σ2) ∗ I(x, y) = DoG ∗ I(x, y)
calculating the difference images of the Gaussian-smoothed images at the different scale-space factors and performing morphological processing; for image I(x, y), the Gaussian smoothing at scale σ uses:
G_σ(x, y) = (1/(2πσ²)) · exp(−(x² + y²)/(2σ²))
the circle in the image is obtained by processing the image with the findContours function in OpenCV together with the geometric characteristics of a circle.
Optionally, the second scanning device comprises a C-arm machine; the C-arm machine, used together with a positioning target block, takes intraoperative images, acquires the X-ray image, and registers the virtual three-dimensional model with the intraoperative body-position change in the X-ray image.
Optionally, comprising: and the calibration module identifies the surgical instruments through the optical positioning and tracking equipment and selects the corresponding surgical instruments to complete registration.
Optionally, comprising: and the monitoring module monitors the actual operation process of the operation through the optical positioning and tracking equipment and sends out an early warning signal when the operation process is abnormal.
Optionally, the monitoring module acquires a three-dimensional navigation image of the registration module, and when an actual surgical operation process deviates from a planned surgical operation process in the three-dimensional navigation image, the monitoring module sends an early warning signal.
Optionally, when the monitoring module cannot confirm the current position of the arthroscope, or the optical tracking device is shielded, or the reflective ball on the surgical instrument is shielded, the monitoring module sends out an early warning signal.
Optionally, comprising: a signal module that sends real-time information to the AR glasses.
The invention also provides an AR navigation method for hip arthroscopic surgery, using the above AR navigation system and comprising the following steps:
the first scanning device acquires preoperative hip joint CT scanning data of a patient, the three-dimensional modeling module generates a virtual three-dimensional model according to the preoperative CT scanning data, and the virtual three-dimensional model is sent to a planning module; marking the spherical surface of the femoral head according to a spherical fitting technology, and delineating the cam deformity at the femoral head and neck which protrudes out of the spherical surface;
the planning module receives the virtual three-dimensional model, identifies osteophytes, forms a planned surgical operation process on the virtual three-dimensional model, and sends the planned surgical operation process to a registration module;
the registration module identifies characteristic points of the X-ray image through an X-ray image shot by a second scanning device in an operation, and registers the characteristic points with a virtual three-dimensional model containing the planning operation process in the same coordinate system to form a three-dimensional navigation image;
and the AR glasses receive and display the three-dimensional navigation image.
Optionally, the spherical fitting of the femoral head from the CT scan data by a Gaussian (difference-of-Gaussian) algorithm comprises:
converting the image into a single-channel grayscale image; selecting a scale-space factor σ and smoothing the image with Gaussian operators at different scale-space factors σ and k·σ;
define the DoG operator:
g1(x, y) = G_σ1(x, y) ∗ I(x, y)
g2(x, y) = G_σ2(x, y) ∗ I(x, y)
DoG = G_σ1 − G_σ2
the difference image after Gaussian smoothing is:
g1(x, y) − g2(x, y) = (G_σ1 − G_σ2) ∗ I(x, y) = DoG ∗ I(x, y)
calculating the difference images of the Gaussian-smoothed images at the different scale-space factors and performing morphological processing; for image I(x, y), the Gaussian smoothing at scale σ uses:
G_σ(x, y) = (1/(2πσ²)) · exp(−(x² + y²)/(2σ²))
the circle in the image is obtained by processing the image with the findContours function in OpenCV together with the geometric characteristics of a circle.
The invention has the beneficial effects that:
(1) The AR navigation system for hip arthroscopic surgery establishes the hip surgical approach, making the surgical operation more accurate and improving the success rate of the surgery.
(2) The femoral head sphere is used as an intraoperative positioning landmark, the osteophyte boundary is fitted to the arthroscopic field of view, incomplete or excessive resection of osteophytes is avoided, and the surgical extent is precisely controlled.
Drawings
FIG. 1 is a single-channel grayscale image of a CT image transformed by the navigation system of the present invention.
Fig. 2 is a femoral head spherical image positioned by the navigation system of the present invention.
Fig. 3 is a schematic diagram of the present invention identifying osteophytes from a virtual three-dimensional model.
Fig. 4 is another schematic view of the present invention identifying osteophytes from a virtual three-dimensional model.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience in describing the present invention and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may, for example, be fixed, detachable, or integral; it may be mechanical; it may be direct or indirect through an intermediate medium, or internal between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art on a case-by-case basis.
The invention provides an AR navigation system and method for treating cam-type femoroacetabular impingement at the femoral head-neck junction by hip arthroscopic surgery, comprising: a three-dimensional modeling module, a planning module, a calibration module, a registration module, a monitoring module, a signal module, and AR glasses.
The three-dimensional modeling module is used for carrying out virtual three-dimensional modeling on the CT data. The three-dimensional modeling module obtains CT data of a patient after scanning through a first scanning device, such as a CT device, and the CT data can be a CT image; generating a virtual three-dimensional model according to the preoperative CT data, and sending the virtual three-dimensional model to a planning module; according to CT scanning data, through a Gaussian algorithm and by utilizing a spherical fitting technology, spherical fitting is carried out on the femoral head, the spherical surface of the femoral head is marked, and the cam deformity protruding out of the spherical femoral head and neck is outlined.
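The sphere-fitting step described above can be sketched as a linear least-squares fit over bone-surface points extracted from the CT volume. The patent does not disclose its exact fitting routine; the formulation, function names, and synthetic data below are illustrative assumptions:

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit.

    Uses |p - c|^2 = r^2  =>  p.p = 2 c.p + (r^2 - c.c),
    which is linear in the unknowns (c, d) with d = r^2 - c.c.
    """
    pts = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * pts, np.ones((len(pts), 1))])
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = float(np.sqrt(sol[3] + center @ center))
    return center, radius

# Points on the femoral head surface would come from CT segmentation;
# here a synthetic spherical point cloud stands in for them.
rng = np.random.default_rng(0)
u = rng.normal(size=(200, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)
surface = np.array([10.0, -5.0, 30.0]) + 24.0 * u
center, radius = fit_sphere(surface)
```

Points at the femoral head-neck junction that lie outside the fitted sphere (farther from `center` than `radius` plus a tolerance) would then delineate the cam deformity.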
The planning module is used for preoperative planning of the surgical approach and operation. The planning module receives the virtual three-dimensional model, identifies whether osteophytes that impede the rotation of the femoral head sphere are present on the acetabular labrum and the lunate surface, and forms a planned surgical procedure on the virtual three-dimensional model; the planned procedure includes displaying arrows and the like in the virtual three-dimensional model to indicate the direction of the surgical path. The planned procedure is sent to the registration module.
The calibration module is used for identifying and registering surgical instruments. The calibration module identifies surgical instruments through the optical positioning and tracking device and selects the corresponding surgical instruments to complete registration. Specifically, the small reflective tracking balls are attached in a fixed arrangement to key points of the surgical instruments; before the operation, the balls are identified by the optical positioning and tracking device, the corresponding surgical instrument is registered, and the relative spatial relationship between the balls and the instrument is obtained at the same time. If the position of a reflective ball attached to a key point of an instrument changes, the instrument must be identified and registered again.
The registration module is used to register the preoperative CT virtual three-dimensional model with the intraoperative body-position change shown in the intraoperative X-ray images. The registration module performs imaging through a second scanning device, such as a C-arm machine used together with a positioning target block (a module tool placed in the imaging area when the X-ray image is taken): it takes intraoperative X-ray images, identifies feature points in them, and registers the body-position change in the intraoperative X-ray images with the virtual three-dimensional model containing the planned surgical procedure in the same coordinate system to form a three-dimensional navigation image.
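The patent does not spell out how the matched feature points are turned into a transform; a standard paired-point rigid registration (the Kabsch/SVD solve), sketched here with illustrative names, is one common way to compute it:

```python
import numpy as np

def rigid_register(src, dst):
    """Find rotation R and translation t minimizing the error of
    R @ src_i + t vs. dst_i over paired 3-D feature points (Kabsch)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)              # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Applying `R, t` to the virtual three-dimensional model would bring the planned procedure into the intraoperative coordinate system of the navigation image.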
The monitoring module is used for real-time position monitoring and early warning during the operation. The monitoring module monitors the actual surgical procedure through the optical positioning and tracking device and issues an early-warning signal when the procedure becomes abnormal. Specifically, the optical positioning and tracking device follows the progress of the operation by recognizing the reflective balls on the registered surgical instruments. While the monitoring module runs, the optical tracking device emits infrared light of a specific band about 60 times per second; the light is reflected by the reflective balls, the real-time positions of the balls are computed, and the coordinates of the key points of the surgical instruments are then located from the relative spatial relationship between the balls and the instruments obtained during calibration and registration.
When the following conditions occur, the operation process is considered to be abnormal, and the monitoring module sends out an early warning signal:
case 1: the monitoring module acquires a three-dimensional navigation image of the registration module, and when the actual operation process deviates from the planned operation process in the three-dimensional navigation image, the monitoring module sends out an early warning signal;
case 2: when the monitoring module cannot confirm the current position of the arthroscope, an early warning signal is sent out;
case 3: when the optical tracking device is shielded or the small reflective balls on the surgical instrument are shielded, the optical tracking device cannot calculate the real-time positioning information of the small reflective balls, and the monitoring module sends out an early warning signal.
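The three abnormal conditions above can be summarized in a small monitoring check. The tolerance value and function names below are illustrative assumptions, not values disclosed in the patent:

```python
import numpy as np

def path_deviation(tip, path):
    """Shortest distance from the tracked instrument tip to the planned
    path, given as a polyline of 3-D waypoints."""
    tip = np.asarray(tip, float)
    best = np.inf
    for a, b in zip(path[:-1], path[1:]):
        a, b = np.asarray(a, float), np.asarray(b, float)
        ab = b - a
        s = np.clip(np.dot(tip - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        best = min(best, float(np.linalg.norm(tip - (a + s * ab))))
    return best

def early_warning(tip, path, scope_located=True, markers_visible=True, tol_mm=3.0):
    """Return a warning string for cases 1-3 above, or None when normal."""
    if not markers_visible:                 # case 3: tracker or balls occluded
        return "markers occluded"
    if not scope_located:                   # case 2: arthroscope position unconfirmed
        return "arthroscope position unknown"
    if path_deviation(tip, path) > tol_mm:  # case 1: off the planned approach
        return "deviating from planned path"
    return None
```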
The signal module is used to send real-time information to the AR glasses; the real-time information includes at least the latest surgical planning data, the real-time coordinates of the reflective balls, and the like. Early-warning signals issued by the monitoring module are also displayed in the AR glasses. Optionally, the signal module provides the wireless network signal for the AR navigation system of the present invention.
The AR glasses are used to receive and display the three-dimensional navigation image. The system also comprises a host and a display. The host is connected to the optical tracking device by wire, and the information obtained by the optical tracking device can be transmitted to the host. Optionally, the various information generated by the AR navigation system, including but not limited to the planned surgical procedure, the three-dimensional navigation image, the actual surgical procedure, and early-warning signals, can be sent to the host, processed by the host's main software, and shown on the display. The information received by the host is sent to the AR glasses through the signal module; after being processed by the independent micro-computing platform carried by the AR glasses, the latest surgical planning data, the real-time coordinates of the reflective balls, and other information are displayed in the AR glasses for the surgeon to view. That is, the host and the AR glasses share data over a network connection, and each can use the data independently and display the corresponding content on its own display device.
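The patent does not specify the wire format used when the host shares data with the AR glasses over the network; one minimal, illustrative JSON message (all field names assumed) could look like:

```python
import json

def make_nav_update(plan_version, tip_xyz, warning=None):
    """Serialize one navigation update for the AR glasses as JSON."""
    return json.dumps({
        "type": "nav_update",
        "plan": plan_version,                          # latest planning data version
        "tip": [round(float(v), 2) for v in tip_xyz],  # instrument tip coords, mm
        "warning": warning,                            # early-warning text or None
    })

# The glasses-side micro-computing platform would decode and render this:
msg = json.loads(make_nav_update("plan-v2", (12.34, -8.9, 104.06)))
```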
Based on the AR navigation system for the hip arthroscopy operation, the invention provides an AR navigation method for the hip arthroscopy operation, which comprises the following steps:
the method comprises the following steps: and transmitting the CT image obtained by the CT equipment to a three-dimensional modeling module before operation through a network or a USB flash disk.
Step two: the three-dimensional modeling module renders a plurality of CT images, calculates a femoral head spherical surface by using a Gaussian algorithm and generates a virtual three-dimensional model. The femoral head spherical surface identification algorithm of the Gaussian difference operator comprises the following steps:
step (1), converting a CT image into a single-channel gray image as shown in figure 1;
step (2), selecting a scale-space factor σ and smoothing the image with Gaussian operators at different scale-space factors k·σ, where k is a constant in the range 1.0-5.0; several scale-space factors may be chosen according to the characteristics of the image and the degree of blurring to be achieved.
Defining a DoG operator:
g1(x, y) = G_σ1(x, y) ∗ I(x, y)
g2(x, y) = G_σ2(x, y) ∗ I(x, y)
DoG = G_σ1 − G_σ2
the difference image after Gaussian smoothing is:
g1(x, y) − g2(x, y) = (G_σ1 − G_σ2) ∗ I(x, y) = DoG ∗ I(x, y)
step (3), computing the difference images of the Gaussian-smoothed images at the different scale-space factors and performing morphological processing; for image I(x, y), the Gaussian smoothing at scale σ uses:
G_σ(x, y) = (1/(2πσ²)) · exp(−(x² + y²)/(2σ²))
and (4), processing the image obtained in step (3) with the findContours function in OpenCV together with the geometric characteristics of a circle, to obtain the femoral head sphere shown in fig. 2.
Step three: the planning module receives the virtual three-dimensional model and, as shown in fig. 3 and fig. 4, identifies from the model whether the acetabular labrum and the lunate surface bear osteophytes that impede the rotation of the femoral head sphere, colors and marks the lesion, simulates the surgical procedure, and forms a planned surgical procedure on the virtual three-dimensional model.
Step four: identify the reflective balls on the surgical instruments through the optical positioning and tracking device, and select the corresponding surgical instruments in the navigation system to complete registration. It should be noted that hip arthroscopy is performed in a water environment, so the reflective balls on the surgical instruments need waterproof treatment to prevent them from becoming unrecognizable to the optical positioning and tracking device when exposed to water.
Step five: and sending the planned operation process to a registration module, using the C-arm machine to match with a positioning target block for shooting, shooting an X-ray image in the operation, identifying the characteristic point of the X-ray image, and registering the posture change in the X-ray image in the operation and a virtual three-dimensional model containing the planned operation process in the same coordinate system to form a three-dimensional navigation image.
Step six: the patient lies on an orthopaedic traction table with the perineum against a perineal post 20-30 mm in diameter. Both ankles are wrapped for protection and then fixed to the ankle holders of the traction table. After bilateral traction has straightened and abducted both lower limbs to 45° with a certain tension, the limb on the operative side is tractioned mainly through the traction device of the table; the intraoperative X-ray machine takes images, and the X-ray images are checked to confirm that the gap between the acetabulum and the femoral head meets the 8-10 mm requirement. Following the navigation plan, the surgeon makes an incision at the highest point of the anterior edge of the superior greater trochanter of the femur and advances a puncture needle fitted with reflective balls into the joint space along the planned path. After the inner core of the puncture needle is withdrawn, a long guide wire is inserted through the needle sheath into the joint; the arthroscope sheath is then placed over the long guide wire, the arthroscope fitted with reflective balls is attached, and the initial hip arthroscopic inspection can begin. By moving the arthroscope lens, the anterior joint capsule at the triangular space formed by the anterosuperior acetabular labrum and the femoral head is observed.
Step seven: following the planned surgical procedure formed by the navigation system, a puncture needle is positioned in the safe zone at the upper end of the thigh and advanced into the hip joint space so that its tip enters the arthroscopic field of view; the guide wire, arthroscope sheath, and so on are then placed in sequence to complete the establishment of the working portal.
Step eight: the optical positioning and tracking device monitors the surgical procedure in real time, the display shows the surgical path and direction arrows, and the system issues an early-warning signal when the arthroscope deviates from the planned path or the surgeon's operation is wrong. An early-warning signal is issued at least under the following conditions:
case 1: during the actual surgical operation, the surgical approach deviates from the approach path pre-established in the planned surgical procedure;
case 2: after the arthroscope enters the joint, its current position cannot be confirmed because of the special environment inside the joint;
case 3: the optical positioning and tracking device or the reflective balls are occluded.
Step nine: the results of steps one to five are transmitted to the head-mounted AR glasses through the signal module to realize visualization.
In summary, the present invention discloses an AR navigation system for hip arthroscopic surgery, comprising: a three-dimensional modeling module, a planning module, a calibration module, a registration module, a monitoring module, a signal module, AR glasses, and the like. The AR navigation system for hip arthroscopic surgery establishes the hip surgical approach, making the surgical operation more accurate and improving the success rate of the surgery. The femoral head sphere is used as an intraoperative positioning landmark, the osteophyte boundary is fitted to the arthroscopic field of view, incomplete or excessive resection of osteophytes is avoided, and the surgical extent is precisely controlled.
While the present invention has been described in detail with reference to the preferred embodiments, it should be understood that the above description should not be taken as limiting the invention. Various modifications and alterations to this invention will become apparent to those skilled in the art upon reading the foregoing description. Accordingly, the scope of the invention should be determined from the following claims.

Claims (10)

1. An AR navigation system for hip arthroscopic surgery, comprising:
the three-dimensional modeling module is used for acquiring preoperative hip joint CT scanning data of a patient through a first scanning device and generating a hip joint virtual three-dimensional model; marking the spherical surface of the femoral head according to a spherical fitting technology, and delineating the cam deformity protruding out of the spherical surface at the femoral head and neck;
the planning module is used for receiving the virtual three-dimensional model, identifying osteophytes, forming a planning operation process on the virtual three-dimensional model and sending the planning operation process to the registration module;
the registration module is used for identifying the characteristic points of the X-ray image through the X-ray image shot by the second scanning device in the operation, and registering the characteristic points with the virtual three-dimensional model containing the planning operation process in the same coordinate system to form a three-dimensional navigation image;
and the AR glasses receive and display the three-dimensional navigation image.
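The registration step above — aligning intraoperative X-ray feature points with the virtual three-dimensional model in one coordinate system — is commonly implemented as a least-squares rigid point-set registration. The sketch below uses the Kabsch (SVD) algorithm and assumes paired 3-D feature points; the function name and the pairing assumption are illustrative only and are not specified by the claims:

```python
import numpy as np

def rigid_register(src, dst):
    """Estimate rotation R and translation t such that dst ≈ R @ src + t
    (Kabsch algorithm: least-squares rigid registration of paired points)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the optimal orthogonal matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

In use, the source points would be feature-point coordinates recovered from the intraoperative X-ray image and the destination points the corresponding landmarks on the virtual three-dimensional model (or vice versa); the resulting (R, t) maps one coordinate system into the other.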
2. The AR navigation system for hip arthroscopic surgery of claim 1, wherein sphere fitting of the femoral head by a Gaussian algorithm based on the CT scan data comprises:
converting the image into a single-channel grayscale image; selecting a scale-space factor σ, and smoothing the image with Gaussian operators at different scale-space factors k·σ, where k ranges from 1.0 to 5.0;
define the DoG operator:
g1(x, y) = G_σ1(x, y) ∗ I(x, y)
g2(x, y) = G_σ2(x, y) ∗ I(x, y)
DoG = G_σ1 − G_σ2
where ∗ denotes two-dimensional convolution; the difference image after Gaussian smoothing is:
g1(x, y) − g2(x, y) = (G_σ1 − G_σ2) ∗ I(x, y) = DoG ∗ I(x, y)
calculating the difference images of the Gaussian-smoothed images under different scale-space factors and performing morphological processing; suppose that for an image I(x, y), Gaussian smoothing with scale σ is used:
G_σ(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))
the circle in the image is obtained by processing the image by using a findContours function in opencv and the characteristics of the circle.
3. The AR navigation system for hip arthroscopic surgery of claim 1, wherein the second scanning device comprises a C-arm machine, the C-arm machine being used to image together with a positioning target block, acquire the X-ray images, and register the virtual three-dimensional model against body-position changes in the intraoperative X-ray images.
4. The AR navigation system for hip arthroscopic surgery of claim 1, further comprising: a calibration module that identifies surgical instruments through an optical positioning and tracking device and selects the corresponding surgical instrument to complete registration.
5. The AR navigation system for hip arthroscopic surgery of claim 1, further comprising: a monitoring module that monitors the actual surgical procedure through the optical positioning and tracking device and issues an early-warning signal when the procedure is abnormal.
6. The AR navigation system for hip arthroscopic surgery of claim 5, wherein the monitoring module obtains the three-dimensional navigation image from the registration module, and issues an early-warning signal when the actual surgical procedure deviates from the planned surgical procedure in the three-dimensional navigation image.
7. The AR navigation system for hip arthroscopic surgery of claim 5, wherein the monitoring module issues an early-warning signal when the current position of the arthroscope cannot be confirmed, the optical tracking device is occluded, or a reflective marker ball on a surgical instrument is occluded.
8. The AR navigation system for hip arthroscopic surgery of claim 1, further comprising: a signal module that sends real-time information to the AR glasses.
9. An AR navigation method for hip arthroscopy, characterized in that the AR navigation system for hip arthroscopy of claim 1 is used, the method comprising:
the first scanning device acquires preoperative hip joint CT scan data of the patient, and the three-dimensional modeling module generates a virtual three-dimensional model from the preoperative CT scan data and sends it to the planning module; the spherical surface of the femoral head is marked by a sphere-fitting technique, and the cam deformity protruding beyond the spherical surface is delineated at the femoral head-neck junction;
the planning module receives the virtual three-dimensional model, identifies osteophytes, forms a planned surgical procedure on the virtual three-dimensional model, and sends the planned surgical procedure to the registration module;
the registration module identifies feature points of an X-ray image captured intraoperatively by the second scanning device, and registers the feature points with the virtual three-dimensional model containing the planned surgical procedure in the same coordinate system to form a three-dimensional navigation image;
and the AR glasses receive and display the three-dimensional navigation image.
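The sphere-fitting technique used to mark the femoral-head sphere is not spelled out in the claims; one common choice is a linear least-squares sphere fit to surface points segmented from the CT data. The sketch below is an illustrative assumption, not necessarily the patented method:

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit: returns (center, radius).
    Uses the linearization |p|^2 = 2 c·p + (r^2 - |c|^2), so the
    unknowns [c_x, c_y, c_z, r^2 - |c|^2] solve a linear system."""
    P = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```

Given femoral-head surface points from the CT segmentation, the fitted center and radius define the reference sphere, and any head-neck surface point lying outside that radius can then be flagged as part of the cam deformity.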
10. The AR navigation method for hip arthroscopic surgery of claim 9, wherein sphere fitting of the femoral head by a Gaussian algorithm based on the CT scan data comprises:
converting the image into a single-channel grayscale image; selecting a scale-space factor σ, and smoothing the image with Gaussian operators at different scale-space factors k·σ, where k ranges from 1.0 to 5.0;
define the DoG operator:
g1(x, y) = G_σ1(x, y) ∗ I(x, y)
g2(x, y) = G_σ2(x, y) ∗ I(x, y)
DoG = G_σ1 − G_σ2
where ∗ denotes two-dimensional convolution; the difference image after Gaussian smoothing is:
g1(x, y) − g2(x, y) = (G_σ1 − G_σ2) ∗ I(x, y) = DoG ∗ I(x, y)
calculating the difference images of the Gaussian-smoothed images under different scale-space factors and performing morphological processing; suppose that for an image I(x, y), Gaussian smoothing with scale σ is used:
G_σ(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))
the circle in the image is obtained by processing the image using the findContours function in opencv and the features of the circle.
CN202210886790.2A 2022-07-26 2022-07-26 AR navigation system and method for hip arthroscopy operation Pending CN115317129A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210886790.2A CN115317129A (en) 2022-07-26 2022-07-26 AR navigation system and method for hip arthroscopy operation


Publications (1)

Publication Number Publication Date
CN115317129A true CN115317129A (en) 2022-11-11

Family

ID=83919835

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210886790.2A Pending CN115317129A (en) 2022-07-26 2022-07-26 AR navigation system and method for hip arthroscopy operation

Country Status (1)

Country Link
CN (1) CN115317129A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117064550A (en) * 2023-08-31 2023-11-17 中日友好医院(中日友好临床医学研究所) Virtual simulation system and device for treating femoral head necrosis by external shock waves
CN117064550B (en) * 2023-08-31 2024-04-30 中日友好医院(中日友好临床医学研究所) Virtual simulation system and device for treating femoral head necrosis by external shock waves


Similar Documents

Publication Publication Date Title
WO2022126828A1 (en) Navigation system and method for joint replacement surgery
US10398514B2 (en) Systems and methods for sensory augmentation in medical procedures
US11612402B2 (en) Method and apparatus for treating a joint, including the treatment of cam-type femoroacetabular impingement in a hip joint and pincer-type femoroacetabular impingement in a hip joint
US11701182B2 (en) Systems and methods for determining a joint center of rotation during a procedure
US20200038112A1 (en) Method for augmenting a surgical field with virtual guidance content
EP3273854B1 (en) Systems for computer-aided surgery using intra-operative video acquired by a free moving camera
US20210121237A1 (en) Systems and methods for augmented reality display in navigated surgeries
US9320421B2 (en) Method of determination of access areas from 3D patient images
AU2022204673A1 (en) Systems and methods for sensory augmentation in medical procedures
US8109942B2 (en) Computer-aided methods, systems, and apparatuses for shoulder arthroplasty
US20050203384A1 (en) Computer assisted system and method for minimal invasive hip, uni knee and total knee replacement
US20190076195A1 (en) Articulating laser incision indication system
US11957418B2 (en) Systems and methods for pre-operative visualization of a joint
US20210259774A1 (en) Systems and methods for visually guiding bone removal during a surgical procedure on a joint
JP2023513692A (en) Systems and methods for sensory augmentation in medical procedures
JP7331223B2 (en) Fluoroscopic robot artificial implantation system and method
CN115317129A (en) AR navigation system and method for hip arthroscopy operation
Thabit et al. Augmented reality guidance for rib fracture surgery: a feasibility study
Stindel et al. Bone morphing: 3D reconstruction without pre-or intra-operative imaging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination