CN117462252A - Medical navigation equipment and method - Google Patents

Medical navigation equipment and method

Info

Publication number
CN117462252A
Authority
CN
China
Prior art keywords
unit
optical signal
information
coordinate system
surgical tool
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210867988.6A
Other languages
Chinese (zh)
Inventor
郭楚
徐晓龙
何智圣
张柳云
陈德方
刘梦星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Mindray Technology Co Ltd
Original Assignee
Wuhan Mindray Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Mindray Technology Co Ltd filed Critical Wuhan Mindray Technology Co Ltd
Priority to CN202210867988.6A priority Critical patent/CN117462252A/en
Publication of CN117462252A publication Critical patent/CN117462252A/en
Pending legal-status Critical Current

Links

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 - Tracking techniques
    • A61B 2034/2051 - Electromagnetic tracking systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Surgical Instruments (AREA)

Abstract

The invention discloses a medical navigation device and method. The medical navigation device includes a first reflection unit, a tracking unit, a processing unit and a prompting unit. The first reflection unit is connected with an orthopedic surgical tool and is used to reflect a first optical signal to form a second optical signal; the tracking unit is used to transmit the first optical signal to the first reflection unit, receive the second optical signal fed back by the first reflection unit, and send the second optical signal to the processing unit; the processing unit is used to determine surgical guidance information according to the second optical signal and the positioning information of the surgical site; the prompting unit is used to prompt the surgical guidance information. With the invention, a doctor can obtain surgical guidance information through the prompting unit, so that the doctor can accurately locate the intended insertion position (the intramedullary nail entry point) on the surgical site and accurately control the intended insertion direction (the nail entry direction), which reduces the number of operations, shortens the operation time, and improves surgical quality and efficiency.

Description

Medical navigation equipment and method
Technical Field
The invention relates to the field of medical technology, and in particular to a medical navigation device and a medical navigation method.
Background
Proximal femoral fractures such as intertrochanteric fractures, and diaphyseal fractures of long bones such as the tibia and humerus, are often treated by intramedullary nail internal fixation, which facilitates fracture healing. During such fracture surgery, the selection of the position and direction of the intramedullary nail entry point is critical, as it affects the surgical outcome and postoperative recovery.
To better select the entry point, the prior art generally confirms the entry point and entry direction by driving in a guide pin. Before driving in the guide pin, the doctor holds it by hand and presses its tip against the selected nail entry point, then uses C-arm X-ray fluoroscopy to check the position of the pin tip and the direction of the pin on an image; this inevitably increases the radiation dose received by the doctor. Moreover, relying only on subjective experience, it is difficult for a doctor to reach the desired entry point and entry direction by driving in the guide pin just once. The later adjustment process, carried out according to the fluoroscopy results, also greatly prolongs the operation time while increasing the number of fluoroscopy exposures and the radiation dose to both patient and doctor.
Disclosure of Invention
To overcome the problems and defects of the prior art, the invention aims to provide a medical navigation device and method that can assist doctors in quickly and accurately locating the nail entry point position and the nail entry direction.
To achieve the above object, the present invention firstly provides a medical navigation device for intramedullary nail feeding point navigation, comprising:
a first reflection unit connected with the orthopedic operation tool and used for reflecting the first optical signal to form a second optical signal;
the tracking unit is used for transmitting the first optical signal to the first reflecting unit, receiving the second optical signal fed back by the first reflecting unit and transmitting the second optical signal to the processing unit;
a processing unit for determining surgical guidance information based on the second optical signal and the positioning information of the surgical site; the surgical guidance information is used to assist in guiding the orthopaedic surgical tool to a desired insertion location on the surgical site and/or to assist in guiding the orthopaedic surgical tool to a desired insertion direction; the intended insertion location includes an intramedullary nail insertion point;
and the prompting unit is used for prompting the operation guiding information.
The invention also provides medical navigation equipment for assisting in guiding an orthopedic operation tool, which comprises the following components:
a first reflection unit connected with the orthopedic operation tool and used for reflecting the first optical signal to form a second optical signal;
the tracking unit is used for transmitting the first optical signal to the first reflecting unit, receiving the second optical signal fed back by the first reflecting unit and transmitting the second optical signal to the processing unit;
a processing unit for determining surgical guidance information based on the second optical signal and the positioning information of the surgical site; the surgical guidance information is used to assist in guiding the orthopaedic surgical tool to a desired insertion location on the surgical site and/or to assist in guiding the orthopaedic surgical tool to a desired insertion direction;
and the prompting unit is used for prompting the operation guiding information.
Optionally, the surgical guidance information includes at least one of the following:
first displacement deviation information of a current position of the orthopaedic surgical tool relative to a reference position;
first angular deviation information of a current position of the orthopaedic surgical tool relative to a reference position;
second displacement deviation information of the current position of the orthopaedic surgical tool relative to the expected insertion position;
second angular deviation information of the current direction of the orthopaedic surgical tool relative to the intended insertion direction;
optionally, the positioning information of the surgical site includes a three-dimensional model of the surgical site, the three-dimensional model of the surgical site including a target insertion position and/or a target insertion direction, the target insertion position on the three-dimensional model of the surgical site corresponding to the intended insertion position and the target insertion direction corresponding to the intended insertion direction.
Optionally, the processing unit is further configured to:
determining a real-time position and a real-time angle of the three-dimensional model of the orthopedic operation tool according to the second optical signal;
and determining the operation guiding information according to the relative relation between the real-time position of the three-dimensional model of the operation tool and the target insertion position and/or the relative relation between the real-time angle of the three-dimensional model of the operation tool and the target insertion direction.
Optionally, the processing unit is further configured to:
receiving a setting instruction, and determining a target insertion position and/or a target insertion direction according to the setting instruction;
or, calling the trained image recognition model, and obtaining the target insertion position and/or the target insertion direction according to the image recognition model and the three-dimensional model of the operation part.
Optionally, the processing unit is further configured to:
acquiring a first space transformation matrix of a first reflection unit coordinate system corresponding to the first reflection unit relative to a tracking unit coordinate system corresponding to the tracking unit;
acquiring a second space transformation matrix of a surgical tool coordinate system corresponding to the orthopedic surgical tool relative to the first reflection unit coordinate system;
obtaining a third space transformation matrix of the surgical tool coordinate system relative to the tracking unit coordinate system according to the first space transformation matrix and the second space transformation matrix;
and determining the real-time position and the real-time angle of the three-dimensional model of the surgical tool in the coordinate system of the tracking unit according to the second optical signal and the third space transformation matrix.
Optionally, the medical navigation device further comprises a second reflection unit arranged on the surgical site; the tracking unit is also used for receiving a third optical signal returned by the second reflection unit; the processing unit is further configured to:
acquiring a reflecting unit pose information matrix of the second reflecting unit in a tracking unit coordinate system corresponding to the tracking unit;
acquiring a fourth spatial transformation matrix of the surgical site relative to the second reflection unit;
obtaining a surgical site pose information matrix of the surgical site in the tracking unit coordinate system according to the reflection unit pose information matrix and the fourth spatial transformation matrix;
and determining the position and the angle of the three-dimensional model of the surgical site in the tracking unit coordinate system according to the third optical signal and the surgical site pose information matrix.
Optionally, the processing unit is further configured to:
calling a trained statistical shape model;
and (5) inputting the positioning information of the operation part into the statistical shape model to obtain a three-dimensional model of the operation part.
Optionally, the prompting unit comprises a display screen and/or augmented reality glasses.
The invention also provides a medical navigation method applied to a processing unit of a medical navigation device; the medical navigation device further comprises a tracking unit, a first reflection unit and a prompting unit, the first reflection unit being connected with an orthopedic surgical tool. The medical navigation method comprises the following steps:
receiving a second optical signal transmitted by the tracking unit, wherein the second optical signal is formed after the first optical signal transmitted by the tracking unit is reflected by the first reflecting unit;
determining surgical guidance information based on the second optical signal and the positioning information of the surgical site; the guidance information is used to assist in guiding the orthopaedic surgical tool to a desired insertion location on the surgical site and/or to assist in guiding the orthopaedic surgical tool to a desired insertion direction;
and transmitting the surgical guidance information to the prompting unit so that the prompting unit prompts the surgical guidance information.
Compared with the prior art, the invention has the following beneficial effects: a doctor can obtain surgical guidance information through the prompting unit, operate the orthopedic surgical tool to move to the intended insertion position on the surgical site according to that information, and be assisted in adjusting the orthopedic surgical tool to the intended insertion direction, so that the doctor can accurately locate the intended insertion position (the intramedullary nail entry point) on the surgical site and accurately control the intended insertion direction (the nail entry direction), which reduces the number of operations, shortens the operation time, and improves surgical quality and efficiency.
Drawings
To more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the invention, and that a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a medical navigation device according to an embodiment of the present invention;
FIG. 2 is an anteroposterior (AP) view image of a femur in accordance with an embodiment of the present invention;
FIG. 3 is a lateral view image of a femur in accordance with an embodiment of the present invention;
FIG. 4 is a schematic diagram of a first reflective unit according to an embodiment of the invention;
FIG. 5 is a schematic diagram of a second embodiment of a first reflective unit;
FIG. 6 is a schematic diagram of a third embodiment of a first reflective unit;
FIG. 7 is a schematic illustration of an intended insertion position according to an embodiment of the present invention;
FIG. 8 is a schematic view of the intended insertion direction according to an embodiment of the present invention;
FIG. 9 is a flowchart of a medical navigation method according to an embodiment of the present invention;
FIG. 10 is a flowchart of a medical navigation method according to an embodiment of the invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. Furthermore, the terms "first," "second," "third," and the like in embodiments of the present invention are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. In embodiments of the invention, "for example", "example", and "such as" are used to mean "serving as an example, instance, or illustration". Any embodiment described herein as "for example," "example," or "such as" is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is presented to enable any person skilled in the art to make and use the invention. In the following description, details are set forth for purposes of explanation. It will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known structures and processes have not been described in detail so as not to obscure the description of the invention with unnecessary detail. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
An embodiment of the invention provides a medical navigation device that can be used to assist in guiding an orthopedic surgical tool, in particular for intramedullary nail entry point navigation. As shown in FIG. 1, the medical navigation device includes a first reflection unit 1, a tracking unit 2, a processing unit 3, and a prompting unit 4. Wherein:
the first reflection unit 1 is connected with the orthopedic operation tool 5 and is used for reflecting the first optical signal to form a second optical signal; the first reflecting unit 1 may be an object with optical reflecting capability, such as a ball with a reflective material coated on the surface or a ball made of the reflective material, and the orthopedic operation tool 5 may include a puncture guide needle for searching a nail feeding position and a nail feeding direction of the intramedullary nail.
The tracking unit 2 is configured to transmit the first optical signal to the first reflection unit 1, receive the second optical signal fed back by the first reflection unit 1, and send the second optical signal to the processing unit 3. The tracking unit 2 may be an optical camera, more specifically an infrared optical camera (for example a binocular or trinocular infrared optical camera) that emits infrared light. The infrared light emitted by the camera forms the first optical signal; this signal illuminates the first reflection unit 1 and is reflected to form the second optical signal, which the camera receives and uses to locate and track the first reflection unit 1.
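The patent does not specify how the binocular camera converts the reflected infrared signal into a 3D position. The following is a minimal sketch, under stated assumptions, of the standard two-view (DLT) triangulation such a tracker typically performs; the projection matrices P1 and P2 and the pixel coordinates uv1 and uv2 of the reflector centroid in each image are assumed inputs (detecting the bright reflector blob in each infrared image is assumed to happen upstream), and none of these names come from the patent.

```python
import numpy as np

def triangulate_marker(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one reflective marker.

    P1, P2   : 3x4 projection matrices of the two calibrated infrared cameras.
    uv1, uv2 : (u, v) pixel coordinates of the marker centroid in each image.
    Returns the marker position (x, y, z) in the tracking-unit frame.
    """
    A = np.stack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # Homogeneous solution: right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenise to metric coordinates
```

A trinocular camera simply contributes two more rows per extra view, which over-determines the system and improves robustness.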
The processing unit 3 is used to determine surgical guidance information according to the second optical signal and the positioning information of the surgical site 6. The surgical guidance information is used to assist in guiding the orthopedic surgical tool 5 to the intended insertion position on the surgical site 6 and/or to assist in guiding the orthopedic surgical tool 5 to the intended insertion direction; in particular, the orthopedic surgical tool 5 may first be moved to the intended insertion position and then guided to adjust to the intended insertion direction, or it may first be oriented in the intended insertion direction and then moved to the intended insertion position.
Wherein the operative site 6 comprises a femur, the intended insertion site comprising a pre-set intramedullary nail entry point; further, the intended insertion direction includes a predetermined intramedullary nail insertion direction.
In a clinical femoral operation, as shown in FIG. 2, on the anteroposterior (AP) image the intended insertion position may be the apex of the greater trochanter of the femur, and the intended insertion direction may be the direction forming an included angle of 5° between the puncture guide pin and the femoral shaft axis; as shown in FIG. 3, on the lateral image the puncture guide pin approximately coincides with the femoral neck axis, and the entry point position and entry direction are then ideal.
The prompting unit 4 is used for prompting operation guiding information. Wherein the prompting unit 4 may comprise a display screen or AR (augmented reality) glasses.
With the medical navigation device of this embodiment, a doctor can obtain surgical guidance information through the prompting unit 4, move the guide pin to the intended intramedullary nail entry point on the femur according to that information, and be assisted in adjusting the puncture guide pin to the intended entry direction once it reaches the entry point. The doctor can thus accurately locate the intramedullary nail entry point on the surgical site 6 and accurately control the entry direction, which reduces the number of operations, shortens the operation time, and improves surgical quality and efficiency.
In this embodiment, the surgical guidance information includes at least one of the following:
first displacement deviation information of the current position of the orthopedic surgical tool 5 relative to a reference position; for example, the reference position is a specified known position (with coordinate a; a = 0 if the reference position is the origin), the current position coordinate of the orthopedic surgical tool 5 is b, and the first displacement deviation information is b - a; the intended insertion position is also known (with coordinate c), so its coordinate relative to the reference position is c - a; the doctor moves the orthopedic surgical tool 5 while observing, through the prompting unit 4, the dynamic change of the first displacement deviation information, adjusts the displacement of the tool, and when b - a = c - a the orthopedic surgical tool 5 has been displaced to the intended insertion position;
first angular deviation information of the current position of the orthopedic surgical tool 5 relative to the reference position; for example, the reference position has a direction angle α, the orthopedic surgical tool 5 has a direction angle β, and the first angular deviation information is β - α; the intended insertion direction is also known (with direction angle γ), so its angle relative to the reference position is γ - α; the doctor moves the orthopedic surgical tool 5 while observing, through the prompting unit 4, the dynamic change of the first angular deviation information, adjusts the direction of the tool, and when β - α = γ - α the insertion direction of the orthopedic surgical tool 5 is the intended insertion direction;
second displacement deviation information of the current position of the orthopedic surgical tool 5 relative to the intended insertion position; for example, the current position of the orthopedic surgical tool 5 is b and the intended insertion position is c, so the second displacement deviation information is b - c; the doctor moves the orthopedic surgical tool 5 while observing, through the prompting unit 4, the dynamic change of the second displacement deviation information, adjusts the displacement of the tool, and when b - c = 0 the orthopedic surgical tool 5 has been displaced to the intended insertion position;
second angular deviation information of the current direction of the orthopedic surgical tool 5 relative to the intended insertion direction; specifically, the second angular deviation information indicates the angular deviation between the tool's own axis and the intended insertion direction. When the second angular deviation information is 0, the current axial direction of the orthopedic surgical tool 5 is parallel to the intended insertion direction; keeping this direction and displacing the tool to the intended insertion position then yields the ideal entry point and entry direction.
On the other hand, in the second angular deviation information the intended insertion direction may be defined relative to the femoral shaft axis. For example, if the intended insertion direction is 5° off the femoral shaft axis, then once the orthopedic surgical tool 5 has been brought to the femoral shaft axis region, the doctor adjusts the inclination of the tool relative to the femoral shaft axis to 5° according to the second angular deviation information to obtain the intended nail entry angle.
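As a concrete illustration of the deviation quantities above, the sketch below computes the second displacement deviation (b - c) and the second angular deviation between the tool's own axis and the intended insertion direction. It is a minimal sketch: the variable names (tool_pos, tool_axis, target_pos, target_dir) are illustrative, and all vectors are assumed to be expressed in the same tracking unit coordinate system.

```python
import numpy as np

def guidance_deviations(tool_pos, tool_axis, target_pos, target_dir):
    """Second displacement / angular deviation of the tool relative to the intended pose.

    tool_pos   : current tip position b of the surgical tool, shape (3,).
    tool_axis  : vector along the tool's own axis.
    target_pos : intended insertion position c (e.g. the greater trochanter apex).
    target_dir : intended insertion direction (e.g. 5 degrees off the femoral shaft axis).
    """
    disp_dev = tool_pos - target_pos               # second displacement deviation, b - c
    a = tool_axis / np.linalg.norm(tool_axis)
    d = target_dir / np.linalg.norm(target_dir)
    cos_t = np.clip(np.dot(a, d), -1.0, 1.0)
    ang_dev_deg = np.degrees(np.arccos(cos_t))     # second angular deviation
    return disp_dev, ang_dev_deg

# When disp_dev is (numerically) zero and ang_dev_deg is 0, the tool sits at the
# intended entry point with its axis parallel to the intended entry direction.
```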
In this embodiment, the positioning information of the surgical site 6 includes a three-dimensional model of the surgical site 6, the three-dimensional model of the surgical site including a target insertion position and/or a target insertion direction, the target insertion position on the three-dimensional model of the surgical site corresponding to the intended insertion position, and the target insertion direction corresponding to the intended insertion direction. The method for acquiring the three-dimensional model of the surgical site can comprise the following steps:
an optical reflector is mounted on the probe and the probe tip is then moved over the outer surface of the surgical site 6 (e.g., the greater trochanter of the femur). At this time, the tracking unit 2 (optical camera) receives the infrared light signal returned from the optical reflector on the probe, and converts the infrared light signal into coordinate values (x G ,y G ,z G ) T The set of the coordinate values is point cloud data, and the point cloud data is a three-dimensional coordinate data set of the probe tip under an infrared optical camera coordinate system;
the point cloud data may be represented as a set { (x) Gi ,y Gi ,z Gi ) T I=1, 2, …, n }, where n is ≡3.
Based on the point cloud data, a three-dimensional model of the operation site can be obtained, wherein the three-dimensional model of the operation site is a three-dimensional STL (Stereolithography) model.
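For reference, the STL format referred to above is simply a list of triangles. The sketch below, an assumed illustration rather than part of the patent, writes a reconstructed surface (a vertex array plus a triangle index array) to an ASCII STL file.

```python
import numpy as np

def write_ascii_stl(path, vertices, faces, name="surgical_site"):
    """Write a triangulated surface to an ASCII STL file.

    vertices : (m, 3) array of surface points.
    faces    : (k, 3) array of vertex indices, one row per triangle.
    """
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for i, j, k in faces:
            a, b, c = vertices[i], vertices[j], vertices[k]
            n = np.cross(b - a, c - a)
            length = np.linalg.norm(n)
            n = n / length if length > 0 else n     # facet normal
            f.write(f"  facet normal {n[0]} {n[1]} {n[2]}\n    outer loop\n")
            for v in (a, b, c):
                f.write(f"      vertex {v[0]} {v[1]} {v[2]}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")
```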
This embodiment replaces the physical surgical site 6 with the three-dimensional model of the surgical site, so the physical surgical site 6 does not need to be exposed to X-ray fluoroscopy, which reduces the radiation dose to doctors and patients.
In this embodiment, the processing unit 3 is further configured to:
calling a trained statistical shape model, i.e. an SSM (Statistical Shape Model);
the positioning information of the surgical site 6 is input to the statistical shape model to obtain a three-dimensional model of the surgical site.
Whether the three-dimensional model of the surgical site is modeled accurately directly affects the accuracy of the target insertion position and the target insertion direction. To improve the accuracy of the reconstructed STL model, a large amount of CT image data containing proximal femoral anatomical information can be acquired in advance, and a statistical shape model representing the greater trochanter apex and nearby anatomical structures can be trained and stored beforehand, for example by deep learning methods. The point cloud data D acquired with the optical probe is used as the input of the SSM, which, based on the coordinates of each point and the topological relations between the points, outputs and visualizes an STL model that better matches the proximal femoral anatomy of the patient. In addition, point cloud acquisition and SSM imaging can run synchronously: while the doctor acquires point cloud data of the femoral trochanter, the specific shape of the displayed model is computed in real time from the pre-trained SSM.
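The patent does not describe the internals of the SSM. The sketch below assumes a common PCA-style formulation: a shape is the pre-trained mean shape plus a weighted sum of shape modes, and the mode weights are fitted to the sparse probe point cloud D by nearest-neighbour correspondence and least squares. The names mean_shape and modes, the fixed iteration count, and the assumption that D is already registered to the model frame are all illustrative choices, not taken from the patent.

```python
import numpy as np

def fit_ssm(point_cloud, mean_shape, modes, n_iter=10):
    """Fit a PCA-based statistical shape model to a sparse probe point cloud.

    point_cloud : (n, 3) probe-tip points D, assumed registered to the model frame.
    mean_shape  : (m, 3) vertices of the pre-trained mean femur surface.
    modes       : (k, m, 3) principal shape modes learned from the training CT data.
    Returns the (m, 3) vertices of the patient-specific surface estimate.
    """
    k = modes.shape[0]
    b = np.zeros(k)                                           # mode weights
    for _ in range(n_iter):
        shape = mean_shape + np.tensordot(b, modes, axes=1)   # current estimate
        # Nearest model vertex for every probe point (brute-force correspondence).
        d2 = ((point_cloud[:, None, :] - shape[None, :, :]) ** 2).sum(-1)
        idx = d2.argmin(axis=1)
        # Gauss-Newton style update: move the matched vertices toward the points.
        A = modes[:, idx, :].reshape(k, -1).T                 # (3n, k)
        r = (point_cloud - shape[idx]).reshape(-1)            # residuals, (3n,)
        b += np.linalg.lstsq(A, r, rcond=None)[0]
    return mean_shape + np.tensordot(b, modes, axes=1)
```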
In this embodiment, the processing unit 3 is further configured to:
determining a real-time position and a real-time angle of the three-dimensional model of the orthopedic surgical tool 5 according to the second optical signal;
and determining the operation guiding information according to the relative relation between the real-time position of the three-dimensional model of the operation tool and the target insertion position and/or the relative relation between the real-time angle of the three-dimensional model of the operation tool and the target insertion direction.
In this embodiment, the positional and directional relationships between the orthopedic surgical tool 5 and the surgical site 6 are converted into positional and directional relationships between the three-dimensional model of the surgical tool and the three-dimensional model of the surgical site, which the processing unit 3 computes and processes, so that the displacement and angle adjustment of the orthopedic surgical tool 5 can be controlled more accurately.
In this embodiment, the processing unit 3 is further configured to:
receiving a setting instruction, and determining a target insertion position and/or a target insertion direction according to the setting instruction; in particular, the surgeon may view the three-dimensional model of the surgical site, empirically selecting a target insertion location and/or a target insertion direction on the three-dimensional model of the surgical site. Then, a setting instruction is sent to the processing unit 3, and the determined target insertion position and/or target insertion direction are input to the processing unit 3.
Alternatively, a trained image recognition model is invoked, and the target insertion position and/or target insertion direction is obtained from the image recognition model and the three-dimensional model of the surgical site. The image recognition model may be a convolutional neural network. Determining the target insertion position and/or target insertion direction with an image recognition model does not depend on the doctor's experience and is therefore applicable to all doctors; compared with subjective experience, the target insertion position and direction can also become more accurate as training is optimized.
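The patent only states that the image recognition model may be a convolutional neural network. As one illustrative possibility, and purely as an assumption about how such a model could be structured, the sketch below applies shared 1-D convolutions to the surface points of the three-dimensional model and regresses a target insertion position and direction; the architecture, layer sizes, and the PyTorch framework are not specified by the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntryPointNet(nn.Module):
    """Toy point-cloud regressor for a target insertion point and direction."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(            # shared per-point MLP as 1-D convolutions
            nn.Conv1d(3, 64, 1), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.ReLU(),
            nn.Conv1d(128, 256, 1), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, 6),                    # 3 values for position, 3 for direction
        )

    def forward(self, pts):                       # pts: (batch, n_points, 3)
        x = self.features(pts.transpose(1, 2))    # (batch, 256, n_points)
        x = x.max(dim=2).values                   # order-invariant pooling over points
        out = self.head(x)
        pos, direction = out[:, :3], out[:, 3:]
        return pos, F.normalize(direction, dim=1)
```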
In this embodiment, the processing unit 3 is further configured to:
(S11) acquiring a first space transformation matrix of the first reflection unit coordinate system corresponding to the first reflection unit 1 relative to the tracking unit coordinate system corresponding to the tracking unit 2. Specifically, the tracking unit 2 can acquire in real time the spatial pose (position and attitude) information of the first reflection unit 1, represented by a pose matrix; let this first space transformation matrix be the pose matrix R, i.e. the space transformation matrix of the first reflection unit coordinate system relative to the tracking unit coordinate system. R is known information and can be obtained directly.
(S12) obtaining a second space transformation matrix of the surgical tool coordinate system corresponding to the orthopedic surgical tool 5 relative to the first reflection unit coordinate system. Specifically, the first reflection unit 1 is fixedly connected with the orthopedic surgical tool 5, so the pose matrix of the surgical tool coordinate system relative to the first reflection unit coordinate system can be calculated from their mechanical mounting dimensions; this pose matrix is the second space transformation matrix, denoted A. A is also known information and can be obtained directly.
(S13) obtaining a third space transformation matrix of the surgical tool coordinate system relative to the tracking unit coordinate system according to the first space transformation matrix and the second space transformation matrix. Specifically, let the third spatial transformation matrix be N, the calculation formula of N is as follows:
N=R×A;
The third space transformation matrix N also represents the pose matrix of the orthopedic surgical tool 5 in the tracking unit coordinate system. The first space transformation matrix R, the second space transformation matrix A and the third space transformation matrix N are all 4x4 homogeneous matrices containing both position information and attitude information, and N can be written as:
N = [ r d ]
    [ 0 1 ]
where the bottom row is (0 0 0 1), the matrix r represents the attitude transformation of the surgical tool coordinate system relative to the tracking unit coordinate system, also called the direction cosine matrix, and is a 3x3 unit orthogonal matrix, and the three-dimensional vector d represents the displacement of the surgical tool coordinate system relative to the tracking unit coordinate system, i.e. the coordinates of the origin of the surgical tool coordinate system in the tracking unit coordinate system.
(S14) determining the real-time position and real-time angle of the three-dimensional model of the surgical tool in the tracking unit coordinate system according to the second optical signal and the third space transformation matrix. Specifically, based on the third space transformation matrix N, for an arbitrary point P with coordinates (x_W, y_W, z_W, 1)^T in the surgical tool coordinate system, its coordinates (x_G, y_G, z_G, 1)^T in the tracking unit coordinate system are given by calculation formula (1):
(x_G, y_G, z_G, 1)^T = N × (x_W, y_W, z_W, 1)^T    (1)
If (x_W, y_W, z_W, 1)^T represents the coordinates of the needle tip of the orthopedic surgical tool 5 in the surgical tool coordinate system, this formula gives the real-time position of the needle tip in the tracking unit coordinate system, achieving real-time tracking and positioning of the needle tip of the orthopedic surgical tool 5 during minimally invasive surgery.
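Steps S11 to S14 amount to composing two homogeneous transforms and pushing the tool-tip coordinates through the result. A minimal sketch of that computation is given below; the matrix names R, A and N follow the text above, while the function names and the numpy representation are illustrative assumptions.

```python
import numpy as np

def homogeneous(r, d):
    """Build a 4x4 pose matrix from a 3x3 rotation r and a translation 3-vector d."""
    T = np.eye(4)
    T[:3, :3] = r
    T[:3, 3] = d
    return T

def track_tool_tip(R, A, tip_in_tool):
    """Tool-tip position in the tracking-unit frame (calculation formula (1)).

    R : 4x4 pose of the first reflection unit in the tracking-unit frame (from the camera).
    A : 4x4 pose of the surgical-tool frame relative to the first reflection unit
        (known from the mechanical mounting dimensions).
    tip_in_tool : (x_W, y_W, z_W) coordinates of the needle tip in the tool frame.
    """
    N = R @ A                                # third space transformation matrix
    p_tool = np.append(tip_in_tool, 1.0)     # (x_W, y_W, z_W, 1)^T
    p_track = N @ p_tool                     # (x_G, y_G, z_G, 1)^T
    return p_track[:3]
```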
In a specific application, after reconstruction of the STL model (three-dimensional model of the surgical site) representing the proximal femoral anatomy of the affected side is completed, it is generally required that the relative positional relationship between the patient and the tracking unit 2 does not change, i.e. neither the patient's surgical site nor the tracking unit 2 may be moved. Once relative movement between the two occurs, the pose of the STL model reconstructed from the point cloud data in the tracking unit coordinate system no longer accurately represents the actual pose of the proximal femur of the affected side after the movement, and the surgical guidance information is no longer accurate. However, the position of the patient relative to the tracking unit 2 can inevitably change during the operation, for example when a medical staff member accidentally touches the support of the tracking unit 2.
For this reason, in this embodiment the medical navigation device further includes a second reflection unit 7 mounted on the surgical site 6, and the tracking unit 2 is further configured to receive a third optical signal returned by the second reflection unit 7; the third optical signal is also an infrared light signal. Specifically, the second reflection unit 7 is likewise an optical reflector with light-reflecting capability and can be fixed to the surgical site 6 using a tool such as a Kirschner wire, so that once the relative positional relationship between the subsequently reconstructed three-dimensional model of the surgical site and the second reflection unit 7 has been established, it will not change when the patient's body position changes or the tracking unit 2 is moved. Further, the processing unit 3 is further configured to:
(S21) acquiring a reflection unit pose information matrix of the second reflection unit 7 in the tracking unit coordinate system corresponding to the tracking unit 2. Specifically, the reflection unit pose information matrix represents the spatial transformation matrix of the second reflection unit coordinate system in the tracking unit coordinate system; it is denoted R_M. R_M is known information and can be obtained directly.
(S22) acquiring a fourth spatial transformation matrix of the surgical site 6 relative to the second reflection unit 7. The second reflection unit 7 is fixedly connected with the surgical site 6, and the relative positional relationship between the reconstructed three-dimensional model of the surgical site and the second reflection unit 7 can be obtained by computing their respective pose information matrices in the tracking unit coordinate system at the moment reconstruction is completed; the fourth spatial transformation matrix is therefore also known and is denoted M.
(S23) obtaining the surgical site pose information matrix of the surgical site 6 in the tracking unit coordinate system from the reflection unit pose information matrix and the fourth spatial transformation matrix. The surgical site pose information matrix represents the spatial transformation matrix of the surgical site 6 coordinate system relative to the tracking unit coordinate system; it is denoted R_S' and computed as:
R_S' = R_M × M
(S24) determining the position and angle of the three-dimensional model of the surgical site in the tracking unit coordinate system according to the third optical signal and the surgical site pose information matrix. The specific computation follows calculation formula (1) above.
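The same composition keeps the surgical-site model registered when the patient or camera moves: R_M is re-measured from the second reflection unit in every frame, while M (the site relative to its reflector) is fixed once reconstruction is finished. A minimal sketch under those assumptions:

```python
import numpy as np

def update_site_pose(R_M, M):
    """Surgical-site pose in the tracking-unit frame: R_S' = R_M x M."""
    return R_M @ M

def site_model_in_tracker(vertices_site, R_S_prime):
    """Map the STL model vertices (expressed in the surgical-site frame) into the
    tracking-unit frame so they stay registered after patient movement."""
    v_h = np.hstack([vertices_site, np.ones((len(vertices_site), 1))])  # homogeneous
    return (R_S_prime @ v_h.T).T[:, :3]
```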
In this way, real-time positioning of the three-dimensional model of the surgical site is achieved, and the correct relative spatial relationship with the three-dimensional model of the surgical tool is maintained even after the patient's body position changes relative to the tracking unit 2, so the real-time navigation result remains correct.
In this embodiment, the prompting unit 4 includes a display screen and/or augmented reality glasses. The surgical guidance information can be displayed on the display screen so that the doctor can observe it conveniently. When wearing the augmented reality glasses, the doctor does not need to turn away to look at a display screen beside the operating area, which improves the doctor's operating efficiency.
In this embodiment, as shown in FIG. 4, the first reflection unit 1 may include a handle 11 and optical reflectors 12, the optical reflectors 12 being arranged at the upper end of the handle 11 and the lower end of the handle 11 being connectable to the orthopedic surgical tool 5. Specifically, the handle 11 may have a "Y"-shaped structure comprising a main rod 111 and two struts 112, the two struts 112 being connected to the upper end of the main rod 111 and the lower end of the main rod 111 being connected to the orthopedic surgical tool 5. There may be four optical reflectors 12, two arranged on the main rod 111 and the other two arranged on the two struts 112, respectively. This improves the accuracy with which the tracking unit 2 locates the first reflection unit 1, and therefore the accuracy of the surgical guidance information.
In another implementation of this embodiment, as shown in FIG. 5, one or more struts 112 may be added to the "Y"-shaped structure; as shown in the figure, the handle 11 may include a main rod 111 and three struts 112, with two optical reflectors 12 arranged on the main rod 111 and one optical reflector 12 arranged on each of the three struts 112.
In another implementation of this embodiment, as shown in FIG. 6, the handle 11 may have a "T"-shaped structure comprising a cross bar 113 and a vertical bar 114, one end of the vertical bar 114 being connected to the orthopedic surgical tool 5 and the other end being connected to the cross bar 113. Two or more optical reflectors 12 are arranged on the vertical bar 114 and on the cross bar 113, respectively.
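The patent does not state how the camera derives the handle's full pose from the reflector balls. A common approach, offered here only as an assumption, is to register the measured ball centres against their known positions in the handle's own frame with the Kabsch (SVD) algorithm; this is also why at least three non-collinear reflectors are required and why additional reflectors, as in the arrangements above, improve accuracy.

```python
import numpy as np

def kabsch_pose(model_pts, measured_pts):
    """Rigid transform (rotation R, translation t) mapping reflector centres defined
    in the handle frame (model_pts, (n, 3)) onto the centres measured by the tracking
    unit (measured_pts, (n, 3)); requires n >= 3 non-collinear points."""
    cm = model_pts.mean(axis=0)
    cs = measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cs)                 # 3x3 covariance
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ S @ U.T
    t = cs - R @ cm
    return R, t
```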
In the prior art, proximal femoral fractures such as intertrochanteric fractures, and diaphyseal fractures of the tibia, humerus and other long bones, are often treated by intramedullary nail internal fixation, which facilitates fracture healing. The selection of the intramedullary nail entry point position and entry direction is critical to the whole operation, because they affect the later reaming of the medullary canal and the placement position and direction of the main intramedullary nail. If they are chosen improperly, the main nail may be difficult to advance into the medullary cavity; even if it is forced in, it may be unevenly loaded and considerably deformed inside the cavity, which easily destroys the fracture reduction, makes inaccurate distal locking more likely when a long main nail is used, and in severe cases can lead to delayed union, non-union or malunion of the fracture.
Taking the proximal femoral intramedullary nail procedure as an example, in the clinic the ideal entry point is located at the apex of the greater trochanter and near the femoral neck axis, as shown in FIG. 7. The ideal nail entry direction is offset 5° from the femoral shaft axis so as to match the lateral offset angle of the main nail, as shown in FIG. 8.
During the operation, the entry point and entry direction are confirmed by driving in a guide pin: the insertion point of the guide pin becomes the entry point of the later main nail, and the insertion direction of the guide pin becomes the entry direction of the later main nail. To better select the insertion point, before driving in the guide pin the doctor generally has to hold it by hand and press its tip against the selected insertion point, perform C-arm X-ray fluoroscopy, and check the position of the pin tip and the direction of the pin on the X-ray image; this inevitably increases the radiation dose received by the doctor.
At present, when performing an intramedullary nail operation, a doctor typically estimates the entry point by palpating the apex of the greater trochanter and then drives in a guide pin, whether by hand, with a holder, or with an electric drill, and confirms the entry point position and entry direction from fluoroscopic images. If, on the anteroposterior (AP) image, the entry point lies at the apex of the greater trochanter and the included angle between the guide pin and the femoral shaft axis is about 5°, and, on the lateral image, the guide pin approximately coincides with the femoral neck axis, then the entry point position and entry direction are ideal. If they are not ideal, the doctor estimates the position and angle to be adjusted from the fluoroscopic image, drives in a second guide pin without pulling out the first one, and again checks the position and direction of the second pin under fluoroscopy, repeating this until a satisfactory entry point position and direction are found.
In clinical practice, a guide pin driven in for the first time according to the doctor's subjective experience rarely reaches the desired entry point, and the entry direction is difficult to control. The later adjustment process based on the fluoroscopy results greatly prolongs the operation time, the required adjustments of position and angle cannot be controlled precisely, and the number of fluoroscopy exposures and the radiation dose to both patient and doctor increase.
To address these technical problems, with the medical navigation device of the embodiments of the invention a doctor can obtain surgical guidance information through the prompting unit 4, operate the orthopedic surgical tool 5 to move to the intended insertion position on the surgical site 6 according to that information, and be assisted in adjusting the orthopedic surgical tool 5 to the intended insertion direction. The doctor can therefore accurately locate the intended insertion position (the intramedullary nail entry point) on the surgical site 6 and accurately control the intended insertion direction (the nail entry direction), which reduces the number of operations, shortens the operation time, and improves surgical quality and efficiency.
An embodiment of the invention also provides a medical navigation method applied to the processing unit 3 of the medical navigation device provided above; the medical navigation device further comprises a tracking unit 2, a first reflection unit 1 and a prompting unit 4, the first reflection unit 1 being connected with an orthopedic surgical tool 5. As shown in FIG. 9, the medical navigation method includes steps 210, 220 and 230, as follows:
In step 210, the second optical signal transmitted by the tracking unit 2 is received, and the second optical signal is formed after the first optical signal transmitted by the tracking unit 2 is reflected by the first reflection unit 1.
Step 220, determining surgical guidance information according to the second optical signal and the positioning information of the surgical site 6; the guidance information is used to assist in guiding the orthopedic surgical tool 5 to the intended insertion position on the surgical site 6 and/or to assist in guiding the orthopedic surgical tool 5 to the intended insertion direction. In particular, the orthopedic surgical tool 5 may first be moved to the intended insertion position and then guided to adjust to the intended insertion direction, or it may first be oriented in the intended insertion direction and then moved to the intended insertion position.
Wherein the positioning information of the surgical site 6 comprises a three-dimensional model of the surgical site 6, the three-dimensional model of the surgical site comprises a target insertion position and/or a target insertion direction, the target insertion position on the three-dimensional model of the surgical site corresponds to the expected insertion position, and the target insertion direction corresponds to the expected insertion direction.
In step 220, the step of determining the surgical guidance information according to the second optical signal and the positioning information of the surgical site 6 may specifically include, as shown in FIG. 10:
Step 221, determining the real-time position and real-time angle of the three-dimensional model of the orthopedic surgical tool 5 according to the second optical signal;
step 222, determining the surgical guidance information according to the relative relationship between the real-time position of the three-dimensional model of the surgical tool and the target insertion position, and/or the relative relationship between the real-time angle of the three-dimensional model of the surgical tool and the target insertion direction.
Wherein the target insertion position and the target insertion direction can be obtained by the following two methods:
one is, receiving a setting instruction, and determining a target insertion position and/or a target insertion direction according to the setting instruction; in particular, the surgeon may view the three-dimensional model of the surgical site, empirically selecting a target insertion location and/or a target insertion direction on the three-dimensional model of the surgical site. Then, a setting instruction is sent to the processing unit 3, and the determined target insertion position and/or target insertion direction are input to the processing unit 3.
The other is to invoke a trained image recognition model and obtain the target insertion position and/or target insertion direction from the image recognition model and the three-dimensional model of the surgical site. The image recognition model may be a convolutional neural network. Determining the target insertion position and/or target insertion direction with an image recognition model does not depend on the doctor's experience and is therefore applicable to all doctors; compared with subjective experience, the target insertion position and direction can also become more accurate as training is optimized.
In step 220, the pose information of the orthopedic surgical tool 5 can be determined from the second optical signal and then combined with the pose information of the three-dimensional model of the surgical site to obtain the relative position and relative spatial angle between the orthopedic surgical tool 5 and the surgical site 6. The pose information of the orthopedic surgical tool 5 can be obtained as follows:
(S11) acquiring a first space transformation matrix of the first reflection unit coordinate system corresponding to the first reflection unit 1 relative to the tracking unit coordinate system corresponding to the tracking unit 2. Specifically, the tracking unit 2 can acquire in real time the spatial pose (position and attitude) information of the first reflection unit 1, represented by a pose matrix; let this first space transformation matrix be the pose matrix R, i.e. the space transformation matrix of the first reflection unit coordinate system relative to the tracking unit coordinate system. R is known information and can be obtained directly.
(S12) obtaining a second space transformation matrix of the surgical tool coordinate system corresponding to the orthopedic surgical tool 5 relative to the first reflection unit coordinate system. Specifically, the first reflection unit 1 is fixedly connected with the orthopedic surgical tool 5, so the pose matrix of the surgical tool coordinate system relative to the first reflection unit coordinate system can be calculated from their mechanical mounting dimensions; this pose matrix is the second space transformation matrix, denoted A. A is also known information and can be obtained directly.
(S13) obtaining a third space transformation matrix of the surgical tool coordinate system relative to the tracking unit coordinate system according to the first space transformation matrix and the second space transformation matrix. Specifically, let the third spatial transformation matrix be N, the calculation formula of N is as follows:
N=R×A;
The third space transformation matrix N also represents the pose matrix of the orthopedic surgical tool 5 in the tracking unit coordinate system. The first space transformation matrix R, the second space transformation matrix A and the third space transformation matrix N are all 4x4 homogeneous matrices containing both position information and attitude information, and N can be written as:
N = [ r d ]
    [ 0 1 ]
where the bottom row is (0 0 0 1), the matrix r represents the attitude transformation matrix of the surgical tool coordinate system relative to the tracking unit coordinate system, also called the direction cosine matrix, and is a 3x3 unit orthogonal matrix, and the three-dimensional vector d represents the displacement of the surgical tool coordinate system relative to the tracking unit coordinate system, i.e. the coordinates of the origin of the surgical tool coordinate system in the tracking unit coordinate system.
(S14) determining the real-time position and real-time angle of the three-dimensional model of the surgical tool in the tracking unit coordinate system according to the second optical signal and the third space transformation matrix. Specifically, based on the third space transformation matrix N, for an arbitrary point P with coordinates (x_W, y_W, z_W, 1)^T in the surgical tool coordinate system, its coordinates (x_G, y_G, z_G, 1)^T in the tracking unit coordinate system are given by calculation formula (1):
(x_G, y_G, z_G, 1)^T = N × (x_W, y_W, z_W, 1)^T    (1)
If (x_W, y_W, z_W, 1)^T represents the coordinates of the needle tip of the orthopedic surgical tool 5 in the surgical tool coordinate system, this formula gives the real-time position of the needle tip in the tracking unit coordinate system, achieving real-time tracking and positioning of the needle tip of the orthopedic surgical tool 5 during minimally invasive surgery.
On the other hand, the accuracy of the pose information of the surgical site 6 also affects the accuracy of the surgical guidance information. Therefore, in this embodiment the second reflection unit 7 is mounted on the surgical site 6, and the tracking unit 2 acquires the third optical signal returned by the second reflection unit 7. Accurate positioning of the surgical site 6 is then ensured by the following steps:
(S21) acquiring a reflecting unit pose information matrix of the second reflecting unit 7 in the tracking unit coordinate system corresponding to the tracking unit 2. Specifically, the reflecting unit pose information matrix may represent a spatial transformation matrix of the coordinate system of the second reflecting unit 7 in the tracking unit coordinate system, where the reflecting unit pose information matrix is set as R M ,R M Is known information and can be obtained directly.
(S22) acquiring a fourth spatial transformation matrix of the surgical site 6 with respect to the second reflection unit 7. The second reflecting unit 7 is fixedly connected with the operation position 6, and the relative position relationship between the reconstructed three-dimensional model of the operation position and the second reflecting unit 7 can be obtained by calculating pose information matrixes of the two in the coordinate system of the tracking unit respectively at the time of reconstruction completion, so that a fourth space transformation matrix is also known, and the fourth space transformation matrix is set as M.
(S23) obtaining the surgical site pose information matrix of the surgical site 6 in the tracking unit coordinate system according to the reflection unit pose information matrix and the fourth spatial transformation matrix. The surgical site pose information matrix may represent the spatial transformation matrix of the surgical site 6 coordinate system relative to the tracking unit coordinate system. Let the surgical site pose information matrix be R′_S; R′_S is calculated as follows:

R′_S = R_M × M
(S24) determining the position and the angle of the surgical site three-dimensional model in the tracking unit coordinate system based on the third optical signal and the surgical site pose information matrix. For the specific algorithm, refer to calculation formula (1) above.
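As an illustration of steps S21 to S24, the following sketch (Python with NumPy, not part of the original disclosure; all matrix values are assumed) shows one way the fourth spatial transformation matrix M could be obtained at reconstruction time from the two pose matrices recorded in the tracking unit coordinate system, and how the surgical site pose matrix R′_S is then updated at runtime as R′_S = R_M × M.

```python
import numpy as np

def make_pose(r, d):
    """Assemble a 4x4 homogeneous pose matrix from rotation r and translation d."""
    T = np.eye(4)
    T[:3, :3] = r
    T[:3, 3] = d
    return T

# --- At reconstruction time (assumed example values) ---
# Pose of the second reflection unit 7 in the tracking unit coordinate system.
R_M_recon = make_pose(np.eye(3), np.array([50.0, 10.0, 800.0]))
# Pose of the reconstructed surgical site model in the tracking unit coordinate system.
R_S_recon = make_pose(np.eye(3), np.array([80.0, 25.0, 820.0]))

# Fourth spatial transformation matrix: surgical site relative to reflection unit 7.
# Since R'_S = R_M x M, M follows from the two poses recorded at reconstruction.
M = np.linalg.inv(R_M_recon) @ R_S_recon

# --- At runtime (step S23) ---
# Current pose of the second reflection unit 7, derived from the third optical signal (assumed).
R_M_now = make_pose(np.eye(3), np.array([60.0, 12.0, 805.0]))
R_S_now = R_M_now @ M   # surgical site pose matrix R'_S in the tracking unit frame

# Step S24: any point of the surgical site model can now be placed in the tracking
# unit coordinate system with the same formula (1) used for the surgical tool.
p_site = np.array([10.0, 0.0, 5.0, 1.0])
print(R_S_now @ p_site)
```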
In this way, real-time positioning of the three-dimensional model of the surgical site is achieved, and the correct relative spatial relationship with the three-dimensional model of the surgical tool is maintained even after the patient's body position changes relative to the tracking unit 2, so that the real-time navigation result remains correct.
In step 230 of the present embodiment, the operation guiding information is transmitted to the prompting unit 4, so that the prompting unit 4 prompts the operation guiding information.
The operation guiding information comprises at least one of the following: first displacement deviation information of the current position of the orthopaedic surgical tool 5 with respect to the reference position; first angular deviation information of the current position of the orthopaedic surgical tool 5 with respect to the reference position; second displacement deviation information of the current position of the orthopaedic surgical tool 5 with respect to the intended insertion position; and second angular deviation information of the current direction of the orthopaedic surgical tool 5 with respect to the intended insertion direction.
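As a hedged illustration (not stated in this form in the original disclosure), the following sketch shows how such displacement and angular deviations might be derived from the tracked tool-tip position and tool axis relative to the intended insertion position and direction; all input values are assumed for demonstration.

```python
import numpy as np

def guidance_deviation(tip_pos, tool_dir, target_pos, target_dir):
    """Return (displacement deviation in mm, angular deviation in degrees)
    between the tracked tool and the intended insertion pose.
    All vectors are expressed in the tracking unit coordinate system."""
    displacement = float(np.linalg.norm(np.asarray(tip_pos) - np.asarray(target_pos)))
    u = np.asarray(tool_dir, dtype=float)
    v = np.asarray(target_dir, dtype=float)
    u /= np.linalg.norm(u)
    v /= np.linalg.norm(v)
    angle = float(np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))))
    return displacement, angle

# Assumed example values (millimetres / direction vectors), for demonstration only.
dev_mm, dev_deg = guidance_deviation(
    tip_pos=[121.0, 36.5, 1088.0], tool_dir=[0.05, 0.02, -1.0],
    target_pos=[120.0, 35.0, 1090.0], target_dir=[0.0, 0.0, -1.0])
print(f"displacement deviation: {dev_mm:.1f} mm, angular deviation: {dev_deg:.1f} deg")
```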
By using the medical navigation method of this embodiment of the invention, a doctor can obtain the operation guiding information through the prompting unit 4 and then, according to that information, operate the orthopaedic surgical tool 5 to move it to the intended insertion position on the surgical site 6; the information can also assist in guiding the orthopaedic surgical tool 5 to adjust to the intended insertion direction. The doctor can thus accurately locate the intended insertion position (the intramedullary nail insertion point) on the surgical site 6 and accurately control the intended insertion direction (the nail insertion direction), reducing the number of operations, shortening the operation time, and improving surgical quality and efficiency.
The present invention is not limited to the above-mentioned embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the technical scope of the present invention are intended to be included in the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the protection scope of the claims.

Claims (11)

1. A medical navigation device for intramedullary nail insertion point navigation, comprising:
a first reflection unit connected with the orthopedic operation tool and used for reflecting the first optical signal to form a second optical signal;
The tracking unit is used for transmitting the first optical signal to the first reflecting unit, receiving the second optical signal fed back by the first reflecting unit and transmitting the second optical signal to the processing unit;
the processing unit is used for determining operation guiding information according to the second optical signal and the positioning information of the operation part; the surgical guidance information is used to assist in guiding the orthopaedic surgical tool to a desired insertion location on the surgical site and/or to assist in guiding the orthopaedic surgical tool to a desired insertion direction; the intended insertion location includes the intramedullary nail insertion point;
and the prompting unit is used for prompting the operation guiding information.
2. A medical navigation device for assisting in guiding an orthopaedic surgical tool, comprising:
a first reflection unit connected with the orthopedic operation tool and used for reflecting the first optical signal to form a second optical signal;
the tracking unit is used for transmitting the first optical signal to the first reflecting unit, receiving the second optical signal fed back by the first reflecting unit and transmitting the second optical signal to the processing unit;
The processing unit is used for determining operation guiding information according to the second optical signal and the positioning information of the operation part; the surgical guidance information is used to assist in guiding the orthopaedic surgical tool to a desired insertion location on the surgical site and/or to assist in guiding the orthopaedic surgical tool to a desired insertion direction;
and the prompting unit is used for prompting the operation guiding information.
3. The medical navigation device according to claim 1 or 2, wherein the surgical guidance information includes at least one of the following information:
first displacement deviation information of a current position of the orthopaedic surgical tool relative to a reference position;
first angle deviation information of a current position of the orthopaedic surgical tool relative to a reference position;
second displacement deviation information of the current position of the orthopaedic surgical tool relative to the expected insertion position;
second angular deviation information of the current direction of the orthopaedic surgical tool relative to the intended insertion direction.
4. Medical navigation device according to claim 1 or 2, characterized in that the positioning information of the surgical site comprises a surgical site three-dimensional model of the surgical site, the surgical site three-dimensional model comprising a target insertion position and/or a target insertion direction, the target insertion position on the surgical site three-dimensional model corresponding to the intended insertion position and the target insertion direction corresponding to the intended insertion direction.
5. The medical navigation device of claim 4, wherein the processing unit is further configured to:
determining a real-time position and a real-time angle of a surgical tool three-dimensional model of the orthopaedic surgical tool according to the second optical signal;
and determining the operation guiding information according to the relative relation between the real-time position of the three-dimensional model of the operation tool and the target insertion position and/or the relative relation between the real-time angle of the three-dimensional model of the operation tool and the target insertion direction.
6. The medical navigation device of claim 4, wherein the processing unit is further configured to:
receiving a setting instruction, and determining the target insertion position and/or the target insertion direction according to the setting instruction;
or, calling a trained image recognition model, and obtaining the target insertion position and/or the target insertion direction according to the image recognition model and the surgical site three-dimensional model.
7. The medical navigation device of claim 5, wherein the processing unit is further configured to:
acquiring a first reflecting unit coordinate system corresponding to the first reflecting unit, and a first space transformation matrix corresponding to the tracking unit coordinate system corresponding to the tracking unit;
Acquiring a second space transformation matrix of the surgical tool coordinate system corresponding to the orthopedic surgical tool relative to the first reflection unit coordinate system;
obtaining a third space transformation matrix of the surgical tool coordinate system relative to the tracking unit coordinate system according to the first space transformation matrix and the second space transformation matrix;
and determining the real-time position and the real-time angle of the three-dimensional model of the surgical tool in the tracking unit coordinate system according to the second optical signal and the third space transformation matrix.
8. The medical navigation device of claim 5, further comprising a second reflective unit mounted on the surgical site; the tracking unit is also used for receiving a third optical signal returned by the second reflecting unit; the processing unit is further configured to:
acquiring a reflecting unit pose information matrix of the second reflecting unit in a tracking unit coordinate system corresponding to the tracking unit;
acquiring a fourth spatial transformation matrix of the surgical site relative to the second reflection unit;
obtaining an operation position and pose information matrix of the operation position in the tracking unit coordinate system according to the reflection unit position and pose information matrix and the fourth space transformation matrix;
and determining the position and the angle of the surgical site three-dimensional model in the tracking unit coordinate system according to the third optical signal and the surgical site pose information matrix.
9. The medical navigation device of claim 5, wherein the processing unit is further configured to:
calling a trained statistical shape model;
and inputting the positioning information of the surgical site into the statistical shape model to obtain the three-dimensional model of the surgical site.
10. Medical navigation device according to claim 1 or 2, wherein the prompting unit comprises a display screen and/or augmented reality glasses.
11. A medical navigation method, characterized by being applied to a processing unit of a medical navigation device, the medical navigation device further comprising a tracking unit, a first reflection unit and a prompting unit, the first reflection unit being connected with an orthopaedic surgical tool, the medical navigation method comprising:
receiving a second optical signal transmitted by the tracking unit, wherein the second optical signal is formed after the first optical signal transmitted by the tracking unit is reflected by the first reflecting unit;
determining surgical guidance information based on the second optical signal and the positioning information of the surgical site; the surgical guidance information is used for assisting in guiding the orthopaedic surgical tool to move to a desired insertion position on the surgical site and/or assisting in guiding the orthopaedic surgical tool to adjust to a desired insertion direction;
And transmitting the operation guiding information to the prompting unit so that the prompting unit prompts the operation guiding information.
CN202210867988.6A 2022-07-22 2022-07-22 Medical navigation equipment and method Pending CN117462252A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210867988.6A CN117462252A (en) 2022-07-22 2022-07-22 Medical navigation equipment and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210867988.6A CN117462252A (en) 2022-07-22 2022-07-22 Medical navigation equipment and method

Publications (1)

Publication Number Publication Date
CN117462252A true CN117462252A (en) 2024-01-30

Family

ID=89635249

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210867988.6A Pending CN117462252A (en) 2022-07-22 2022-07-22 Medical navigation equipment and method

Country Status (1)

Country Link
CN (1) CN117462252A (en)

Similar Documents

Publication Publication Date Title
US20200390503A1 (en) Systems and methods for surgical navigation and orthopaedic fixation
EP4159149A1 (en) Surgical navigation system, computer for performing surgical navigation method, and storage medium
Hofstetter et al. Computer-assisted fluoroscopy-based reduction of femoral fractures and antetorsion correction
JP4439393B2 (en) Robots for use with orthopedic inserts
Bae et al. Computer assisted navigation in knee arthroplasty
Nolte et al. Clinical evaluation of a system for precision enhancement in spine surgery
KR100747138B1 (en) Method for establishing a three-dimensional representation of bone x-ray images
EP2790597B1 (en) A method and a device for computer assisted surgery
CN116602766A (en) Orthopaedics operation system and control method thereof
CN114846517A (en) Determining relative 3D position and orientation between objects in a 2D medical image
US20230255691A1 (en) Adaptive Positioning Technology
US20200352651A1 (en) A method for verifying hard tissue location using implant imaging
Müller et al. Three-dimensional computer-assisted navigation for the placement of cannulated hip screws. A pilot study
EP3484415B1 (en) Surgical site displacement tracking system
EP2852337B1 (en) Entry portal navigation
US20090043190A1 (en) "automatic pointing device for correct positioning of the distal locking screws of an intramedullary nail"
CN116712171B (en) Intertrochanteric fracture navigation method, device and storable medium
CN114224428A (en) Osteotomy plane positioning method, osteotomy plane positioning system and osteotomy plane positioning device
TWM570117U (en) An augmented reality instrument for accurately positioning pedical screw in minimally invasive spine surgery
CN117338420A (en) Intraoperative navigation method, device, equipment and storage medium of orthopedic surgery robot
CN117462252A (en) Medical navigation equipment and method
EP4147661B1 (en) Technique for computer-assisted planning of placement of fasteners in vertebrae that are to be stabilized by a pre-formed spinal rod
CN117462253A (en) Medical navigation equipment and method
JP2024521728A (en) Near real-time continuous 3D registration of objects in 2D X-ray images
EP4375929A1 (en) Systems and methods for registration of coordinate systems based on 2d x-ray imaging and augmented reality device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination