CN116687437A - Medical fluoroscopy device, medical navigation system, and medical image processing method - Google Patents


Info

Publication number
CN116687437A
Authority
CN
China
Prior art keywords
medical
position information
navigation
medical image
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211549627.3A
Other languages
Chinese (zh)
Inventor
徐晓龙
郭楚
何智圣
张柳云
陈德方
刘梦星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Mindray Technology Co Ltd
Original Assignee
Wuhan Mindray Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Mindray Technology Co Ltd filed Critical Wuhan Mindray Technology Co Ltd
Publication of CN116687437A

Classifications

    • A61B 17/34 Trocars; puncturing needles
    • A61B 34/00 Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30 Surgical robots
    • A61B 6/12 Arrangements for detecting or locating foreign bodies
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/39 Markers, e.g. radio-opaque or breast lesion markers
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G16H 30/20 ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H 30/40 ICT specially adapted for processing medical images, e.g. editing
    • A61B 2034/2068 Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B 2090/392 Radioactive markers

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Robotics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Epidemiology (AREA)
  • Biophysics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Primary Health Care (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Surgical Instruments (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The application relates to a medical fluoroscopy device, a medical navigation system, and a medical image processing method, wherein the medical fluoroscopy device is used for generating a medical image bearing position markers. The medical fluoroscopy device includes a visualization positioning assembly and/or provides a visualization positioning function. The visualization positioning assembly comprises position markers that can be visually displayed in a medical image; during fluoroscopy, the assembly is located on the imaging path between the radiation emitting end and the radiation receiving end of the medical fluoroscopy device, so that the device obtains a medical image bearing the position markers through fluoroscopy. The visualization positioning function means that the medical fluoroscopy device superimposes position markers on an acquired original medical image to obtain a medical image bearing the position markers.

Description

Medical fluoroscopy device, medical navigation system, and medical image processing method
Technical Field
The present application relates to the field of medical technology, and in particular to a medical fluoroscopy device, a medical navigation system, a medical image processing method, a computer device, a computer-readable storage medium, and a computer program product.
Background
In clinical orthopaedics, some operations require the insertion of an orthopaedic surgical tool or instrument. For example, a proximal femoral fracture (such as an intertrochanteric fracture) or a shaft fracture of the tibia or humerus is typically treated by inserting an intramedullary nail during the operation, and fixation with the intramedullary nail promotes fracture healing. When such a tool or instrument is inserted, its specific insertion position and insertion direction affect both the outcome of the operation and the patient's recovery afterwards.
Techniques have therefore emerged to assist in determining the insertion position of an orthopaedic surgical tool or instrument; however, the inventors have found that conventional techniques of this kind suffer from low surgical efficiency.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a medical fluoroscopy device, a medical navigation system, a medical image processing method, and a computer device.
In a first aspect, the present application provides a medical fluoroscopy device for generating a medical image bearing position markers.
The medical fluoroscopy device includes a visualization positioning assembly and/or provides a visualization positioning function.
The visualization positioning assembly comprises position markers that can be visually displayed in a medical image. During fluoroscopy, the assembly is located on the imaging path between the radiation emitting end and the radiation receiving end of the medical fluoroscopy device, and the device obtains a medical image bearing the position markers through fluoroscopy.
The visualization positioning function means that the medical fluoroscopy device superimposes position markers on an acquired original medical image to obtain a medical image bearing the position markers.
In a second aspect, the present application provides a medical navigation system comprising a navigation assembly and a medical fluoroscopy device as described above.
At least part of the navigation assembly can be visually displayed in a medical image.
The medical image bearing position markers generated by the medical fluoroscopy device includes an image of the surgical site and an image of at least part of the navigation assembly, and is obtained while the navigation assembly is at a reference position. The position markers are used to determine target offset position information, which indicates the relative positional relationship between the reference position and the target position.
The navigation assembly is used to present first relative position information and/or second relative position information. The first relative position information is the relative position between the real-time position of the navigation assembly and the reference position, the real-time position being determined from real-time position information; the second relative position information is the relative position between the real-time position and the target position, determined from the real-time position information and the target offset position information.
The first relative position information combined with the target offset position information, and/or the second relative position information, may be used to assist in guiding an orthopaedic surgical tool secured to the navigation assembly to move to the target position.
In a third aspect, the present application provides a medical image processing method, comprising:
acquiring a medical image bearing position markers, the image being obtained while a navigation assembly is at a reference position and including an image of the surgical site and an image of at least part of the navigation assembly;
obtaining target offset position information based on the position markers in the medical image, the target offset position information indicating the relative positional relationship between the reference position of the navigation assembly and a target position;
wherein the target offset position information is combined with real-time position information of the navigation assembly, which is secured to an orthopaedic surgical tool, to determine second relative position information of the real-time position of the navigation assembly relative to the target position; the second relative position information can be used to assist in guiding the orthopaedic surgical tool to the target position, the real-time position being determined from the real-time position information. Additionally or alternatively, the target offset position information is combined with first relative position information of the real-time position of the navigation assembly relative to the reference position, to assist in guiding the orthopaedic surgical tool to the target position.
In a fourth aspect, the present application provides a computer device comprising a processor and a memory storing a computer program, wherein the computer program, when executed by the processor, causes the processor to carry out the steps of the method of any of the embodiments described above.
In a fifth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to carry out the steps of the method of any of the embodiments described above.
In a sixth aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the method of any of the embodiments described above.
According to the embodiments described above, the medical fluoroscopy device can obtain medical images bearing position markers, either by including a visualization positioning assembly or by providing a visualization positioning function. Medical navigation can then be performed on the basis of such an image: from the position markers, target offset position information indicating the relative positional relationship between the reference position of the navigation assembly and the target position can be obtained, and on that basis the first relative position information and/or the second relative position information can be derived. This assists in guiding an orthopaedic surgical tool secured to the navigation assembly to the target position, such as the ideal entry point of an intramedullary nail, without increasing the radiation dose during the operation, and thereby helps improve the efficiency of orthopaedic surgery.
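The relationship between the target offset, first relative position information, and second relative position information described above amounts to plain vector arithmetic. The following is an illustrative sketch only, not the application's implementation; all function names and coordinate values are hypothetical:

```python
import numpy as np

def target_offset(reference_pos, target_pos):
    """Target offset position information: the target position relative to
    the reference position, as read from the image's position markers."""
    return np.asarray(target_pos, float) - np.asarray(reference_pos, float)

def first_relative(realtime_pos, reference_pos):
    """First relative position information: real-time position vs. reference."""
    return np.asarray(realtime_pos, float) - np.asarray(reference_pos, float)

def second_relative(realtime_pos, reference_pos, offset):
    """Second relative position information: real-time position vs. target,
    i.e. the first relative position combined with the target offset."""
    return first_relative(realtime_pos, reference_pos) - np.asarray(offset, float)

# Hypothetical example: reference at the origin, target 12 mm along x,
# tool currently at (5, 0) mm.
off = target_offset([0.0, 0.0], [12.0, 0.0])
rem = second_relative([5.0, 0.0], [0.0, 0.0], off)  # real-time position relative to the target
```

When `rem` reaches the zero vector, the tool has arrived at the target position; the sign of each component indicates the direction of the remaining motion.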
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; other drawings can be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a block diagram of a medical fluoroscopy device in accordance with one embodiment of the present application;
FIG. 2 is a schematic view of a visualization positioning assembly according to an embodiment of the present application;
FIG. 3 is a schematic view of a visualization positioning assembly according to another embodiment of the present application;
FIG. 4 is a schematic view of a visualization positioning assembly according to yet another embodiment of the present application;
FIG. 5 is a schematic illustration of an original medical image obtained by a medical fluoroscopy device without a visualization positioning assembly or a visualization positioning function, in some embodiments;
FIG. 6 is a schematic illustration of a medical image bearing position markers obtained by a medical fluoroscopy device according to some embodiments of the present application;
FIG. 7 is a schematic illustration of a medical image bearing position markers obtained by a medical fluoroscopy device through the visualization positioning function, in accordance with some embodiments of the present application;
FIG. 8 is a schematic structural view of a medical fluoroscopy device according to some embodiments of the present application;
FIG. 9 is a schematic view of a medical fluoroscopy device according to other embodiments of the present application;
FIG. 10 is a block diagram of a medical navigation system according to some embodiments of the present application;
FIG. 11 is a schematic diagram of a navigation assembly according to some embodiments of the present application;
FIG. 12 is an example of a coordinate system of a sensor of a navigation assembly in some embodiments of the present application;
FIG. 13 is a schematic diagram of a navigation assembly displaying first relative position information and/or second relative position information in some embodiments of the present application;
FIG. 14 is a schematic diagram of determining a height difference in some embodiments of the present application;
FIG. 15 is a schematic diagram of determining an external deflection angle in some embodiments of the present application;
FIG. 16 is a schematic diagram of determining a pitch angle in some embodiments of the present application;
FIG. 17 is a schematic view of the ideal entry position of the ideal entry point of an intramedullary nail at the proximal femur;
FIG. 18 is a schematic view of the ideal entry direction of the ideal entry point of an intramedullary nail at the proximal femur;
FIG. 19 is a schematic view of a navigation assembly placed on the body surface over the proximal femur on the patient's side, in some embodiments;
FIG. 20 is a schematic illustration of a medical image bearing position markers obtained after the navigation assembly is placed on the body surface over the proximal femur, in some embodiments;
FIG. 21 is a schematic diagram of determining target offset position information based on a medical image bearing position markers, in some embodiments;
FIG. 22 is a schematic illustration of a navigation assembly secured to a guide wire in some embodiments;
FIG. 23 is a schematic view of a navigation assembly secured to a drill in some embodiments;
FIG. 24 is a schematic illustration of determining the ideal entry position of the ideal entry point of an intramedullary nail in an application scenario of some embodiments;
FIG. 25 is a schematic illustration of determining the ideal entry direction of the ideal entry point of an intramedullary nail in an application scenario of some embodiments;
FIG. 26 is a flow chart of a medical image processing method of some embodiments;
FIG. 27 is a block diagram of a computer device in one embodiment.
Detailed Description
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. Numerous specific details are set forth in the following detailed description in order to provide a thorough understanding of the various described embodiments. However, it will be understood by those of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
The terminology used in the description of the various illustrated embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and in the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, and/or components.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "comprising" and "having" and any variations thereof in the description of the application and the claims and the description of the drawings above are intended to cover a non-exclusive inclusion.
In the description of embodiments of the present application, the technical terms "first," "second," and the like are used merely to distinguish between different objects and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated, a particular order or a primary or secondary relationship. In the description of the embodiments of the present application, the meaning of "plurality" is two or more unless explicitly defined otherwise.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification do not necessarily all refer to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
Some embodiments of the present application provide a medical fluoroscopy device capable of generating medical images bearing position markers.
In some embodiments, referring to FIG. 1, the medical fluoroscopy device 100 includes a visualization positioning assembly 110, by which the medical fluoroscopy device 100 generates a medical image bearing position markers.
The visualization positioning assembly 110 comprises position markers that can be visually displayed in a medical image. While the medical fluoroscopy device 100 performs fluoroscopy, the assembly 110 is located on the imaging path between the radiation emitting end and the radiation receiving end of the device, so that the device obtains a medical image bearing the position markers through fluoroscopy.
In other embodiments, the medical fluoroscopy device 100 provides a visualization positioning function and generates medical images bearing position markers via that function.
Here, the visualization positioning function means that the medical fluoroscopy device 100 superimposes position markers on an acquired original medical image to obtain a medical image bearing the position markers.
It should be appreciated that in some embodiments the medical fluoroscopy device 100 may have both the visualization positioning assembly 110 and the visualization positioning function, so that medical personnel can obtain medical images bearing position markers either by using the assembly 110 during fluoroscopy or through the visualization positioning function, as actual needs dictate.
The position markers superimposed on the original medical image by the visualization positioning function and the position markers of the visualization positioning assembly 110 may be arranged in the same or different ways, for example in their layout and in the relative relationships (such as distances and angles) between markers. Assuming the superimposed markers are arranged in the same way as those of the visualization positioning assembly 110, several arrangements of the assembly 110 are exemplified below.
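The application does not specify the superimposition at the code level. Purely as an illustrative sketch (the function name, marker geometry, and pixel values are all hypothetical), superimposing equiangular ray markers onto an image array might look like:

```python
import numpy as np

def overlay_angle_markers(image, center, spacing_deg=10, half_range_deg=30,
                          length=40, value=255):
    """Superimpose equiangular ray markers onto a copy of the original image,
    fanning out from a chosen pixel.  The reference marker lies at 0 degrees;
    non-reference markers are placed symmetrically on both sides of it."""
    out = image.copy()
    cy, cx = center
    for deg in range(-half_range_deg, half_range_deg + 1, spacing_deg):
        rad = np.deg2rad(deg)
        for r in range(length):
            # Step outward along the ray, rasterizing to the nearest pixel.
            y = int(round(cy - r * np.cos(rad)))
            x = int(round(cx + r * np.sin(rad)))
            if 0 <= y < out.shape[0] and 0 <= x < out.shape[1]:
                out[y, x] = value
    return out

img = np.zeros((100, 100), dtype=np.uint8)   # stand-in for an original image
marked = overlay_angle_markers(img, center=(90, 50))
```

The original image is left untouched; only the returned copy bears the markers, mirroring the description that the function superimposes markers on an acquired original image.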
Referring to FIG. 2, in some embodiments the visualization positioning assembly 110 may include angle markers 111. Because the assembly includes a plurality of angle markers 111, once the medical fluoroscopy device 100 obtains a medical image bearing those markers through fluoroscopy, the image can be used to view the specific position, angle, and other information of the site to be operated on. With the angle markers as a reference, it becomes easier to assist in determining the intended insertion direction at the target position, for example the ideal entry direction at the ideal entry point of an intramedullary nail, and thus to assist in guiding an orthopaedic surgical tool to that target position.
In some embodiments, the plurality of angle markers may include a reference angle marker (for example, the angle marker 111 shown in bold in FIG. 2) and non-reference angle markers disposed on at least one side of the reference angle marker.
The angle markers can then be read off as needed: taking the reference angle marker as the datum, the marker (reference or non-reference) corresponding to the intended insertion direction at the target position can be found and read, conveniently assisting in determining that direction. In scenarios where the angle between the intended insertion direction and another specified direction (such as the axial direction of the navigation assembly mentioned in the embodiments below) must be determined, identifying the angle marker corresponding to each direction makes it more convenient to determine the angle between the two.
In some embodiments, the non-reference angle markers include positive and negative angle markers disposed symmetrically on both sides of the reference angle marker, with the reference angle marker as the datum. This symmetric distribution makes it more convenient to read the relevant markers, to assist in determining the intended insertion direction, and, where needed, to determine the angle between the intended insertion direction and another specified direction.
In some embodiments, the angle markers may be equiangular, that is, the included angle between any two adjacent angle markers is the same.
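Under the equiangular arrangement, the angle between two directions can be read directly from the signed indices of their nearest markers. A minimal illustrative sketch (the 5-degree spacing and the indices are assumed examples, not values specified by the application):

```python
def angle_between(marker_a, marker_b, spacing_deg=5.0):
    """Angle between two directions, each identified by the signed index of
    its nearest angle marker (0 = reference marker, +1/-1 its neighbours).
    With equiangular markers the angle is just the index difference times
    the common spacing."""
    return abs(marker_a - marker_b) * spacing_deg

# Hypothetical reading: the intended insertion direction falls on marker +3,
# the navigation assembly's axial direction on marker -1.
angle = angle_between(3, -1)   # 4 marker intervals of 5 degrees each
```

This is exactly why the equiangular and symmetric layouts are convenient: no per-marker calibration is needed, only an index difference.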
In some embodiments, the visualization positioning assembly 110 may include distance markers. Because the assembly includes a plurality of distance markers, once the medical fluoroscopy device 100 obtains a medical image bearing those markers through fluoroscopy, the image can be used to view the specific position, distance, and other information of the site to be operated on, and with the distance markers as a reference it becomes easier to assist in determining the intended insertion position at the target position. In scenarios where the distance between the intended insertion position and another specified position (for example, the reference position of the navigation assembly mentioned in the embodiments below) must be determined, identifying the distance markers corresponding to each position makes it more convenient to determine the distance between the two.
In some embodiments, the visualization positioning assembly 110 includes at least one set of distance markers, the markers within a set being equidistant, i.e., the distance between any two adjacent distance markers is the same.
Taking two sets of distance markers as an example, referring to FIG. 3, a plurality of distance markers 112, 113 are present on the visualization positioning assembly 110, the markers 112 forming a first set and the markers 113 forming a second set. The markers 112 of the first set are equidistant, the markers 113 of the second set are equidistant, and the markers 112 are perpendicular to the markers 113.
Thus, after the medical fluoroscopy device 100 obtains a medical image bearing a plurality of distance markers through fluoroscopy, the image can be used to view the specific position, distance, and other information of the site to be operated on, and with the two mutually perpendicular sets of distance markers as a reference, the intended insertion position at the target position can be determined more easily and accurately. Where the distance between the intended insertion position and another specified position is needed, the two perpendicular sets allow that distance to be determined conveniently along two mutually perpendicular directions.
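Because the markers within a set are equidistant at a known physical pitch, they also serve as a scale reference: a pixel offset measured in the image can be converted into physical distances along the two perpendicular marker directions. An illustrative sketch only (the function name and the 10 mm pitch are hypothetical, not values from the application):

```python
def pixel_offset_to_mm(dx_px, dy_px, marker_pitch_px, marker_pitch_mm=10.0):
    """Convert a pixel offset read from the medical image into physical
    distances along the two perpendicular distance-marker directions.
    marker_pitch_px is the spacing between adjacent markers as it appears
    in the image; marker_pitch_mm is their known physical spacing."""
    scale = marker_pitch_mm / marker_pitch_px   # mm per pixel
    return dx_px * scale, dy_px * scale

# Hypothetical reading: markers 10 mm apart appear 25 px apart in the image,
# and the intended insertion position is offset (50, 75) px from the
# reference position.
dx_mm, dy_mm = pixel_offset_to_mm(50, 75, marker_pitch_px=25)
```

Measuring the pitch in the image itself absorbs the fluoroscope's magnification, which is one reason an in-image scale reference is convenient.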
In some embodiments, the visualization positioning assembly 110 may include both angle markers and distance markers. Taking a configuration including a plurality of angle markers together with the second set of distance markers 113 as an example, a schematic view of the structure of the visualization positioning assembly 110 is shown in fig. 4.
Referring to fig. 4, the visualization positioning assembly includes a plurality of angle markers 111 and a plurality of distance markers 113, and a reference angle marker among the angle markers 111 (e.g., the angle marker 111 shown in bold in fig. 4) is disposed perpendicular to the distance markers 113.
Taking a medical fluoroscopy device without a visualization positioning assembly or without a visualization positioning function as an example, a schematic diagram of an original medical image obtained through fluoroscopy is shown in fig. 5. It can be seen that, in the original medical image, the expected insertion direction at the target position is difficult to determine without image analysis processing.
Taking as an example a medical fluoroscopy device of an embodiment of the present application that has the visualization positioning assembly shown in fig. 2, or whose visualization positioning function can superimpose the position markers of the assembly shown in fig. 2, the medical image obtained through fluoroscopy is shown in fig. 6. A plurality of angle markers appear on this medical image, so the angle markers can assist in determining the expected insertion direction at the target position.
The visualization positioning assembly 110 may be disposed at any feasible position, as long as, when the medical fluoroscopy device 100 performs fluoroscopy, the visualization positioning assembly 110 is located on the fluoroscopy path between the radiation emitting end and the radiation receiving end of the device, so that the position markers appear on the medical image obtained by the medical fluoroscopy device 100.
Referring to fig. 8, the medical fluoroscopy device 100 has a radiation emitting end 120 and a radiation receiving end 130. During fluoroscopy, the patient's surgical site can be placed between the radiation emitting end 120 and the radiation receiving end 130 by moving the patient, by moving a medical device (such as an operating table) carrying the patient, and/or by moving the medical fluoroscopy device. The medical fluoroscopy device 100 then controls the radiation emitting end 120 to emit radiation, which the radiation receiving end 130 receives. Because the radiation passes through the surgical site, and different tissues of the surgical site transmit the radiation to different degrees, the different tissues can be distinguished in the medical image based on these different transmittances after the radiation receiving end 130 receives the radiation, thereby producing the final medical image.
In some of these embodiments, the visualization positioning assembly 110 may be disposed at the radiation emitting end 120 of the medical fluoroscopy device 100. During fluoroscopy, the radiation emitted by the radiation emitting end 120 then passes, in sequence, through the visualization positioning assembly 110 and the surgical site before reaching the radiation receiving end 130, so that both the position markers of the visualization positioning assembly 110 and the surgical site are imaged in the resulting medical image.
In some of these embodiments, the visualization positioning assembly 110 may be disposed at the radiation receiving end 130 of the medical fluoroscopy device 100. During fluoroscopy, the radiation emitted by the radiation emitting end 120 then passes, in sequence, through the surgical site and the visualization positioning assembly 110 before reaching the radiation receiving end 130, so that both the position markers of the visualization positioning assembly 110 and the surgical site are imaged in the resulting medical image.
In some embodiments, the radiation receiving end 130 of the medical fluoroscopy device 100 includes an image intensifier, and the position markers of the visualization positioning assembly 110 are located on the image input screen of the image intensifier. A medical image bearing the position markers can thus be obtained through fluoroscopy simply by placing a plurality of position markers directly on the image input screen of the image intensifier, without providing additional independent equipment, which is simple and convenient.
In some embodiments, the visualization positioning function provided by the medical fluoroscopy device 100 may work as follows: the medical fluoroscopy device 100 superimposes a preset layer on the obtained original medical image to obtain a medical image bearing position markers, where the position markers are present in the preset layer.
The medical fluoroscopy device 100 may be provided with a physical key for the visualization positioning function; alternatively, where the medical fluoroscopy device 100 has a display screen, the key may be a button for the visualization positioning function displayed on the display screen. Medical staff can invoke the visualization positioning function by pressing the physical key, or by operating the on-screen button via touch or mouse click, whereupon the preset layer is superimposed on the original medical image.
The preset layer may be stored in advance on the medical fluoroscopy device 100; when it is to be superimposed on the original medical image, the preset layer is retrieved directly and then superimposed on the original medical image.
In some embodiments, the preset layer may be superimposed over the whole of the original medical image, so that the superimposition can be performed without regard to the specific position of the surgical site.
In other embodiments, the preset layer may be superimposed at a preset position of the original medical image. The preset position may be a position set in advance, for example the middle of the original medical image, or, in the case of mouse operation, the position where the mouse pointer is located. The specific position of the preset layer can then be adjusted in response to a user operation, for example by dragging the preset layer onto the surgical site, so that the preset layer is superimposed on the original medical image and a medical image bearing the position markers is obtained.
In other embodiments, after the medical fluoroscopy device 100 superimposes the preset layer onto the original medical image, the preset layer may also be scaled to a suitable size in response to a user operation, for example scaled until it just covers the surgical site, to obtain the final medical image bearing the position markers.
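The layer-superimposition step can be sketched as follows. This is a simplified stand-in, assuming grayscale images represented as nested lists and a layer in which transparent cells are `None`; the actual device's image format and compositing method are not specified above:

```python
def superimpose_layer(image, layer, top, left):
    """Superimpose a preset marker layer onto an original medical image.

    image: 2-D list of grayscale pixel values (the original medical image).
    layer: 2-D list where None means transparent and any other value is a
           marker pixel to draw (a simplified stand-in for a preset layer).
    top, left: position at which the layer is placed, e.g. the image centre
               by default, or a position chosen by dragging.
    Returns a new image with the marker pixels drawn over the original.
    """
    out = [row[:] for row in image]  # copy so the original stays intact
    for r, layer_row in enumerate(layer):
        for c, value in enumerate(layer_row):
            rr, cc = top + r, left + c
            if value is not None and 0 <= rr < len(out) and 0 <= cc < len(out[0]):
                out[rr][cc] = value
    return out

original = [[10] * 5 for _ in range(5)]
marker_layer = [[None, 255, None],
                [255, 255, 255],
                [None, 255, None]]   # a tiny cross-shaped position marker
combined = superimpose_layer(original, marker_layer, 1, 1)
print(combined[2][2])  # centre of the cross -> 255
```

Dragging and scaling then amount to recomputing `top`/`left` and resampling the layer before superimposing it again.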
Taking as an example a medical fluoroscopy device of an embodiment of the present application that has the visualization positioning function, with the position markers provided by that function being the same as those of the visualization positioning assembly shown in fig. 2, the medical image obtained through the visualization positioning function may be as shown in fig. 7. It can be seen that the position markers can be superimposed over only the image portion of the surgical site in the medical image; with the angle markers superimposed over that portion, the expected insertion direction at the target position can be determined with their assistance. When applied to a scenario in which the angle between the expected insertion direction and another specified direction needs to be determined, the angle between the two directions can be determined more conveniently with reference to the angle markers.
Referring to fig. 8 and 9, the medical image bearing the position markers obtained by the medical fluoroscopy device 100 may be displayed on the display screen 140 of the medical fluoroscopy device 100, or the medical fluoroscopy device 100 may send the medical image to another terminal device in a wired or wireless manner, so that it is displayed on the display screen 150 of that terminal device.
Based on the medical fluoroscopy device 100 described above, an embodiment of the present application also provides a medical navigation system. As shown in fig. 10, the medical navigation system 10 includes a navigation assembly 200 and a medical fluoroscopy device 100 as described above.
Wherein at least a portion of the navigation assembly 200 is capable of being visualized in a medical image;
the medical fluoroscopy device 100 generates a medical image bearing the position markers, the image being obtained while the navigation assembly 200 is at the reference position; the medical image includes an image of the surgical site and an image of at least part of the navigation assembly. The position markers are used to determine target offset position information, which indicates the relative positional relationship between the reference position and the target position;
the navigation assembly 200 is configured to prompt first relative position information and/or second relative position information. The first relative position information is the relative position information between the real-time position of the navigation assembly and the reference position, the real-time position being determined from real-time position information; the second relative position information is the relative position information between the real-time position and the target position, determined from the real-time position information and the target offset position information;
the first relative position information in combination with the target offset position information, and/or the second relative position information, may be used to assist in guiding the orthopedic surgical tool to the target position.
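The relationship between the two kinds of relative position information can be sketched as follows. This is a minimal illustration, not the system's actual data model; the dictionary fields (`height_mm`, `angle_deg`) and the sign conventions are assumptions:

```python
def first_relative_info(real_time, reference):
    """First relative position information: deviation of the real-time
    position of the navigation assembly from the reference position."""
    return {"height_mm": real_time["height_mm"] - reference["height_mm"],
            "angle_deg": real_time["angle_deg"] - reference["angle_deg"]}

def second_relative_info(real_time, reference, target_offset):
    """Second relative position information: the first relative information
    combined with the target offset (reference position -> target position)."""
    first = first_relative_info(real_time, reference)
    return {"height_mm": first["height_mm"] - target_offset["height_mm"],
            "angle_deg": first["angle_deg"] - target_offset["angle_deg"]}

reference = {"height_mm": 0.0, "angle_deg": 0.0}       # e.g. zeroed sensor
real_time = {"height_mm": 12.0, "angle_deg": 7.0}      # current sensor reading
target_offset = {"height_mm": 10.0, "angle_deg": 5.0}  # reference -> target

# remaining deviation from the target position: 2 mm and 2 degrees
print(second_relative_info(real_time, reference, target_offset))
```

Either quantity alone suffices to guide the tool: the first deviation together with the target offset, or the second deviation directly.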
The target position may be the position of any site at which an operation is to be performed, and in some embodiments may include the ideal entry point of an intramedullary nail.
The first relative position information and/or the second relative position information may be determined by the navigation assembly 200 itself, or by an external device separate from the navigation assembly 200. In the latter case, the navigation assembly 200 is configured to send its real-time position information to the external device while the navigation assembly 200 is fixed to the orthopedic surgical tool, to receive the first relative position information and/or the second relative position information fed back by the external device, and to prompt them. In this case, the reference position is determined from pre-recorded reference position information, which includes first reference position information and/or second reference position information. The first reference position information is position information of the navigation assembly 200 that the navigation assembly 200 obtains in response to a trigger instruction and sends to the external device; the second reference position information is preset position information that the external device obtains in response to the trigger instruction.
Wherein the navigation assembly 200 can be removably secured to an orthopedic surgical tool, for example by magnetic attraction and/or a snap fit. The orthopedic surgical tool may be a guide pin, a holder, or an electric drill.
In some embodiments, the navigation assembly 200 may be provided with a touch screen through which it can receive a trigger instruction, the trigger instruction being usable to instruct that reference position information for the reference position be recorded.
The trigger instruction can be issued by operating a corresponding button or control displayed on the touch screen, by single- or double-clicking a designated area (or any position) of the touch screen, by sliding along a designated track on the touch screen, or by another touch gesture. The navigation assembly 200 determines that a trigger instruction has been received when it receives, through the touch screen, an operation on the corresponding button or control, or recognizes a designated touch action such as a single click, a double click, or the designated sliding track.
In some embodiments, the navigation assembly 200 may also be provided with a physical key and/or a voice acquisition assembly through which to receive a trigger instruction, the trigger instruction being usable to instruct that reference position information for the reference position be recorded.
If the navigation assembly 200 is provided with a physical key, the trigger instruction may be issued by directly pressing the physical key; when the navigation assembly 200 detects the press of the physical key, it determines that the trigger instruction has been received.
If the navigation assembly 200 is provided with a voice acquisition assembly, medical staff (e.g., a doctor) can issue the trigger instruction by speaking "record reference position", "record position", or another voice message. The voice acquisition assembly acquires the voice information and recognizes it; if the voice information is recognized to include "record reference position", "record position", or another predefined phrase indicating that the reference position should be recorded, the navigation assembly 200 determines that the trigger instruction has been received.
In some embodiments, the trigger instruction may be received by an external device other than the navigation assembly 200, for example the medical fluoroscopy device 100 or a device other than the medical fluoroscopy device 100; the trigger instruction received by the external device may then be forwarded to the navigation assembly 200.
The navigation assembly 200 may display the first relative position information and/or the second relative position information in real time, or output them as voice information, i.e., give a voice prompt of the first relative position information and/or the second relative position information.
The manner of displaying the first relative position information and/or the second relative position information in real time is not limited. For example, only the first relative position information and/or the second relative position information may be displayed, or they may be displayed together with corresponding description information explaining their specific meaning. In some embodiments, information on a suggested movement direction derived from the first relative position information and/or the second relative position information may also be displayed at the same time, with different movement directions marked differently, e.g., by different colors and/or different arrow directions.
Likewise, the manner of giving a real-time voice prompt of the first relative position information and/or the second relative position information is not limited. For example, only the first relative position information and/or the second relative position information may be voiced, or they may be voiced together with corresponding description information explaining their specific meaning. In some embodiments, information on a suggested movement direction derived from the first relative position information and/or the second relative position information may also be voiced at the same time.
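A suggested movement direction of the kind mentioned above can be derived from the sign of a deviation. A minimal sketch, in which the tolerance value and the prompt strings are illustrative assumptions rather than part of the described system:

```python
def movement_hint(height_deviation_mm, tolerance_mm=1.0):
    """Map a signed displacement deviation to a suggested-movement prompt.

    In a display, the two directions could be marked with differently
    coloured arrows; in a voice prompt, the string could be spoken.
    """
    if abs(height_deviation_mm) <= tolerance_mm:
        return "hold"                       # within tolerance of the target
    return "move down" if height_deviation_mm > 0 else "move up"

print(movement_hint(5.0), movement_hint(-5.0), movement_hint(0.4))
```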
Wherein, the navigation assembly 200 may be provided with a sensor, and the number of sensors may be one. The sensor may be integrated in the navigation assembly 200, and the position information it measures may serve as the position information of the navigation assembly 200, so that the medical navigation process can be implemented with a single sensor. In the following description of the embodiments, the position information of the navigation assembly 200 may be the position information measured by its sensor.
In some embodiments, the reference position information is the position information measured in real time by the sensor at the moment the trigger instruction is received. When the navigation assembly 200 receives the trigger instruction, it can directly obtain the sensor's real-time measurement. Where the first relative position information and/or the second relative position information are determined by the external device, the external device may also receive the trigger instruction; the external device may forward the trigger instruction to the navigation assembly 200, or send the navigation assembly 200 an instruction, based on the trigger instruction, directing it to provide its position information, thereby causing the navigation assembly 200 to obtain the sensor's real-time measurement. Reference position information obtained in this way may be referred to as first reference position information in the embodiments of the present application.
In some embodiments, the reference position information is the position information measured by the sensor of the navigation assembly 200 after the sensor is initialized in response to the trigger instruction, i.e., the reference position information may be the initial position information of the sensor; this may be referred to as second reference position information in the embodiments of the present application. The initial position information may be set according to actual technical needs; in some embodiments it may be zero position information, meaning that all position-related values are set to 0, for example a height value of 0.
When the navigation assembly 200 receives the trigger instruction, it may directly initialize the sensor and obtain the position information measured after initialization. Where the first relative position information and/or the second relative position information are determined by the external device, the external device may also receive the trigger instruction; in some embodiments, the external device may use position information preset on it as the reference position information, the preset position information being the initial position information of the sensor after initialization. When the external device receives the trigger instruction, it may simultaneously forward the trigger instruction to the navigation assembly 200, or send the navigation assembly 200 an initialization instruction based on the trigger instruction, directing it to initialize the sensor.
Alternatively, when the external device receives the trigger instruction, it may forward the trigger instruction to the navigation assembly 200, or send the navigation assembly 200 an instruction, based on the trigger instruction, directing it to initialize the sensor and provide the position information measured after initialization. The navigation assembly 200 receives the instruction, initializes the sensor, obtains the measured position information, and feeds it back to the external device, which thereby obtains the reference position information.
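The two ways of obtaining reference position information described above, recording the sensor's current reading versus zeroing (initializing) the sensor, can be sketched as follows. This is a hypothetical one-axis illustration, not the actual sensor interface:

```python
class NavigationSensor:
    """Minimal sketch of the two ways of recording a reference position."""

    def __init__(self):
        self._raw = 0.0       # latest raw height reading, in mm
        self._offset = 0.0    # subtracted from raw readings after zeroing

    def measure(self, raw_height_mm):
        """Simulate a new raw sensor sample."""
        self._raw = raw_height_mm

    def position(self):
        """Position information reported by the sensor."""
        return self._raw - self._offset

    def record_reference(self):
        """First kind of reference: keep the current real-time reading."""
        return self.position()

    def initialize(self):
        """Second kind of reference: zero the sensor, so that the current
        pose subsequently reads as the initial (zero) position."""
        self._offset = self._raw
        return self.position()  # 0.0 by construction

sensor = NavigationSensor()
sensor.measure(37.5)          # pose at the moment the trigger arrives
sensor.initialize()           # second kind: reference becomes zero
sensor.measure(42.5)          # later real-time sample
print(sensor.position())      # 5.0 relative to the zeroed reference
```

In the externally-determined case, the external device simply treats the zeroed initial position (here, 0.0) as the preset reference position information.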
In some embodiments, the first relative position information comprises a first displacement deviation and/or a first angular deviation between the real-time position and the reference position.
The first displacement deviation may include: the distance, in the direction perpendicular to the horizontal plane, between the real-time position of the navigation assembly 200 (i.e., of its sensor) and the reference position. In some embodiments, the first displacement deviation may also be referred to as a height difference, as shown in fig. 14.
The first angular deviation may include: the angle, within a common plane, between the projection onto that plane of the specified direction of the navigation assembly 200 (i.e., of its sensor) while fixed to the orthopedic surgical tool, and the projection of that specified direction when the navigation assembly 200 was at the reference position. In some embodiments, the first angular deviation may also be referred to as an external deviation angle.
In some embodiments, the specified direction of the navigation assembly 200 (i.e., of its sensor) may specifically be the axial direction of the navigation assembly 200, and the common plane may specifically be the horizontal plane, as shown in fig. 15. The projections may be computed in the coordinate system established by the sensor itself. One example of such a coordinate system is shown in fig. 12, where the X axis is the axial direction of the navigation assembly 200, i.e., the specified direction of the navigation assembly 200 (i.e., of its sensor).
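The projection-based angle can be computed as below. This is a minimal sketch assuming the sensor reports 3-D direction vectors in a frame whose X-Y plane is the horizontal plane; the vector representation is an assumption, not the device's stated interface:

```python
import math

def horizontal_projection_angle(dir_a, dir_b):
    """Angle (degrees) between the projections of two 3-D direction vectors
    onto the horizontal (X-Y) plane, as used for the angular deviations.

    dir_a, dir_b: (x, y, z) direction vectors, e.g. the sensor's axial
    direction at the real-time position and at the reference position.
    """
    ax, ay, _ = dir_a
    bx, by, _ = dir_b
    # drop the vertical component, then take the angle between 2-D vectors
    angle = math.degrees(math.atan2(by, bx) - math.atan2(ay, ax))
    return abs((angle + 180.0) % 360.0 - 180.0)  # wrap into [0, 180]

print(horizontal_projection_angle((1, 0, 0.3), (0, 1, -0.2)))  # ~90 degrees
```

The same computation serves for the second angular deviation, with the second vector replaced by the expected insertion direction at the target position.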
In some embodiments, the second relative position information includes a second displacement deviation and/or a second angular deviation between the real-time position and the target position, where the target position is determined by the expected insertion position and/or the expected insertion direction of the orthopedic surgical tool in the human body. In some embodiments, taking the ideal entry point of an intramedullary nail as the target position, the second relative position information includes a second displacement deviation and/or a second angular deviation between the real-time position and the ideal entry point, where the expected insertion position is the ideal entry position of the ideal entry point and the expected insertion direction is the ideal entry direction of the ideal entry point.
The second angular deviation includes: the angle, within a common plane, between the projection onto that plane of the specified direction of the navigation assembly 200 (i.e., of its sensor) at the real-time position, and the projection of the expected insertion direction at the target position, such as the ideal entry direction of the ideal entry point of the intramedullary nail. In some embodiments, the specified direction of the navigation assembly 200 (i.e., of its sensor) may specifically be the axial direction of the navigation assembly 200, and the common plane may specifically be the horizontal plane.
Wherein the expected insertion direction, e.g., the ideal entry direction of the ideal entry point of the intramedullary nail, can be determined in various possible ways.
In some embodiments, the expected insertion direction, such as the ideal entry direction of the ideal entry point of the intramedullary nail, may be obtained by medical staff viewing the medical image. For example, an image is taken while the navigation assembly 200 (i.e., its sensor) is at the reference position, producing a medical image on which angle markers and/or distance markers visibly appear. By viewing this medical image, medical staff can determine the expected insertion position and/or the expected insertion direction at the target position (e.g., the ideal entry direction of the ideal entry point of the intramedullary nail).
In some embodiments, the expected insertion direction, e.g., the ideal entry direction of the ideal entry point of the intramedullary nail, is an insertion direction provided by the user. For example, an image is taken while the navigation assembly 200 is at the reference position, producing a medical image on which angle markers visibly appear. By viewing the medical image, medical staff can determine the expected insertion direction, such as the ideal entry direction of the ideal entry point of the intramedullary nail. They can then observe, among the displayed angle markers, the angle marker parallel to the expected insertion direction, or find it by measuring with a tool such as a ruler, and select that angle marker by clicking on it or entering its information, thereby providing the user-provided expected insertion direction, such as the user-provided ideal entry direction of the ideal entry point.
In some embodiments, the expected insertion direction, such as the ideal entry direction of the ideal entry point of the intramedullary nail, may also be obtained by analyzing a medical image taken while the navigation assembly 200 is at the reference position, for example by image-processing the medical image to identify the expected insertion direction. Taking navigation to the entry point of an intramedullary nail as an example, image processing may identify the axial direction of the femoral shaft, and the ideal entry direction of the ideal entry point of the intramedullary nail may then be determined from the known angle (for example, 5 degrees) between the axial direction of the femoral shaft and that ideal entry direction.
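Once image analysis has yielded the femoral-shaft axis direction, the last step above is a simple angular offset. A hypothetical sketch; the function name, the in-plane angle representation, and the 5-degree default are illustrative assumptions:

```python
def ideal_entry_angle(shaft_axis_angle_deg, included_angle_deg=5.0):
    """Offset the identified femoral-shaft axis direction (expressed as an
    in-plane angle, in degrees) by the known included angle to obtain the
    ideal entry direction of the ideal entry point."""
    return (shaft_axis_angle_deg + included_angle_deg) % 360.0

print(ideal_entry_angle(83.0))  # 88.0
```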
The target offset position information may be determined in conjunction with the angle markers in the medical image, and indicates the relative positional relationship between the reference position and the target position (e.g., the ideal entry point of the intramedullary nail). In some embodiments, the target offset position information may include: the angle between the specified direction of the navigation assembly 200 (i.e., of its sensor) when at the reference position and the expected insertion direction (e.g., the ideal entry direction) at the target position (e.g., the ideal entry point of the intramedullary nail).
In some embodiments, this angle between the specified direction of the navigation assembly 200 (i.e., of its sensor) when at the reference position and the expected insertion direction (e.g., the ideal entry direction) at the target position (e.g., the ideal entry point of the intramedullary nail) may itself be determined in a variety of possible ways.
In some embodiments, if medical staff determine the expected insertion direction (e.g., the ideal entry direction of the ideal entry point of the intramedullary nail) and the axial direction of the navigation assembly 200 by viewing the medical image, they can use the displayed angle markers to find the angle marker parallel to the expected insertion direction and the angle marker parallel to the axial direction of the navigation assembly 200, and take the angle between those two angle markers as the angle between the expected insertion direction at the target position and the specified direction of the navigation assembly 200 (i.e., of its sensor) at the reference position. After determining the angle, medical staff can record it themselves for subsequent use; in some embodiments, they may also input the determined angle into the navigation assembly 200 for subsequent use.
In some embodiments, if the expected insertion direction (e.g., the ideal entry direction of the ideal entry point of the intramedullary nail) is obtained by analyzing the medical image, the device performing the image analysis may determine, from the displayed angle markers, the angle marker parallel to the identified expected insertion direction, and take the direction of that angle marker as the expected insertion direction. It may likewise determine, from the displayed angle markers, the angle marker parallel to the axial direction of the navigation assembly 200, and take the angle between the two angle markers as the angle between the specified direction of the navigation assembly 200 (i.e., of its sensor) at the reference position and the expected insertion direction at the target position (e.g., the ideal entry direction of the ideal entry point of the intramedullary nail). If the device performing the image processing is different from the navigation assembly 200, it may transmit the included angle θ to the navigation assembly 200 in a wired or wireless manner, or medical staff may input the included angle θ into the navigation assembly 200 after learning it.
In the case where distance marks are visualized in the medical image, the target offset position information may include: the distance between the navigation assembly 200 (i.e., its sensor) at the reference position and the intended insertion position of the target position (e.g., the ideal nail insertion position at the ideal insertion point of the intramedullary nail).
For example, after determining the desired insertion position by viewing the medical image, the medical personnel may, with reference to the displayed distance marks, identify the distance mark at the desired insertion position and the distance mark at the reference position, and take the distance between these two marks as the distance between the navigation assembly 200 (i.e., its sensor) at the reference position and the intended insertion position of the target position. In the case where the distance marks are equidistantly arranged, this distance can be determined from the number of distance marks between the two marks and the spacing between adjacent marks. After determining the distance, the medical personnel may record it themselves for subsequent use, or input it to the navigation assembly 200 via an input device or other means.
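Where the distance marks are equidistant, the distance described above reduces to counting intervals. A minimal sketch of that arithmetic (Python; the function name and the 5 mm spacing in the example are illustrative assumptions, not values from this application):

```python
def distance_from_marks(n_intervals: int, spacing_mm: float) -> float:
    """Distance between two distance marks separated by n_intervals
    equally spaced intervals of spacing_mm each."""
    if n_intervals < 0 or spacing_mm <= 0:
        raise ValueError("interval count must be >= 0 and spacing positive")
    return n_intervals * spacing_mm

# e.g. the mark at the reference position and the mark at the intended
# insertion position are 4 intervals apart at 5 mm per interval:
print(distance_from_marks(4, 5.0))  # 20.0
```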
The second displacement deviation may comprise: the distance, in the direction perpendicular to the horizontal plane, between the real-time position of the navigation assembly 200 (i.e., its sensor) and the intended insertion position (e.g., the ideal nail insertion position at the ideal insertion point of the intramedullary nail).
The intended insertion position may be determined based on a predetermined insertion area, i.e., an area used to determine the intended insertion position. Taking the target position as the ideal insertion point of a proximal femoral intramedullary nail as an example, the predetermined insertion area may be the apex ridge contour of the greater trochanter. The specific position information of the intended insertion position may be determined during movement of the orthopaedic surgical tool to which the navigation assembly 200 (i.e., its sensor) is fixed.
In some embodiments, the height value of the intended insertion position in the direction perpendicular to the horizontal plane is determined from a first reference height value and a second reference height value. The first reference height value is the height of the navigation assembly 200 (i.e., its sensor) perpendicular to the horizontal plane when the insertion end of the orthopaedic surgical tool is located at a first side vertex of the predetermined insertion area; the second reference height value is the corresponding height when the insertion end is located at a second side vertex of the predetermined insertion area. Taking the target position as the ideal insertion point of a proximal femoral intramedullary nail as an example, the first side vertex may be the most ventral vertex of the greater trochanter apex ridge contour, and the second side vertex may be the most dorsal vertex of that contour.
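The application states only that both reference heights are used; one plausible rule is to take the midpoint of the heights recorded at the ventral and dorsal vertices. A hedged sketch under that assumption (Python; the midpoint rule and function name are illustrative, not specified in the application):

```python
def intended_insertion_height(h_first_vertex: float,
                              h_second_vertex: float) -> float:
    """Illustrative rule: intended insertion height taken as the midpoint
    of the first reference height (most ventral vertex) and the second
    reference height (most dorsal vertex) of the apex ridge contour."""
    return (h_first_vertex + h_second_vertex) / 2.0

# Sensor heights recorded at the two side vertices (arbitrary units):
print(intended_insertion_height(12.0, 18.0))  # 15.0
```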
In some embodiments, the sensor of the navigation assembly 200 can also be used to measure a pitch angle, wherein the pitch angle is the angle between the bottom surface of the navigation assembly 200 and the horizontal plane, as shown in fig. 16.
In some embodiments, the navigation assembly 200 is also used to prompt (e.g., display) the pitch angle measured by the sensor.
In some embodiments, the navigation assembly 200 is also used to provide error prompt information.
In some embodiments, the error prompt information is generated when a cumulative duration is greater than a preset duration. The cumulative duration is timed from the moment the reference position information was recorded or from the moment the sensor was last calibrated. When a sensor has been in use for too long, errors may accumulate; therefore the cumulative duration is counted from one of those moments, and when it exceeds the preset duration the sensor is considered to have drifted and error prompt information can be provided.
In some embodiments, the error prompt information is generated when a calculated error is greater than an error threshold. During use of the sensor, the navigation assembly 200 can simultaneously monitor and calculate the sensor error, and generates the error prompt information when the calculated error exceeds the error threshold. The error threshold may be set according to the accuracy of the sensor and is not specifically limited in the embodiments of the present application.
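The two triggers described above, cumulative use time exceeding a preset duration and a monitored error exceeding a threshold, can be sketched as follows (Python; the class and parameter names are illustrative assumptions, not from the application):

```python
import time

class SensorErrorMonitor:
    """Generates an error prompt when either the cumulative duration since
    the reference position was recorded (or the sensor was last calibrated)
    exceeds a preset duration, or a calculated error exceeds a threshold."""

    def __init__(self, preset_duration_s: float, error_threshold: float):
        self.preset_duration_s = preset_duration_s
        self.error_threshold = error_threshold
        self.start_time = time.monotonic()  # timer starts when reference recorded

    def recalibrate(self) -> None:
        # Calibrating the sensor (or re-recording the reference position)
        # restarts the cumulative-duration timer.
        self.start_time = time.monotonic()

    def needs_error_prompt(self, calculated_error: float) -> bool:
        cumulative = time.monotonic() - self.start_time
        return (cumulative > self.preset_duration_s
                or calculated_error > self.error_threshold)
```

Either trigger alone suffices; on a real device the prompt could be presented locally or forwarded to an external prompt device.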
The error prompt information provided by the navigation assembly 200 may be presented by the navigation assembly 200 itself, or the navigation assembly 200 may send it to an external prompt device for presentation.
Upon learning of the error prompt information, the medical personnel know that the sensor error has become large, and can therefore initialize or recalibrate the sensor and re-determine the reference position information, thereby improving the accuracy of medical navigation. In some embodiments, the medical personnel may instead directly select another navigation assembly 200 containing a different sensor for medical navigation.
Taking an intramedullary nail operation as an example and building on the examples above, a medical navigation procedure based on the medical navigation system 10 is illustrated below in connection with a specific operative workflow. In the following embodiments, the first relative position information and/or the second relative position information are described as being determined by the navigation assembly 200 itself. It should be understood that these embodiments do not limit the types of orthopaedic operations to which the medical navigation system 10 of the present application may be applied; in other orthopaedic operations, similar procedures may be performed based on the same manner in which the medical navigation system 10 performs medical navigation.
Fractures of the proximal femur (such as intertrochanteric fractures) and diaphyseal fractures of long bones such as the tibia or humerus are often treated by intramedullary nail internal fixation to facilitate fracture healing. In intramedullary nail fixation, the choice of insertion position and insertion direction is critical to the entire operation. This is because the insertion position and direction affect the later reaming of the medullary canal and the placement position and orientation of the main nail. A poor choice can make the main nail difficult to advance into the medullary canal; even if it is forced in, the main nail may be unevenly stressed and significantly deformed within the canal, easily compromising the fracture reduction. With a long main nail, inaccurate distal locking becomes more likely, and in severe cases delayed union, nonunion, or malunion of the fracture may result.
Taking the insertion point of a proximal femoral intramedullary nail as an example, the clinically ideal insertion position is located at the apex of the greater trochanter near the central axis of the femoral neck, as shown in fig. 17, and the ideal insertion direction deviates 5° laterally from the femoral shaft axis to accommodate the lateral offset angle of the main nail, as shown in fig. 18.
During the operation, the insertion position and direction are usually confirmed by driving a guide pin: the entry position of the guide pin becomes the insertion position of the later main nail, and its direction becomes the insertion direction of the main nail. In existing practice, to ensure that the driven guide pin reaches a satisfactory insertion position and direction, either fluoroscopy is performed continuously throughout the insertion of the guide pin to view the position and direction of the needle tip in real time, which increases the radiation exposure of both patient and physician; or the doctor first drives a guide pin toward the insertion point by palpating the apex of the greater trochanter, then performs fluoroscopy and checks the insertion position and direction on the image. If they are not ideal, the doctor estimates from the fluoroscopic image the position and angle to be adjusted, drives a second guide pin without withdrawing the first, and again checks the second guide pin's position and direction under fluoroscopy, repeating these steps until fluoroscopy confirms that a driven guide pin has reached a satisfactory insertion position and direction.
However, a guide pin driven for the first time based on subjective experience rarely reaches the ideal insertion point of the intramedullary nail. The subsequent adjustment based on fluoroscopic results greatly prolongs the operation, the amounts of positional and angular adjustment cannot be controlled precisely, and the number of fluoroscopic exposures and the radiation dose to patient and doctor both increase.
Accordingly, embodiments of the present application provide a medical fluoroscopy device 100 and a medical navigation system 10 comprising the medical fluoroscopy device 100 and a navigation assembly 200. The medical fluoroscopy device 100 is provided with a visualization and positioning assembly 110, or provides a visualization and positioning function, for obtaining medical images with position marks, and the navigation assembly 200 may be used to guide the movement of an orthopaedic surgical tool, for example to the ideal insertion point of an intramedullary nail.
A sensor is integrated in the navigation assembly 200; it may be a six-degree-of-freedom inertial sensor for measuring the positional deviation (x, y, z) and angular deviation (α, β, γ) of the navigation assembly 200 relative to a reference position. The sensor's own coordinate system is defined as shown in fig. 12. After the reference position information of the sensor is set and the sensor is moved, the sensor can measure the relative position information of its real-time position with respect to the reference position, that is, the displacements (x, y, z) along the 3 coordinate axes and the angular deviations (α, β, γ) of the current axis directions from the reference axis directions. Since the sensor is integrated inside the navigation assembly 200, the reference position information of the sensor can serve as the reference position information of the navigation assembly 200, and its real-time position information as the real-time position information of the navigation assembly 200; the reference position information determines the reference position of the sensor and/or navigation assembly 200, and the real-time position information determines their real-time position. In some embodiments the sensor may also measure the pitch angle of the bottom surface of the navigation assembly 200 relative to the absolute horizontal plane. In other embodiments, a level meter may also be integrated in the navigation assembly 200, by which that pitch angle is measured.
In the navigation assembly 200 shown in fig. 11 and 13, a digital display screen (hereinafter, the screen) is provided on which 3 values are displayed. The symbol "Δ" is a displacement deviation symbol indicating that the value adjacent to it is a displacement deviation (for example, the first or second displacement deviation described above), and the symbol "°" indicates that the value adjacent to it is an angle, namely an angle deviation (for example, the first or second angle deviation described above) or the pitch angle.
For example, in fig. 13, "Δ3" indicates that the first displacement deviation between the real-time position of the sensor and/or navigation assembly 200 and the reference position is 3, or that the second displacement deviation between the real-time position and the intended insertion position (e.g., the ideal nail insertion position of the intramedullary nail) is 3. "18°" indicates that the first angle deviation between the real-time position and the reference position is 18°, or that the second angle deviation between the real-time position and the intended insertion direction (e.g., the ideal nail insertion direction of the intramedullary nail) is 18°, and "4°" indicates that the pitch angle is 4°.
The first displacement deviation, second displacement deviation, first angle deviation, and second angle deviation may be determined from the coordinate systems of the sensor and/or navigation assembly 200 at the reference position and at the real-time position. The coordinate system of the sensor at the reference position is referred to as the reference coordinate system ψ, and the coordinate system of the sensor at the real-time position is referred to as the current coordinate system ψ′. The four deviations can then be obtained by calculating the position deviation and angle deviation of the current coordinate system ψ′ relative to the reference coordinate system ψ.
For example, taking the first displacement deviation as an example, the projection onto the Z axis of the reference coordinate system ψ of the displacement of the origin of the current coordinate system ψ′ relative to the origin of ψ is taken as the height difference ΔH, which is the first displacement deviation, as shown in fig. 14. The angle between the projection of the X axis of ψ′ (the coordinate axis along the axial direction of the navigation assembly 200) onto the XOY plane of ψ and the X axis of ψ is taken as the external offset angle, which is the first angle deviation, as shown in fig. 15. As another example, the angle of the plane of the bottom surface of the navigation assembly 200 relative to the absolute horizontal plane is taken as the pitch angle, which is positive when the tail (narrow end) tilts up and negative when the head (wide end) tilts up, as shown in fig. 16.
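Given the origin displacement and the current X-axis direction, both expressed in the reference coordinate system ψ, the height difference ΔH and the external offset angle reduce to a projection and a planar angle. A sketch of that geometry (Python; the vector representation is an assumption made for illustration):

```python
import math

def height_difference(displacement_in_ref):
    """ΔH: projection of the displacement of the origin of ψ' relative to
    the origin of ψ onto the Z axis of ψ, i.e. the z component when the
    displacement is expressed in reference coordinates."""
    return displacement_in_ref[2]

def external_offset_angle_deg(x_axis_in_ref):
    """Angle between the projection of the X axis of ψ' onto the XOY plane
    of ψ and the X axis of ψ, in degrees."""
    x, y, _z = x_axis_in_ref
    return math.degrees(math.atan2(y, x))

# Example: sensor raised by 3 units and yawed 18° about the reference Z axis.
disp = (1.0, 2.0, 3.0)
x_axis = (math.cos(math.radians(18.0)), math.sin(math.radians(18.0)), 0.0)
print(height_difference(disp))                      # 3.0
print(round(external_offset_angle_deg(x_axis), 1))  # 18.0
```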
Since intramedullary nails at different sites, such as femoral, tibial, and humeral intramedullary nails, have similar insertion points, a proximal femoral intramedullary nail operation is described in detail below as an example.
The patient is usually supine during a proximal femoral intramedullary nail procedure. After the skin incision and before the guide pin is driven, the navigation assembly 200 is placed at a reference position, for example on the surface of the patient's proximal femur.
After the navigation assembly 200 is placed at the reference position, reference position information for the reference position may be recorded.
In some embodiments, the reference position information may be recorded in response to a trigger instruction issued by the user. For example, a medical professional (e.g., a doctor) enters a trigger instruction via the navigation assembly 200; the navigation assembly 200 receives the trigger instruction and records the position information measured by the sensor at that moment as the reference position information. Alternatively, after receiving the trigger instruction, the navigation assembly 200 may first initialize the sensor and record the initialized position information measured by the sensor (i.e., the sensor's initial position information) as the reference position information.
The recorded reference position information may include a reference height of the sensor relative to the horizontal plane and a designated direction of the sensor (e.g., the axial direction of the navigation assembly 200). Taking the reference position information as zero position information as an example, the height output by the sensor is 0 at this moment. If the navigation assembly 200 prompts the first relative position information in real time, the height difference ΔH on its screen reads 0, the external offset angle reads 0°, and the pitch angle is close to 0°.
With the navigation assembly 200 placed at this reference position, a medical professional takes an anteroposterior fluoroscopic image of the proximal femur using the medical fluoroscopy device 100 to obtain a medical image in which the site to be operated on, such as the proximal femur, and at least part of the structure (e.g., the axial direction) of the navigation assembly 200 are visualized.
The visualization and positioning assembly may be placed on the fluoroscopy path between the radiation transmitting end and the radiation receiving end of the medical fluoroscopy device 100, and the medical fluoroscopy device 100 then takes the image, yielding a medical image with position marks. Alternatively, after an original medical image (i.e., one without position marks) is obtained by the medical fluoroscopy device 100, position marks are superimposed on it via the visualization and positioning function provided by the medical fluoroscopy device 100, yielding a medical image with position marks.
After the medical image is obtained, an angle mark parallel to the expected insertion direction (e.g., the ideal nail insertion direction of the intramedullary nail) and an angle mark parallel to the axial direction of the navigation assembly 200 may be selected from the position marks of the medical image, and the angle value X between these two angle marks determined. This angle value X is also the included angle θ between the axial direction of the navigation assembly 200 and the expected insertion direction, and the included angle θ is the target offset position information between the target position (e.g., the ideal insertion point of the intramedullary nail) and the reference position of the sensor.
When performing bone surgery, the doctor clearly knows where the operation is expected to be performed: for example, the ideal insertion position of the intramedullary nail is located at the apex of the greater trochanter near the central axis of the femoral neck, and the ideal insertion direction deviates 5° laterally from the femoral shaft axis to accommodate the lateral offset angle of the main nail.
In some embodiments, by viewing the medical image the doctor can quickly determine the ideal insertion direction in the image and find the angle mark parallel to it; from the displayed navigation assembly 200 the doctor can determine its axial direction and find the angle mark parallel to that direction. The angle value X between the two angle marks is then the included angle θ between the axial direction of the navigation assembly 200 at the reference position and the ideal nail insertion direction of the intramedullary nail, so the included angle θ can be obtained quickly and conveniently.
In other embodiments, by viewing the medical image the doctor can quickly determine the femoral shaft axis in the image and find the angle mark parallel to it; from the displayed navigation assembly 200 the doctor can determine its axial direction and find the angle mark parallel to that direction. The angle value X between the two angle marks, plus the 5° angle between the femoral shaft axis and the ideal nail insertion direction of the intramedullary nail, gives the included angle θ.
Thus, by direct observation, the doctor can determine the included angle θ between the expected insertion direction (e.g., the ideal nail insertion direction of the intramedullary nail) and the axial direction of the navigation assembly 200 at the reference position, i.e., the target offset position information, without complex image processing. As shown in fig. 21, the position mark observed to be parallel to the femoral shaft axis is angle mark 202, and the position mark parallel to the axial direction of the navigation assembly 200 at the reference position is angle mark 201. If the angle between two adjacent angle marks is 5° and the angle X between angle mark 201 and angle mark 202 is 20°, the included angle θ can be determined to be 25°.
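The fig. 21 reading can be reproduced with simple arithmetic (Python; the equiangular 5° mark spacing and the function name are assumptions for illustration, matching the example above):

```python
def included_angle_theta(mark_intervals: int, mark_spacing_deg: float,
                         shaft_to_ideal_deg: float = 5.0) -> float:
    """Angle X between the mark parallel to the femoral shaft axis and the
    mark parallel to the navigation assembly's axial direction, plus the
    5° offset between the shaft axis and the ideal insertion direction."""
    angle_x = mark_intervals * mark_spacing_deg
    return angle_x + shaft_to_ideal_deg

# fig. 21 example: 4 intervals of 5° give X = 20°, so θ = 25°.
print(included_angle_theta(4, 5.0))  # 25.0
```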
In some embodiments, the desired insertion direction (e.g., the ideal nail insertion direction of the intramedullary nail) and the included angle θ may also be determined by medical personnel by marking and measuring on the medical image.
For example, in some embodiments, the intended insertion direction (e.g., the ideal nail insertion direction of the intramedullary nail) may be manually marked on the medical image by a medical professional (e.g., a doctor). Using a ruler or another measuring tool capable of judging parallelism between lines, the angle mark parallel to the marked ideal insertion direction and the angle mark parallel to the axial direction of the navigation assembly 200 are found, and the angle X between these two angle marks is determined as the included angle θ between the axial direction of the navigation assembly 200 at the reference position and the ideal nail insertion direction of the intramedullary nail.
As another example, a medical professional (e.g., a doctor) can manually mark the femoral shaft axis on the medical image in the manner of a technical drawing and, using a ruler or another measuring tool capable of judging parallelism between lines, find the angle mark parallel to the marked femoral shaft axis and the angle mark parallel to the axial direction of the navigation assembly 200. Since the ideal insertion direction of a proximal femoral intramedullary nail deviates 5° laterally from the femoral shaft axis to accommodate the lateral offset angle of the main nail, the angle X between the two angle marks plus 5° is the included angle θ between the axial direction of the navigation assembly 200 at the reference position and the ideal nail insertion direction of the intramedullary nail.
As another example, the obtained medical image is imported into an image workstation, in which an image processing method automatically identifies the ideal nail insertion direction of the intramedullary nail in the image and the axial direction of the navigation assembly 200. From all the angle marks in the medical image, the angle mark parallel to the identified ideal insertion direction and the angle mark parallel to the axial direction of the navigation assembly 200 are analytically determined, and the angle X between these two angle marks is determined as the included angle θ between the axial direction of the navigation assembly 200 at the reference position and the ideal nail insertion direction of the intramedullary nail.
As another example, the obtained medical image is imported into an image workstation, in which an image processing method automatically identifies the femoral shaft axis in the image and the axial direction of the navigation assembly 200. From all the angle marks in the medical image, the angle mark parallel to the identified femoral shaft axis and the angle mark parallel to the axial direction of the navigation assembly 200 are analytically determined. Adding the angle (e.g., 5°) between the femoral shaft axis and the ideal nail insertion direction of the intramedullary nail to the angle X between these two angle marks then gives the included angle θ between the axial direction of the navigation assembly 200 at the reference position and the ideal nail insertion direction of the intramedullary nail.
The image processing method for identifying the ideal nail insertion direction of the intramedullary nail and/or the femoral shaft axis in the medical image may be implemented using a pattern recognition algorithm or an intelligent deep-learning algorithm, and is not specifically limited in the embodiments of the present application.
After the target offset position information is obtained, for example the included angle θ between the axial direction of the navigation assembly 200 at the reference position and the ideal nail insertion direction of the intramedullary nail, consider the case where the medical professional (e.g., a doctor) observes and determines the included angle θ. If the first relative position information is prompted during medical navigation, the medical professional can record the included angle θ or input it into the navigation assembly 200. If the second relative position information is prompted during medical navigation, the medical professional needs to input the included angle θ into the navigation assembly 200.
In some embodiments, taking the example of determining the included angle θ by way of image processing, the device performing the image processing may be the navigation assembly 200 or the medical fluoroscopy device 100 itself, or may be a device different from the navigation assembly 200 and the medical fluoroscopy device 100.
If the first relative position information is prompted during medical navigation, the device performing the image processing can prompt the included angle θ so that the medical professional (e.g., a doctor) learns of it. If the device performing the image processing is different from the navigation assembly 200 and the medical fluoroscopy device 100, it may transmit the included angle θ to the navigation assembly 200 in a wired or wireless manner, or the medical professional may learn of the included angle θ and then input it into the navigation assembly 200.
If the second relative position information is prompted during medical navigation and the device performing the image processing is different from the navigation assembly 200 and the medical fluoroscopy device 100, that device needs to provide the included angle θ to the navigation assembly 200, for example by transmitting it in a wired or wireless manner, or by presenting it to the medical professional, who then inputs it into the navigation assembly 200.
The navigation assembly 200 is then removed from the proximal femoral surface of the patient's affected side and secured to an orthopaedic surgical tool, such as a guide pin, a holder, or a drill; the embodiments described below take a guide pin as an example.
Taking freehand insertion of a guide pin as an example, a schematic diagram of fixing the navigation assembly 200 to the guide pin is shown in fig. 22; a groove may be formed at the bottom of the navigation assembly 200 to fix it directly to the guide pin, or the navigation assembly 200 may be fixed to the guide pin in other ways. If a holder or a drill is used to insert the guide pin, the navigation assembly 200 can be mounted directly on the holder or drill by magnetic attraction; a schematic diagram of fixing the navigation assembly 200 to a holder or drill is shown in fig. 23.
The patient lies supine, so that the apex ridge contour of the greater trochanter is in a generally vertical orientation; the physician can palpate it through the incision, as shown in phantom in the enlarged portion of fig. 24. The doctor holds the guide pin with the navigation assembly 200 fixed to it and keeps the axial direction of the guide pin approximately in the horizontal plane; the pitch angle value on the screen of the navigation assembly 200 can help the doctor check whether the guide pin is kept approximately horizontal.
The physician then moves the orthopaedic surgical tool with the navigation assembly 200 secured thereto, in combination with the prompting of the navigation assembly 200, to move the orthopaedic surgical tool to the desired intramedullary nail insertion point.
During movement of the orthopaedic surgical tool, the sensors of the navigation assembly 200 measure in real-time to obtain real-time location information.
Taking as an example navigating the orthopaedic surgical tool to the ideal intramedullary nail insertion point by prompting the first relative position information, the navigation process of the navigation assembly 200 is illustrated below.
During movement of the orthopaedic surgical tool, the navigation assembly 200 determines a height difference ΔH of the real-time position of the sensor relative to the reference position based on the real-time position information measured by the sensor.
The tip of the guide needle is placed on the point of the greater trochanter apex ridge profile closest to the ventral side, and the navigation assembly 200 determines, based on the real-time position information measured by the sensor, the height difference ΔH1 of the sensor's real-time position relative to the reference position (if the navigation assembly 200 has zeroed the sensor, the height difference ΔH1 is in practice the first reference height value H1).
The tip of the guide needle is then placed on the point of the greater trochanter apex ridge profile closest to the dorsal side, and the navigation assembly 200 determines, based on the real-time position information measured by the sensor, the height difference ΔH2 of the sensor's real-time position relative to the reference position (if the navigation assembly 200 has zeroed the sensor, the height difference ΔH2 is in practice the second reference height value H2).
The physician holds the guide needle approximately horizontal while its tip moves along the greater trochanter apex ridge profile, and the height difference ΔH displayed in real time by the navigation assembly 200 varies between ΔH1 and ΔH2. If the navigation assembly 200 has zeroed the sensor, the height H indicated in real time varies between H1 and H2.
If the doctor selects the midpoint of the greater trochanter apex ridge profile as the operation position point, then when the indicated height difference ΔH is (ΔH1+ΔH2)/2, or, with the sensor zeroed, when the indicated height value H is (H1+H2)/2, the guide needle tip is at the midpoint of the greater trochanter apex ridge profile, which is the ideal insertion position of the ideal intramedullary nail insertion point. Similarly, if the doctor selects the anterior one-third point of the greater trochanter apex ridge profile as the operation position point, the ideal insertion position of the ideal intramedullary nail insertion point is reached when the indicated height difference ΔH is (ΔH2-ΔH1)/3+ΔH1 (if the height difference is displayed as an integer value, it may be rounded up or down; the same applies in the following embodiments), or, with the sensor zeroed, when the indicated height value H is (H2-H1)/3+H1.
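Both cases are instances of the same interpolation between the two reference height values. A minimal sketch (hypothetical helper name; the units and tolerances are assumptions, not taken from the patent):

```python
def target_height(h_ventral: float, h_dorsal: float, fraction: float = 0.5) -> float:
    """Interpolate the target height between the two reference heights.

    h_ventral and h_dorsal are the values recorded at the ventral-most and
    dorsal-most points of the greater trochanter apex ridge profile (H1 and
    H2, or the height differences dH1 and dH2). fraction=0.5 yields the
    midpoint (H1+H2)/2; fraction=1/3 yields the anterior-third point
    (H2-H1)/3 + H1.
    """
    return h_ventral + fraction * (h_dorsal - h_ventral)
```

If the device displays the height as an integer value, the interpolated result would additionally be rounded up or down, as the text notes.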
In this way, the doctor can select the ideal insertion position of the ideal intramedullary nail insertion point according to the actual position requirement, so the method can accommodate the different insertion position requirements of different intramedullary nail products.
After the guide pin reaches the ideal insertion position of the ideal intramedullary nail insertion point, its angle can be adjusted so that the needle insertion direction of the guide pin reaches the ideal insertion direction of the ideal intramedullary nail insertion point.
During adjustment, the doctor keeps the guide needle tip fixed at the ideal insertion position of the ideal intramedullary nail insertion point and adjusts the outward deflection angle of the guide needle. The navigation assembly 200 determines and prompts, based on the real-time position information measured by the sensor, the outward deflection angle of the sensor's real-time position relative to the reference position. While the doctor keeps the guide needle tip fixed at the ideal insertion position and adjusts the outward deflection angle, the needle insertion direction of the guide needle reaches the ideal insertion direction of the ideal intramedullary nail insertion point when the outward deflection angle prompted by the navigation assembly 200 equals the previously determined included angle θ between the axial direction of the navigation assembly 200 and that ideal insertion direction, as shown in fig. 25. The doctor can also refer to the prompted pitch angle, appropriately adjust the anteversion angle of the guide needle, and then insert the guide needle.
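This first-variant completion check amounts to comparing the prompted outward deflection angle against the included angle θ. A sketch, with a hypothetical tolerance that the text does not specify:

```python
def direction_reached(prompted_outward_angle_deg: float,
                      theta_deg: float,
                      tol_deg: float = 1.0) -> bool:
    """The guide needle points along the ideal insertion direction when the
    prompted outward deflection angle matches the included angle theta
    obtained from the medical image (tol_deg is a hypothetical tolerance)."""
    return abs(prompted_outward_angle_deg - theta_deg) <= tol_deg
```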
Taking as an example navigating the orthopaedic surgical tool to the ideal intramedullary nail insertion point by prompting the second relative position information, the navigation process of the navigation assembly 200 is illustrated below.
During movement of the orthopaedic surgical tool, the navigation assembly 200 prompts the height value in the real-time position information measured by the sensor. The tip of the guide needle is placed on the point of the greater trochanter apex ridge profile closest to the ventral side. In some embodiments, the doctor issues a recording instruction by operating the navigation assembly 200, by voice input, or the like, and the navigation assembly 200 records the first reference height value h1 measured by the sensor at that moment; in other embodiments, the navigation assembly 200 prompts the height value and the medical staff (such as a doctor) note down the first reference height value h1 measured by the sensor at that moment themselves.
The tip of the guide needle is then placed on the point of the greater trochanter apex ridge profile closest to the dorsal side. In some embodiments, the doctor issues a recording instruction by operating the navigation assembly 200, by voice input, or the like, and the navigation assembly 200 records the second reference height value h2 measured by the sensor at that moment. In other embodiments, the navigation assembly 200 prompts the height value and the medical staff (such as a doctor) note down the second reference height value h2 measured by the sensor at that moment themselves.
Subsequently, if the navigation assembly 200 has recorded the first reference height value h1 and the second reference height value h2, it determines, based on h1 and h2, the height value H of the ideal insertion position of the ideal intramedullary nail insertion point in the direction perpendicular to the horizontal plane, and records this height value H. If the medical staff (such as a doctor) noted down h1 and h2 themselves, they determine the height value H themselves based on the noted values and then input the determined height value H into the navigation assembly 200, which records it.
If the doctor selects the midpoint of the greater trochanter apex ridge profile as the operation position point, the height value H of the ideal insertion position of the ideal intramedullary nail insertion point is (h1+h2)/2. Similarly, if the doctor selects the anterior one-third point of the greater trochanter apex ridge profile as the operation position point, the height value H of the ideal insertion position is (h2-h1)/3+h1.
The physician holds the guide needle approximately horizontal while its tip moves along the greater trochanter apex ridge profile; during this movement, the navigation assembly 200 combines the sensor's real-time position information with the recorded height value H to determine and prompt the second displacement deviation.
During navigation, when the second displacement deviation prompted in real time is 0, it is determined that the ideal insertion position of the ideal intramedullary nail insertion point has been reached.
In this way, once the ideal insertion position of the ideal intramedullary nail insertion point has been selected and its height recorded, the doctor can determine intuitively and conveniently, while moving the orthopaedic surgical tool, that the ideal insertion position has been reached whenever the second displacement deviation prompted in real time is 0.
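The second displacement deviation described above is simply the difference between the sensor's real-time height and the recorded height value H. A sketch with hypothetical names:

```python
def second_displacement_deviation(realtime_height: float, recorded_h: float) -> float:
    """Real-time height of the sensor minus the recorded target height H.
    A prompted value of 0 means the guide needle tip is at the ideal
    insertion position of the intramedullary nail insertion point."""
    return realtime_height - recorded_h
```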
After the guide needle reaches the ideal insertion position of the ideal intramedullary nail insertion point, its angle can be adjusted so that the needle insertion direction of the guide needle reaches the ideal insertion direction of the ideal intramedullary nail insertion point.
During adjustment, the doctor keeps the guide needle tip fixed at the ideal insertion position of the ideal intramedullary nail insertion point and adjusts the outward deflection angle of the guide needle. Based on the real-time position information measured by the sensor, the navigation assembly 200 determines in real time the second angle deviation of the sensor's real-time position relative to the target position, that is, the included angle between a specified direction (for example, the axial direction of the navigation assembly 200) when the sensor is at the real-time position and the ideal insertion direction of the ideal intramedullary nail insertion point, and prompts this second angle deviation.
While the doctor keeps the guide needle tip fixed at the ideal insertion position of the ideal intramedullary nail insertion point and adjusts the outward deflection angle of the guide needle, the needle insertion direction of the guide needle reaches the ideal insertion direction of the ideal intramedullary nail insertion point when the second angle deviation prompted by the navigation assembly 200 is 0. The doctor can then refer to the prompted pitch angle, appropriately adjust the anteversion angle of the guide needle, and insert it.
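In this second variant, position is reached when the second displacement deviation is 0 and direction is reached when the second angle deviation is 0. A sketch of a combined completion check, with hypothetical tolerances that the text does not specify:

```python
def at_ideal_pose(height_deviation: float, angle_deviation_deg: float,
                  h_tol: float = 0.5, a_tol_deg: float = 1.0) -> bool:
    """True when both prompted deviations are approximately zero, i.e. the
    guide needle is at the ideal insertion position and points along the
    ideal insertion direction. h_tol and a_tol_deg are hypothetical
    tolerances in the height unit and in degrees."""
    return abs(height_deviation) <= h_tol and abs(angle_deviation_deg) <= a_tol_deg
```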
With the medical navigation system 10 provided by the embodiments of the present application, the first height deviation and/or first angle deviation and/or second height deviation and/or second angle deviation of the real-time position relative to the reference position or the target position are tracked and prompted based on the sensor, so that the ideal insertion position and ideal insertion direction of the ideal intramedullary nail insertion point can be selected and then accurately located during surgery. Real-time position and angle information is provided to the doctor, so that the guide needle can be driven in successfully in a single attempt rather than repeatedly. Moreover, the navigation assembly 200 provided by the embodiments of the present application is small and easy to use, can greatly shorten the operation time, reduces the X-ray exposure of doctors and patients, and improves the quality and efficiency of the operation.
Based on the above embodiments, the present application further provides a medical image processing method. As shown in fig. 26, in some embodiments the medical image processing method includes:
Step S101: acquiring a medical image with a position identifier, the medical image being an image obtained when the navigation assembly is at a reference position and comprising an image of the surgical site and an image of at least part of the navigation assembly.
Step S102: obtaining target offset position information based on the position identifier in the medical image, the target offset position information being information indicating the relative positional relationship between the reference position at which the navigation assembly is located and the target position.
The target offset position information is used, in combination with the real-time position information of the navigation assembly fixed to the orthopaedic surgical tool, to determine second relative position information of the navigation assembly's real-time position relative to the target position; the second relative position information can be used to assist in guiding the orthopaedic surgical tool to the target position, the real-time position being determined by the real-time position information. And/or, the target offset position information is used, in combination with first relative position information of the real-time position of the navigation assembly fixed to the orthopaedic surgical tool relative to the reference position, to assist in guiding the orthopaedic surgical tool to the target position.
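As a minimal sketch (hypothetical field names, using a simple height-plus-angle representation assumed here rather than taken from the patent), deriving the second relative position information from the first relative position information and the target offset could look like:

```python
from dataclasses import dataclass

@dataclass
class Offset:
    height: float     # offset perpendicular to the horizontal plane
    angle_deg: float  # outward deflection angle component, degrees

def second_relative(first_relative: Offset, target_offset: Offset) -> Offset:
    """Second relative position information (real-time vs. target), obtained
    by subtracting the target offset (target vs. reference) from the first
    relative position information (real-time vs. reference)."""
    return Offset(first_relative.height - target_offset.height,
                  first_relative.angle_deg - target_offset.angle_deg)
```

Both components of the result being 0 then corresponds to the tool having reached the target position.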
For a specific implementation manner of the medical image processing method, reference may be made to the description of the medical perspective device and/or the medical navigation system in the foregoing embodiments, which is not repeated herein.
It should be understood that, although the steps in the flowcharts referred to above are shown in an order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts may include a plurality of sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments; their order of execution is not necessarily sequential, and they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
In one embodiment, a computer device is provided, which may be a terminal; its internal structure may be as shown in fig. 27. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. The processor provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless mode may be implemented through Wi-Fi, an operator network, NFC (near field communication), or other technologies. The computer program, when executed by the processor, implements a medical image processing method as described above. The display screen of the computer device may be a liquid crystal display or an electronic ink display, and the input device may be a touch layer covering the display screen, keys, a trackball, or a touchpad provided on the housing of the computer device, or an external keyboard, touchpad, or mouse.
In an embodiment, a computer device is also provided, comprising a memory and a processor, the memory having stored therein a computer program, which processor, when executing the computer program, implements the steps of the medical image processing method described above.
In an embodiment, a computer readable storage medium is provided, storing a computer program which, when executed by a processor, implements the steps of the medical image processing method described above.
In one embodiment, a computer program product or computer program is provided that includes computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the computer device performs the steps in the medical image processing method described above.
Those skilled in the art will appreciate that implementing all or part of the above methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium; when executed, the program may include the flows of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, and the like. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, any combination of these technical features should be considered within the scope of this description as long as the combination contains no contradiction.
The above examples express only a few embodiments of the application; their description is relatively specific and detailed, but should not be construed as limiting the scope of the application. It should be noted that those skilled in the art can make several variations and improvements without departing from the concept of the application, all of which fall within the protection scope of the application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (10)

1. A medical perspective device, wherein the medical perspective device is configured to generate a medical image in which a location identity exists;
the medical perspective device includes a visualization positioning assembly and/or the medical perspective device provides a visualization positioning function;
the development positioning component comprises a position mark that can be visually displayed in a medical image; during fluoroscopy by the medical perspective device, the development positioning component is located on the fluoroscopy path between the ray emitting end and the ray receiving end of the medical perspective device, and the medical perspective device obtains the medical image with the position mark through fluoroscopy;
and the developing and positioning function means that the medical perspective device superimposes a position mark on the obtained original medical image to obtain the medical image with the position mark.
2. The device of claim 1, wherein the location identity comprises an angle identity and/or a distance identity;
the developing and positioning assembly and/or the medical image with the position mark comprise a plurality of angle marks, wherein the angle marks comprise reference angle marks and non-reference angle marks arranged on at least one side of the reference angle marks;
the developing and positioning assembly and/or the medical image with the position marks comprise at least one group of distance marks, and a plurality of distance marks in the same group of distance marks are equidistantly arranged.
3. The apparatus of claim 2, wherein the reference angle indicator is disposed perpendicular to the distance indicator.
4. The apparatus of claim 2, wherein said imaging positioning assembly and/or said medical image of said presence-location indicator comprises two sets of said distance indicators, said two sets of said distance indicators being arranged vertically.
5. The apparatus of any one of claims 1 to 4, comprising any one of:
the development positioning component is arranged at the ray emission end of the medical perspective equipment;
the development positioning component is arranged at the ray receiving end of the medical perspective equipment;
the ray receiving end of the medical perspective equipment comprises an image intensifier, and the position mark of the developing and positioning assembly is positioned on an image input screen of the image intensifier.
6. The apparatus according to any one of claims 1 to 4, wherein the visualization positioning function is configured to superimpose a preset layer on the original medical image for the medical fluoroscopy apparatus, and obtain a medical image with the presence-location identifier; wherein, there is a position mark in the preset layer.
7. A medical navigation system, comprising: a navigation assembly, a medical fluoroscopy device according to any one of claims 1 to 6;
at least part of the navigation component is capable of being visually visualized in a medical image;
a medical image of a presence location identifier generated by the medical perspective device, the medical image of the presence location identifier comprising an image of a surgical site and an image of at least a portion of the navigation assembly, the medical image being an image obtained when the navigation assembly is in a reference position; the position identifier is used for determining target offset position information, and the target offset position information is used for indicating the relative position relationship between the reference position and the target position;
The navigation component is used for prompting the first relative position information and/or the second relative position information; the first relative position information is the relative position information between the real-time position of the navigation component and the reference position, and the real-time position is determined by the real-time position information; the second relative position information is the relative position information between the real-time position and the target position, and is determined by the real-time position information and the target offset position information;
the first relative position information, in combination with the target offset position information, and/or the second relative position information, may be used to assist in guiding an orthopaedic surgical tool secured with the navigation assembly to move to the target location.
8. The system of claim 7, wherein the target location comprises an ideal intramedullary nail insertion point.
9. A medical image processing method, the method comprising:
acquiring a medical image with a position mark, wherein the medical image with the position mark is an image obtained when a navigation component is at a reference position, and comprises an image of a surgical site and at least part of the image of the navigation component;
Obtaining target offset position information based on a position identifier in the medical image with the position identifier, wherein the target offset position information is information indicating a relative position relationship between a reference position where the navigation component is located and a target position;
the target offset position information is used for combining the real-time position information of the navigation component fixed with the orthopedic operation tool to determine second relative position information of the real-time position of the navigation component relative to the target position, the second relative position information can be used for assisting in guiding the orthopedic operation tool to move to the target position, and the real-time position is determined by the real-time position information; and/or the target offset position information is used for combining first relative position information of the real-time position of the navigation component fixed with the orthopedic operation tool relative to the reference position, and assisting in guiding the orthopedic operation tool to move to the target position.
10. A computer device comprising a processor and a memory, the memory storing a computer program, characterized in that the computer program, when executed by the processor, causes the processor to carry out the steps of the method of claim 9.
CN202211549627.3A 2022-03-04 2022-12-05 Medical perspective device, medical navigation system, and medical image processing method Pending CN116687437A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CNPCT/CN2022/079413 2022-03-04
CN2022079413 2022-03-04

Publications (1)

Publication Number Publication Date
CN116687437A true CN116687437A (en) 2023-09-05

Family

ID=87524146

Family Applications (5)

Application Number Title Priority Date Filing Date
CN202280007392.7A Pending CN116887776A (en) 2022-03-04 2022-11-09 Medical navigation device, navigation processing device and method, and medical navigation system
CN202211399149.2A Pending CN116740309A (en) 2022-03-04 2022-11-09 Medical image processing system, medical image processing method and computer equipment
CN202222980572.3U Active CN219501199U (en) 2022-03-04 2022-11-09 Developing structure and bone surgery tool navigation device
CN202280007393.1A Pending CN116887775A (en) 2022-03-04 2022-11-09 Medical navigation device, navigation processing device and method, and medical navigation system
CN202211549627.3A Pending CN116687437A (en) 2022-03-04 2022-12-05 Medical perspective device, medical navigation system, and medical image processing method


Country Status (2)

Country Link
CN (5) CN116887776A (en)
WO (2) WO2023165157A1 (en)


Also Published As

Publication number Publication date
CN116887775A (en) 2023-10-13
CN116887776A (en) 2023-10-13
WO2023165158A1 (en) 2023-09-07
CN116740309A (en) 2023-09-12
WO2023165157A1 (en) 2023-09-07
CN219501199U (en) 2023-08-11


Legal Events

Date Code Title Description
PB01 Publication