CN112043381A - Operation navigation system and identification method thereof - Google Patents


Info

Publication number
CN112043381A
Authority
CN
China
Prior art keywords
visual
navigation system
navigator
error
error control
Prior art date
Legal status
Pending
Application number
CN202010859848.5A
Other languages
Chinese (zh)
Inventor
王利峰
刘洪澎
沈晨
孙贝
Current Assignee
Yake Wisdom (Beijing) Technology Co., Ltd.
Original Assignee
Yake Wisdom (Beijing) Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Yake Wisdom (Beijing) Technology Co., Ltd.
Priority to CN202010859848.5A
Publication of CN112043381A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The invention relates to the technical field of computer vision and provides a surgical navigation system and an identification method thereof. The surgical navigation system includes: a visual marker substrate, on whose outer surface visual feature points are arranged along the axial and circumferential directions of the substrate; and a visual navigator provided with an error control coding module, the visual marker substrate being electrically connected to the error control coding module. The visual feature points are arranged uniformly along the axial and circumferential directions of the visual marker substrate, the visual navigator detects the spatial coordinates of the visual feature points within its field of view in real time, and the error control coding module applies error-correction coding to the detected spatial coordinates, so that the spatial position and posture of the visual marker substrate are identified accurately and the detection precision and reliability are improved.

Description

Operation navigation system and identification method thereof
Technical Field
The invention relates to the technical field of computer vision, in particular to a surgical navigation system and an identification method thereof.
Background
With the development of computer technology, surgical navigation has brought revolutionary changes to surgery by virtue of its precision, flexibility and minimal invasiveness, leading modern medicine from intuition-based treatment toward precision medicine. Intraoperative guidance based on visual navigation technology can assist a surgeon in performing an operation accurately according to a predetermined surgical plan and can minimize the risk of accidental injury to adjacent important anatomical structures.
The key step of visual navigation technology is to identify the spatial position and posture of a visual marker accurately and reliably; a visual or electromagnetic navigator is generally used as the key detection component of a surgical navigation system to detect the spatial position and posture of the visual marker within its field of view. In use, the vision sensor is affected by environmental factors such as illumination conditions, electromagnetic interference and visual occlusion, which can cause navigation deviation, impair surgical accuracy and even lead to medical accidents. The visual marker structure in a conventional surgical navigation system is too simple and carries no redundant information, so its recognition is easily affected by the environment and is not sufficiently reliable.
Disclosure of Invention
The present invention is directed to solving at least one of the problems in the prior art. To this end, the invention provides a surgical navigation system to address the problems of low detection precision and poor reliability of existing visual navigators.
The invention also provides an identification method of the surgical navigation system.
A surgical navigation system according to an embodiment of a first aspect of the present invention includes:
a visual marker substrate, wherein visual feature points are arranged on the outer surface of the visual marker substrate along the axial direction and the circumferential direction of the visual marker substrate; and
a visual navigator provided with an error control coding module, wherein the visual marker substrate is electrically connected to the error control coding module.
According to one embodiment of the invention, the outer side of the visual marker substrate is provided with a groove, and the visual feature points are detachably arranged in the groove.
According to one embodiment of the invention, a magnet is arranged in the groove, and the mounting end of the visual feature point is in adsorption fit with the magnet.
According to one embodiment of the invention, the detection end of the visual feature point is provided with a light reflecting member.
According to one embodiment of the invention, the detection end of the visual feature point is provided with a countersunk hole, and the light reflecting piece is arranged in the countersunk hole.
According to one embodiment of the present invention, the system further includes a surgical instrument, the surgical instrument being coupled to the visual marker substrate.
According to one embodiment of the invention, the visual navigator further comprises a display screen electrically connected with the error control coding module.
An identification method for the surgical navigation system according to an embodiment of a second aspect of the invention comprises the following steps:
arranging the visual marker substrate at a position of the human body to be detected;
acquiring, through the visual navigator, spatial coordinate point information of the visual feature points in a visual coordinate system;
performing error-correction coding processing on the information acquired by the visual navigator through the error control coding module; and
displaying, through the visual navigator, the spatial position and posture of the visual marker substrate in the visual coordinate system as calculated by the error control coding module.
According to an embodiment of the invention, in the step of performing error-correction coding processing on the information acquired by the visual navigator through the error control coding module, the error control coding module screens all the coordinate point information and groups three or more collinear coordinate points into an information code group.
According to an embodiment of the present invention, in the step of performing error-correction coding processing on the information acquired by the visual navigator through the error control coding module, a supervisory code is added to each information code group by the error control coding module, and the error-correction coding processing is carried out.
One or more technical solutions in the embodiments of the present invention have at least one of the following technical effects:
the visual characteristic points are uniformly arranged along the axial direction and the circumferential direction of the visual marking matrix, the visual navigator detects the space coordinates of the visual characteristic points in the visual field range of the visual navigator in real time, and the error control coding module carries out error correction coding processing on the detected space coordinates, so that the accurate space position and posture of the visual marking matrix are identified, and the detection precision and reliability are improved.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic structural diagram of a surgical navigation system according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of the visual marker substrate in the surgical navigation system according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a visual feature point in the surgical navigation system according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating point set matching in the identification method of the surgical navigation system according to an embodiment of the present invention.
Reference numerals:
100. visual marker substrate; 110. visual feature point; 120. groove; 130. magnet; 140. light reflecting member; 150. countersunk hole; 200. visual navigator; 300. surgical instrument.
Detailed Description
The embodiments of the present invention will be described in further detail with reference to the drawings and examples. The following examples are intended to illustrate the invention but are not intended to limit the scope of the invention.
In the description of the embodiments of the present invention, it should be noted that the terms "center", "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience in describing the embodiments of the present invention and simplifying the description, but do not indicate or imply that the referred devices or elements must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the embodiments of the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the embodiments of the present invention, it should be noted that, unless explicitly stated or limited otherwise, the terms "mounted," "connected," and "coupled" are to be interpreted broadly; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical or an electrical connection; and it may be a direct connection or an indirect connection through an intermediary. The specific meanings of the above terms in the embodiments of the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In embodiments of the invention, unless expressly stated or limited otherwise, the first feature "on" or "under" the second feature may be directly contacting the first and second features or indirectly contacting the first and second features through intervening media. Also, a first feature "on," "over," and "above" a second feature may be directly or diagonally above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature may be directly under or obliquely under the first feature, or may simply mean that the first feature is at a lesser elevation than the second feature.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of an embodiment of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
As shown in fig. 1 to 3, an embodiment of the present invention provides a surgical navigation system, including:
a visual marker substrate 100, wherein visual feature points 110 are arranged on the outer surface of the visual marker substrate 100 along the axial direction and the circumferential direction of the visual marker substrate 100;
a visual navigator 200, the visual navigator 200 being provided with an error control coding module, the visual marker substrate 100 being electrically connected to the error control coding module. It is understood that the visual marker substrate 100 is cylindrical and the visual feature points 110 are provided on its outer side wall. To allow omnidirectional observation of the part of the human body to be detected, the visual feature points 110 are arranged along both the axial direction and the circumferential direction of the visual marker substrate 100. It should be noted that the spacing between two adjacent visual feature points 110 can be chosen according to actual surgical needs; the points may be placed at equal intervals or at other preset intervals.
Further, the visual navigator 200 is disposed opposite to the visual marker substrate 100, and the detection visual field of the visual navigator 200 covers the visual marker substrate 100, so as to detect the visual marker, and further indirectly acquire the position relative to the human body.
The visual navigator 200 is provided with an error control coding module, and the visual navigator 200 is provided with a visual coordinate system, so as to further acquire spatial coordinate point information of the visual feature point 110. The error control coding module carries out error correction coding processing on the acquired spatial coordinate point information, improves the detection accuracy, ensures the accuracy of the image displayed by the visual navigator 200, and ensures the smooth completion of the operation.
According to one embodiment of the invention, the outer side of the visual marker substrate 100 is provided with grooves 120, and the visual feature points 110 are removably disposed in the grooves 120. It is understood that the grooves 120 on the outer side wall of the visual marker substrate 100 are equally spaced in the axial and circumferential directions of the visual marker substrate 100. Each visual feature point 110 is detachably seated in a groove 120, so that the visual feature points 110 can be removed and replaced and the detection accuracy is maintained. It should be noted that the visual feature points 110 can be installed selectively: depending on the actual detection requirement they may occupy consecutive grooves 120 or be spaced apart, and not every groove 120 needs to hold a visual feature point 110.
According to an embodiment of the present invention, a magnet 130 is disposed in the groove 120, and the mounting end of the visual feature point 110 is held against the magnet 130 by magnetic attraction. It can be understood that, with the magnet 130 arranged in the groove 120 and the visual feature point 110 made of a ferrous metal material, the mounting end of the visual feature point 110 is attracted to and held by the magnet 130. Repeated use of the visual marker substrate 100 may damage the reflective coating on a visual feature point 110 and degrade the visual identification precision and effect, so the visual feature point 110 then needs to be replaced. The tight magnetic attraction keeps the visual feature point 110 stably mounted during detection; when replacement is needed, the visual feature point 110 only has to be pulled away from the magnet 130 by an external force and removed, which is convenient to operate.
According to one embodiment of the present invention, the detection end of the visual feature point 110 is provided with a light reflecting member 140. It is understood that the light reflecting member 140 on the detection end of the visual feature point 110 cooperates with the visual navigator 200 for detection. The light reflecting member 140 is preferably a reflective sheet or a reflective ball. It should be noted that a reflective coating may also be applied to the detection end of the visual feature point 110. The center of the light reflecting member 140 is taken as the coordinate position of the visual feature point 110.
According to an embodiment of the present invention, the detection end of the visual feature point 110 is provided with a countersunk hole 150, and the light reflecting member 140 is disposed in the countersunk hole 150. It can be understood that the countersunk hole 150 in the detection end of the visual feature point 110 is used for installing the light reflecting member 140, so that the light reflecting member 140 is held stably in the countersunk hole 150, ensuring stable mounting during detection.
According to one embodiment of the present invention, a surgical instrument 300 is also included, the surgical instrument 300 being coupled to the visual indicia substrate 100. It is understood that the surgical device 300 can be connected to the visual marker base 100 by a mechanical connection, such as a screw connection, a snap connection, a set screw connection or a welding connection, and the spatial position and posture of the surgical device 300 can be indirectly obtained by obtaining the spatial position of the visual marker base 100 through the visual navigator 200.
According to one embodiment of the invention, the visual navigator 200 also includes a display screen that is electrically connected to the error control coding module. It can be understood that the display screen is used for receiving the information processed by the error control coding module and displaying the information in the form of images for medical staff to view.
According to a second aspect of the invention, the surgical navigation system identification method comprises the following steps:
arranging the visual marker substrate 100 at a position of the human body to be detected;
collecting space coordinate point information of the visual feature point 110 in a visual coordinate system through the visual navigator 200;
the error correction coding processing is carried out on the collected information of the visual navigator 200 through an error control coding module;
the display error control coding module of the visual navigator 200 processes the calculated spatial position and posture of the visual marker substrate 100 in the visual coordinate system.
According to an embodiment of the present invention, in the step of performing error-correction coding processing on the information acquired by the visual navigator 200 through the error control coding module, the error control coding module screens all the coordinate point information and groups three or more collinear coordinate points into an information code group.
The surgical navigation system identification method provided by the embodiment of the invention comprises the following specific steps:
a plurality of visual characteristic points are equidistantly distributed along the axial direction of the cylindrical surface of the visual mark substrate, and a row of the characteristic points can form an information code group. Along the cylinderThe circumference of the surface is distributed with a plurality of rows of information code groups at equal intervals. Code element a in information code group i1 or 0, ai1 represents that the visual navigator 200 recognizes the feature point, ai0 represents that the visual navigator 200 fails to recognize the visual feature point for some reason (e.g., view obstruction).
To handle such errors, error control coding, also called error-correction coding, is used: redundant code elements, i.e. supervisory codes, are added to the original information code sequence, so that the original code word is changed according to a certain rule into a code word with a certain redundancy whose code elements satisfy fixed relations. Different coding methods have different error-detection and error-correction capabilities; the more supervisory code elements are added, the stronger the error-correction capability.
The present invention uses a cyclic code to detect random or burst errors in visual recognition. A cyclic code has strong error-detection (and error-correction) capability and is cyclic: the code group obtained by cyclically shifting any code group (cyclic left shift or right shift) is still a code group of the code.
The information codes are divided into groups, and several supervisory codes are then appended to each group of information codes; such codes are called block codes and are denoted by the symbol (n, k), where k is the number of information bits, n is the total number of bits in a code group, and r = n - k is the number of supervisory bits. The embodiment of the invention uses the (7, 4) cyclic code, i.e. each code group is represented by a 7-bit binary number of which 4 bits are information bits and the remaining 3 bits are supervisory bits, so there are 2^4 = 16 different code groups in total.
According to the relevant theory in communications, let the cyclic code generator polynomial be

g(x) = x^3 + x^2 + 1

The corresponding generator matrix, with rows x^3·g(x), x^2·g(x), x·g(x) and g(x), is

G = [ 1 1 0 1 0 0 0 ]
    [ 0 1 1 0 1 0 0 ]
    [ 0 0 1 1 0 1 0 ]
    [ 0 0 0 1 1 0 1 ]

By elementary row transformations, G can be converted into the typical (systematic) matrix:

G = [ 1 0 0 0 1 1 0 ]
    [ 0 1 0 0 0 1 1 ]
    [ 0 0 1 0 1 1 1 ]
    [ 0 0 0 1 1 0 1 ]
the generator matrix is obtained and the whole code group is generated by it, i.e. [ a ]6a5a4a3a2a1a0]=[a6a5a4a3]G, in the resulting code group a6a5a4a3Is an information bit, a2a1a0An additional 3-bit parity bit. From this equation, 16 different code groups can be calculated as shown in the following table:
(7, 4) cyclic code table

Information bits a6 a5 a4 a3 | Code group a6 a5 a4 a3 a2 a1 a0
0000 | 0000000
0001 | 0001101
0010 | 0010111
0011 | 0011010
0100 | 0100011
0101 | 0101110
0110 | 0110100
0111 | 0111001
1000 | 1000110
1001 | 1001011
1010 | 1010001
1011 | 1011100
1100 | 1100101
1101 | 1101000
1110 | 1110010
1111 | 1111111
It can be seen that, for any of the 16 code groups in the table, cyclically shifting it to the right or to the left again yields a code group in the set. The minimum code distance d0 of this block code, i.e. the smallest number of bit positions in which any two code groups differ, is 3: used for error detection it can detect up to 2 erroneous bits, and used for error correction it can correct 1 erroneous bit.
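As an illustration (not part of the original disclosure), the following Python sketch reproduces the 16 code groups of the table from the systematic generator matrix given above and checks that the minimum code distance is 3; the variable and function names are illustrative assumptions.

```python
# Illustrative sketch (not from the patent): reproduce the (7, 4) cyclic code
# table from the systematic generator matrix G derived from g(x) = x^3 + x^2 + 1.
from itertools import product

# Systematic generator matrix G = [I4 | P] over GF(2), as given above.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 0, 1],
]

def encode(info_bits):
    """Encode the 4 information bits [a6 a5 a4 a3] into a 7-bit code group."""
    return [sum(a * g for a, g in zip(info_bits, column)) % 2
            for column in zip(*G)]

code_groups = [encode(list(bits)) for bits in product([0, 1], repeat=4)]
for group in code_groups:
    print("".join(map(str, group)))

# Minimum code distance d0 over all pairs of distinct code groups.
d0 = min(sum(x != y for x, y in zip(u, v))
         for i, u in enumerate(code_groups) for v in code_groups[i + 1:])
print("minimum code distance d0 =", d0)  # 3: detects 2 errors, corrects 1
```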
From the code groups in the table above, the code words whose lowest-order bit is 1 (excluding the all-ones code group) are selected, and these 7 code words are arranged and distributed on the outer contour surface of the cylindrical visual marker substrate. The selected code groups are as follows:
S1=0001101
S2=0010111
S3=0100011
S4=0111001
S5=1001011
S6=1010001
S7=1100101
theoretically, the visual feature points on the side of the visual marker substrate facing the visual navigator 200 can be detected. The code that can be detected continuously is related to its distribution on the cylindrical surface of the visual marking substrate. Here, 15 cycles of equidistant distribution on the cylindrical surface of the visual marking matrix are setRing code, facing the visual navigator 200 side is S1、S2、S3、S4、S5
When arranging the cyclic codes on the cylindrical surface, it must be ensured that no sequence of at least 3 consecutive code groups repeats; otherwise positioning ambiguity may occur.
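As an illustration of this constraint (not part of the original disclosure), the sketch below checks an assumed example arrangement of the selected code groups around the cylinder for repeated windows of 3 consecutive groups; the arrangement used here is hypothetical, not the layout fixed by the patent.

```python
# Illustrative sketch (assumed example layout, not the patent's): verify that no
# window of 3 consecutive code groups repeats around the cylinder, so that any
# 3 adjacent rows seen by the navigator localize the marker unambiguously.
S = ["0001101", "0010111", "0100011", "0111001", "1001011", "1010001", "1100101"]

# Hypothetical circular arrangement of rows on the cylindrical surface; with
# repeated code groups (e.g. 15 rows drawn from 7 codes) this check is non-trivial.
arrangement = [S[0], S[1], S[2], S[3], S[4], S[5], S[6]]

def windows_unique(rows, k=3):
    """True if every k consecutive rows (taken circularly) form a unique tuple."""
    n = len(rows)
    seen = set()
    for i in range(n):
        window = tuple(rows[(i + j) % n] for j in range(k))
        if window in seen:
            return False
        seen.add(window)
    return True

print(windows_unique(arrangement))  # True for this arrangement
```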
The position and orientation of the visual marker substrate in the visual coordinate system V of the visual navigator 200 (shown in FIG. 1) can be indirectly estimated from the detected cyclic code.
The visual marker substrate may be fabricated by 3D printing or metal working, and the position of each visual feature point on the outer surface thereof is known relative to a visual marker substrate coordinate system (e.g., the coordinate system { O } located at the center of the bottom surface of the cylindrical surface in fig. 2).
Since the visual navigator 200 detects only a series of spatial coordinate points, in order to obtain specific code group information from these coordinate points, the following steps are performed in a loop:
(1) Traverse all detected coordinate points and record every set of three or more collinear points as a group {A_i};
(2) Because of the visual feature point distribution, the collinearity discrimination threshold, the sensor detection accuracy and similar issues, three or more points on the arc surface may happen to be collinear. To avoid forming wrong groups, a check is required: the spacing between the feature points of each cyclic code group is L (as shown in FIG. 2), i.e. the point-to-point distances within a group must be integer multiples of L. Groups that do not satisfy this are discarded, leaving the groups {B_i};
(3) Convert each group into a binary number according to the distribution of its points, obtaining {C_i}. In order to convert the cyclic codes into the corresponding binary code groups from the spatial coordinate points alone, the code groups whose lowest-order bit is 1 were selected in the table above: every position in the rightmost circle of the cylindrical surface of the visual marker substrate in FIG. 2 therefore carries a light reflecting member. During detection, the X-axis coordinate values in the visual coordinate system can thus be compared, and the coordinate point with the smallest X component marks the boundary. With the boundary and the feature point spacing known, the binary code group sequence corresponding to each set of collinear coordinate points can be obtained. If the orientation of the visual marker substrate in FIG. 1 is reversed, for example so that the tip of the surgical instrument 300 points to the right (i.e. in the negative X direction of the visual navigator 200), the traversal should be performed in the opposite direction, i.e. the coordinate point with the largest X component is used to determine the boundary, before the code group sequence conversion;
(4) If the code group obtained in the previous step is not in {S_i} (for example because some feature points were not detected owing to visual occlusion), error correction is needed. For example, if the cyclic code sequence S4 is detected completely accurately it should be 0111001; if the visual feature point corresponding to its third bit is occluded, the recognized sequence becomes 0101001, which is not in {S_i}. Since 3 supervisory bits have been added to the code group, there are 3 supervisory relations. A supervision matrix H is constructed from the generator matrix G (a brief illustrative sketch of this check follows step (5) below):
H = [ 1 0 1 1 1 0 0 ]
    [ 1 1 1 0 0 1 0 ]
    [ 0 1 1 1 0 0 1 ]
For the erroneously identified sequence B = [0101001], its syndrome is S = B·H^T = [0101001]·H^T = [111]. The syndrome components are not all 0, i.e. the supervisory relations are not satisfied, which indicates an error. Since S = [111] is the transpose of the 3rd column of the H matrix, the erroneous bit is a4 and the error pattern is E = [0010000]; the error-corrected code group is therefore:
A=B+E=[0101001]+[0010000]=[0111001]
(5) After the error correction of the previous step, 3 to 5 consecutive groups of cyclic codes are selected, which uniquely determines the spatial position and posture of the cylindrical visual marker substrate (coordinate system {O}) in the visual coordinate system {V} of the navigator, and provides the basis for positioning and navigating the surgical instrument 300.
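The following is a minimal illustrative sketch of the single-error correction described in step (4) (not part of the original disclosure; names are assumptions): it computes the syndrome with the supervision matrix H above, locates the erroneous bit as the matching column of H, and flips it, reproducing the 0101001 to 0111001 example.

```python
# Illustrative sketch (not from the patent): single-error correction for the
# (7, 4) cyclic code using the supervision matrix H given in step (4).
H = [
    [1, 0, 1, 1, 1, 0, 0],
    [1, 1, 1, 0, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def correct(received):
    """Return the corrected 7-bit code group, flipping at most one bit."""
    bits = [int(b) for b in received]
    syndrome = [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]
    if any(syndrome):
        # The erroneous position is the column of H that equals the syndrome.
        for j in range(7):
            if [row[j] for row in H] == syndrome:
                bits[j] ^= 1
                break
    return "".join(map(str, bits))

print(correct("0101001"))  # -> "0111001", i.e. the code group S4 is recovered
```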
The calculation principle is as follows:
The coordinate points in the visual coordinate system {V} corresponding to several groups of error-corrected cyclic codes form a spatial point set X; the coordinates of the corresponding feature points in the cylinder coordinate system {O}, which are in one-to-one correspondence with the points of X, are known and form another point set P. For point sets with a known correspondence, the optimal transformation can be computed by singular value decomposition (SVD) to obtain the spatial transformation between the two point sets, from which the spatial position and posture of the cylinder in the visual coordinate system are calculated.
As shown in FIG. 4, the point set P = {p_1, p_2, ..., p_n} is registered to X = {x_1, x_2, ..., x_n}; that is, a rotation matrix R and a translation vector t are solved so that the error after transformation is minimized, i.e. R and t are found that minimize the registration error E(R, t):

E(R, t) = (1/n) · Σ_{i=1..n} || x_i - (R·p_i + t) ||^2
According to the relevant theory, the centroids of the two point sets are computed first:

μ_x = (1/n) · Σ_{i=1..n} x_i  and  μ_p = (1/n) · Σ_{i=1..n} p_i
Then the corresponding centroid is subtracted from each point set, moving both point sets to the origin and giving two new point sets X' = {x_i - μ_x} = {x'_i} and P' = {p_i - μ_p} = {p'_i}.
Next, the matrix W is constructed:

W = Σ_{i=1..n} x'_i · p'_i^T
The rotation matrix R is solved by performing a singular value decomposition of W. If

W = U·Σ·V^T

then R = U·V^T. After the rotation matrix R has been calculated, the translation vector can be solved as t = μ_x - R·μ_p.
After the spatial transformation between the two corresponding point sets has been calculated, the visual marker substrate coordinate system {O} can be transformed into the visual coordinate system {V} of the navigator, i.e. the position and posture of the visual marker substrate relative to the navigator are known.
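The following is a minimal NumPy sketch of this SVD-based registration (not part of the original disclosure; names are illustrative). It follows the derivation above, with an added determinant guard against reflections, which the text does not mention but which is standard practice:

```python
# Illustrative sketch (not from the patent): SVD-based rigid registration of the
# known marker points P (marker frame {O}) to the detected points X (frame {V}).
import numpy as np

def register(P, X):
    """Solve R, t minimizing sum_i ||x_i - (R p_i + t)||^2 for matched points."""
    P, X = np.asarray(P, float), np.asarray(X, float)
    mu_p, mu_x = P.mean(axis=0), X.mean(axis=0)        # centroids
    W = (X - mu_x).T @ (P - mu_p)                      # W = sum x'_i p'_i^T
    U, _, Vt = np.linalg.svd(W)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])     # guard against reflection
    R = U @ D @ Vt
    t = mu_x - R @ mu_p
    return R, t

# Toy check: a known rotation and translation are recovered from 4 marker points.
theta = 0.5
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 0.3])
P = np.random.default_rng(0).random((4, 3))
X = P @ R_true.T + t_true
R, t = register(P, X)
print(np.allclose(R @ P.T + t[:, None], X.T))  # True
```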
According to an embodiment of the present invention, in the step of performing error-correction coding processing on the information acquired by the visual navigator 200 through the error control coding module, a supervisory code is added to each information code group by the error control coding module, and the error-correction coding processing is carried out.
One or more technical solutions in the embodiments of the present invention have at least one of the following technical effects:
the visual characteristic points are arranged along the axial direction and the circumferential direction of the visual mark matrix, the visual navigator detects the space coordinates of the visual characteristic points in the visual field range of the visual navigator in real time, and the error control coding module carries out error correction coding processing on the detected space coordinates, so that the accurate space position and posture of the visual mark matrix are identified, and the detection precision and reliability are improved.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
The above embodiments are merely illustrative of the present invention and are not to be construed as limiting the invention. Although the present invention has been described in detail with reference to the embodiments, it should be understood by those skilled in the art that various combinations, modifications or equivalents may be made to the technical solution of the present invention without departing from the spirit and scope of the technical solution of the present invention, and the technical solution of the present invention is covered by the claims of the present invention.

Claims (10)

1. A surgical navigation system, comprising:
a visual marker substrate, wherein visual feature points are arranged on the outer surface of the visual marker substrate along the axial direction and the circumferential direction of the visual marker substrate; and
a visual navigator provided with an error control coding module, wherein the visual marker substrate is electrically connected to the error control coding module.
2. The surgical navigation system of claim 1, wherein an outer side of the visual marker substrate is provided with a groove, and the visual feature point is detachably arranged in the groove.
3. The surgical navigation system of claim 2, wherein a magnet is disposed in the groove, and a mounting end of the visual feature point is held against the magnet by magnetic attraction.
4. The surgical navigation system of claim 3, wherein the detection end of the visual feature point is provided with a light reflector.
5. The surgical navigation system of claim 4, wherein the detection end of the visual feature point is provided with a countersunk hole, and the reflector is disposed in the countersunk hole.
6. The surgical navigation system of claim 1, further comprising a surgical instrument coupled to the visual marker substrate.
7. The surgical navigation system of claim 1, wherein the visual navigator further includes a display screen electrically connected to the error control encoding module.
8. An identification method of a surgical navigation system according to any one of claims 1 to 7, comprising the steps of:
arranging the visual marker substrate at a position of the human body to be detected;
acquiring, through the visual navigator, spatial coordinate point information of the visual feature points in a visual coordinate system;
performing error-correction coding processing on the information acquired by the visual navigator through the error control coding module; and
displaying, through the visual navigator, the spatial position and posture of the visual marker substrate in the visual coordinate system as calculated by the error control coding module.
9. The surgical navigation system identification method of claim 8, wherein, in the step of performing error-correction coding processing on the information acquired by the visual navigator through the error control coding module, the error control coding module screens all the coordinate point information and groups three or more collinear coordinate points into an information code group.
10. The surgical navigation system identification method of claim 9, wherein in the step of performing error correction coding processing on the collected information of the visual navigator by the error control coding module, a supervisory code is added to an information code group by the error control coding module, and the error correction coding processing is performed.
CN202010859848.5A 2020-08-24 2020-08-24 Operation navigation system and identification method thereof Pending CN112043381A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010859848.5A CN112043381A (en) 2020-08-24 2020-08-24 Operation navigation system and identification method thereof

Publications (1)

Publication Number Publication Date
CN112043381A 2020-12-08

Family

ID=73600097

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010859848.5A Pending CN112043381A (en) 2020-08-24 2020-08-24 Operation navigation system and identification method thereof

Country Status (1)

Country Link
CN (1) CN112043381A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102341055A (en) * 2008-12-31 2012-02-01 直观外科手术操作公司 Fiducial marker design and detection for locating surgical instrument in images
WO2013115640A1 (en) * 2012-01-31 2013-08-08 Umc Utrecht Holding B.V. Tracking of an endoscopic device
CN107072740A (en) * 2014-11-21 2017-08-18 思外科有限公司 The visible light communication system of data is transmitted between Visual Tracking System and tracking recognizer
CN107874832A (en) * 2017-11-22 2018-04-06 合肥美亚光电技术股份有限公司 Bone surgery set navigation system and method
CN109717956A (en) * 2019-01-16 2019-05-07 上海长海医院 Laser orientation instru-ment, operation guiding system and application method based on C arm X-ray machine
CN212326568U (en) * 2020-08-24 2021-01-12 雅客智慧(北京)科技有限公司 Surgical navigation system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zheng Qiumei (ed.); Liu Xinping, Sun Xiaoyan, Song Huiying (assoc. eds.): "Principles of Computer Organization" (计算机组成原理), China University of Petroleum Press, 31 July 2012, pages 64-69 *

Similar Documents

Publication Publication Date Title
JP4294025B2 (en) Method for generating interface surface and method for reading encoded data
KR101820682B1 (en) Marker for optical tracking, optical tracking system, and optical tracking method
US8086026B2 (en) Method and system for the determination of object positions in a volume
US6765195B1 (en) Method and apparatus for two-dimensional absolute optical encoding
US9117103B2 (en) Image information processing apparatus and method for controlling the same
JP2017053856A (en) Test pattern and method for calibrating x-ray imaging device
US9305354B2 (en) Apparatus and method for mapping a three-dimensional space in medical applications for diagnostic, surgical or interventional medicine purposes
BR0014449B1 (en) method for providing a position code on a surface, method for determining a position, device for determining position and product which makes it possible to determine a position.
CN212326568U (en) Surgical navigation system
JP3881696B2 (en) X-ray geometry calibration
EP3265009A1 (en) Redundant reciprocal tracking system
US20060226244A1 (en) Redundant two-dimensional code and a decoding method
KR100703698B1 (en) Apparatus and method for recognizing spatial writing and recording medium for recording the method
JP2005066345A (en) Multiple configuration array for surgical navigation system
JP2008181501A (en) Barcode pattern
Gadwe et al. Real-time 6dof pose estimation of endoscopic instruments using printable markers
US20170068879A1 (en) Absolute surface coding / encoding an area in absolute terms
WO2014058390A1 (en) Optical measurement system, method and scaleplate therefor
US11195303B2 (en) Systems and methods for characterizing object pose detection and measurement systems
CN113269193A (en) Pointer type meter reading method, device and storage medium
CN112043381A (en) Operation navigation system and identification method thereof
CN116452499A (en) Lumbar vertebra instability and slipping diagnosis system based on Unet network
Zhu et al. HydraMarker: Efficient, flexible, and multifold marker field generation
KR101199764B1 (en) Relative coordinate extraction device and medical images system using the same
CN110490941B (en) Telecentric lens external parameter calibration method based on normal vector

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination