CN116077184A - Positioning device and method of an AR surgical navigation system - Google Patents

Positioning device and method of an AR surgical navigation system

Info

Publication number
CN116077184A
CN116077184A
Authority
CN
China
Prior art keywords
calibration object
black
camera
pose
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310072535.9A
Other languages
Chinese (zh)
Inventor
施旭
朱明
王建华
陆春健
吴学海
林瓊
叶浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xvisio Technology Corp
Original Assignee
Xvisio Technology Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xvisio Technology Corp filed Critical Xvisio Technology Corp
Priority to CN202310072535.9A
Publication of CN116077184A
Legal status: Pending


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 - Methods for optical code recognition
    • G06K7/1408 - Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/1417 - 2D bar codes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 - Methods for optical code recognition
    • G06K7/1439 - Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443 - Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 - Methods for optical code recognition
    • G06K7/1439 - Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1452 - Methods for optical code recognition including a method step for retrieval of the optical code detecting bar code edges
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 - Methods for optical code recognition
    • G06K7/146 - Methods for optical code recognition the method including quality enhancement steps
    • G06K7/1482 - Methods for optical code recognition the method including quality enhancement steps using fuzzy logic or natural solvers, such as neural networks, genetic algorithms and simulated annealing
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/003 - Navigation within 3D models or images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality


Abstract

The invention relates to the technical field of AR surgical navigation, and in particular to a positioning device and positioning method of an AR surgical navigation system, comprising the following steps: obtaining a calibration object of a specific size and shape bearing a coded picture label that can be identified and tracked; capturing the coded picture in different pose states of the calibration object; computing the current pose of the calibration object and the mutual spatial relationships among multiple calibration objects; and graphically displaying the current pose of the calibration object and the mutual spatial relationships. The coded picture comprises a black-and-white coding pattern with a specific code value and a black-and-white grid pattern with a specific size and arrangement. Identification of a calibration object bearing the specially designed black-and-white coded picture is simple, accurate, and reliable; the calibration object adopts a cylindrical structure, which effectively avoids the problem of the camera failing to clearly capture and analyze the calibration object after its angle and position change; and the gray decoding threshold is corrected using the black-and-white grid, making decoding more accurate, stable, and reliable.

Description

Positioning device and method of an AR surgical navigation system
Technical Field
The invention relates to the technical field of AR surgical navigation, and in particular to a positioning device and method of an AR surgical navigation system.
Background
Traditional methods use a planar calibration picture: a checkerboard black-and-white coded picture is placed on a plane, and positioning information is obtained by capturing the picture with a camera and computing its changes. Alternatively, a device that reflects or actively emits infrared light is bound to the surgical device to be calibrated, and the position is obtained by capturing and computing the transformation of multiple points with a camera.
However, when the pose changes, the planar calibration picture cannot provide the camera with enough identifiable feature points, so a three-dimensional calibration method and calibration objects that provide enough feature points in different poses are needed. Calibration objects that actively emit infrared light require a built-in battery, which increases the weight and cost of the calibration object and adds work during use, disinfection, and maintenance; moreover, any other reflective points in the workspace easily interfere with recognition, causing tracking and positioning errors.
Disclosure of Invention
To solve the problems described in the background, the invention adopts the following technical solution:
the first aspect of the present invention provides a positioning method of an AR surgical navigation system, including:
obtaining a calibration object with a specific size, a specific shape and a coded picture label which can be identified and tracked;
grabbing coded pictures under different pose states of the calibration object;
calculating the current pose of the calibration object and the mutual spatial relationship among a plurality of calibration objects;
graphically displaying the current pose of the calibration object and the mutual spatial relationship;
the coded picture includes:
a black and white coding pattern having a specific code value and a black and white grid pattern having a specific size and arrangement.
Preferably, after obtaining the calibration object of a specific size and shape bearing a coded picture that can be identified and tracked, the method further comprises:
mounting the calibration object on the surgical site and the surgical instruments.
Preferably, the method further comprises a step of correcting the pose of the calibration object:
correcting the pose of the calibration object according to changes in the code value of the black-and-white coding pattern;
filtering out interference in the environment according to the black-and-white grid pattern;
and obtaining the corrected pose.
Preferably, after capturing the coded pictures in different pose states of the calibration object, the method further comprises:
converting the captured image to grayscale to obtain a grayscale image;
calculating the gradient intensity and direction at each pixel of the grayscale image;
clustering the gradient intensities and directions.
Preferably, the weight of the edge between two adjacent pixels on the grayscale image is the difference of their gradient directions; all edges are sorted in increasing order of weight, and edges are merged according to their weights. The clustering formula (1) is as follows:
D(n∪m) ≤ min(D(n), D(m)) + K_D/|n∪m|
M(n∪m) ≤ min(M(n), M(m)) + K_M/|n∪m|
where n and m are two given edge sets, D(n) is the difference between the maximum and minimum gradient directions within a set, and M(n) is the difference between the maximum and minimum gradient intensities. D(n) must lie between 0 and 2π, so it is treated modulo 2π; when n and m satisfy formula (1), the two edge sets are merged into one.
Preferably, the method comprises:
after the clustering of all pixels in the grayscale image is completed, fitting a line equation using weighted least squares;
obtaining the coordinates of the two endpoints of the directed line segment from the fitted line, denoted p1 and p2;
determining the direction of the directed line segment p1p2 according to brightness;
computing black-white edges in the grayscale image to find the contours that make up the label image;
identifying a quadrilateral structure in the captured coded picture from the corner points of the lines;
and obtaining the current pose of the calibration object through matrix transformation, and decoding according to the decoding rule.
Preferably, a homography transformation is applied to the confirmed quadrilateral structure so that the current pose of the calibration object can be decoded and solved correctly. Formula (2) is as follows:

s · [u, v, 1]^T = H · [x_w, y_w, 1]^T

where s represents depth information, [u, v, 1]^T is a point in the label image coordinate system, and [x_w, y_w, 1]^T is a point in the world coordinate system with z_w = 0.

Expanding and rearranging gives formula (3):

u · (h31·x_w + h32·y_w + h33) = h11·x_w + h12·y_w + h13
v · (h31·x_w + h32·y_w + h33) = h21·x_w + h22·y_w + h23
forming four point pairs from the four corner points extracted in the label image coordinate system and their assumed corresponding four points in the world coordinate system;
defining the dot-lattice coordinates within the determined quadrilateral;
extracting the average value A1 of the pixels on the outermost ring of the lattice in the grayscale image, and the average value A2 of the pixels on the second outermost ring;
taking the threshold as the average of A1 and A2, which separates A1 and A2, then traversing the pixel values at all lattice coordinates: values above the threshold are encoded as 0 and values below it as 1;
decoding the whole lattice to obtain a binary string;
and matching the string to the corresponding code ID by table lookup.
Preferably, the pose of the calibration object's tag is calculated with formula (4):

H = s · P · E

where H is the homography matrix, s is a scale factor, P is the camera projection matrix, E is the extrinsic matrix, and f_x and f_y are the camera intrinsics. Expanding formula (4):

[h11 h12 h13]       [f_x  0   0] [r11 r12 t1]
[h21 h22 h23] = s · [ 0  f_y  0] [r21 r22 t2]
[h31 h32 h33]       [ 0   0   1] [r31 r32 t3]

from which the extrinsic matrix of the camera is calculated.
Preferably, the method further comprises:
obtaining the position of the camera in the world coordinate system, formula (5):

P_camera = -R^T · T

and obtaining the position of the tag in the camera coordinate system, formula (6):

P_tag = R · [0, 0, 0]^T + T = T

Both positions are computed from the rotation matrix R and translation matrix T from the world coordinate system to the camera coordinate system, completing the positioning.
A second aspect of the present invention proposes a positioning device of an AR surgical navigation system, comprising:
a calibration object module for obtaining a calibration object of a specific size and shape bearing a coded picture label that can be identified and tracked;
a camera module for capturing coded pictures in different pose states of the calibration object;
a computing module for computing the current pose of the calibration object and the mutual spatial relationships among multiple calibration objects;
and a display module for graphically displaying the current pose of the calibration object and the mutual spatial relationships.
The technical solution of the invention has the following beneficial technical effects:
identification of a calibration object bearing the specially designed black-and-white coded picture is simple, accurate, and reliable; the calibration object adopts a cylindrical structure, which effectively avoids the problem of the camera failing to clearly capture and analyze the calibration object after its angle and position change; and the gray decoding threshold is corrected using the black-and-white grid, making decoding more accurate, stable, and reliable.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a diagram of a label pattern in the present invention;
FIG. 3 is a schematic view of a quadrilateral structure according to the present invention.
Detailed Description
To make the above features, objects, and advantages of the invention clearer, the invention is described in more detail below with reference to specific embodiments illustrated in the accompanying drawings. Based on these embodiments, all other examples obtained by those skilled in the art without inventive effort fall within the scope of the invention. Descriptions of well-known structures and techniques are omitted so as not to unnecessarily obscure the invention.
As shown in FIG. 1, a positioning method of an AR surgical navigation system provided in the first aspect of the present invention includes:
Step S01, obtaining a calibration object of a specific size and shape bearing a coded picture label that can be identified and tracked;
fabricating a cylinder and manufacturing a label:
the label carries a black-and-white coding pattern with a specific code value and a black-and-white grid pattern with a specific size and arrangement;
the label is wrapped around the outer surface of the cylinder to form a single body, constituting the calibration object.
Each surgical device and surgical site (operated site) in the procedure is fitted with at least one marker.
Step S02, capturing coded pictures in different pose states of the calibration object;
Step S03, computing the current pose of the calibration object and the mutual spatial relationships among multiple calibration objects;
Step S04, graphically displaying the current pose of the calibration object and the mutual spatial relationships.
Specifically: calibrate the camera, correcting its intrinsic and extrinsic parameters and distortion; prepare a calibration object of the specified size and shape bearing the specified identification pattern, which in this embodiment is a black-and-white coding pattern with a specific code value together with a black-and-white grid pattern; mount the calibration object on the surgical tool and register the tool; identify and track the calibration object in the field of view with the camera; correct the pose of the calibration object according to changes in the code value of the black-and-white coding pattern; filter out interference in the environment according to the black-and-white grid pattern; and obtain the corrected pose and display it graphically on a display device.
As an example of the present invention, calibration objects bearing the special calibration codes are mounted on the patient's surgical site and on the surgical instruments, and a camera scans and identifies the preset calibration objects to compute and determine the position, distance, and posture of the corresponding objects. Rigidly attached to each object to be tracked is a marker of specific size whose code value carries a specific meaning. For example, assume a marker A for binding a surgical tool is a cylinder with a radius of 20 mm, with the identification pattern on its label surface shown in FIG. 2.
The large squares in the central part have a side length of 1.0053 cm, and each square has its own corresponding ID: 200, 201, 202.
Assume another marker B, also a cylinder with a radius of 20 mm, is attached to the surgical site; its surface carries an identification pattern with IDs 250, 251, 252. After the camera acquires a picture containing a calibration object, the input image is converted to grayscale, the gradient intensity and direction are computed at every pixel of the image, and the gradients are then clustered.
The specific steps are as follows (a sketch follows the list):
converting the captured image to grayscale to obtain a grayscale image;
calculating the gradient intensity and direction at each pixel of the grayscale image;
clustering the gradient intensities and directions.
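For illustration, a minimal Python sketch of this grayscale-and-gradient step, assuming OpenCV and NumPy are available (the function name and Sobel kernel size are choices of this sketch, not taken from the patent):

    import cv2
    import numpy as np

    def gradients(image_bgr):
        """Grayscale conversion plus per-pixel gradient intensity and direction."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)  # horizontal derivative
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)  # vertical derivative
        magnitude = np.hypot(gx, gy)    # gradient intensity at each pixel
        direction = np.arctan2(gy, gx)  # gradient direction in (-pi, pi]
        return gray, magnitude, direction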
It should be noted that the clustering algorithm is graph-based: the nodes of the graph are the pixels of the image, every two adjacent pixels are connected by an edge, and the weight of an edge is the difference of the gradient directions. All edges are sorted in increasing order of weight, and edges are then merged according to their weights.
Specifically, the weight of the edge between two adjacent pixels on the grayscale image is the difference of their gradient directions; all edges are sorted in increasing order of weight, and edges are merged according to their weights. The clustering formula (1) is as follows:
D(n∪m) ≤ min(D(n), D(m)) + K_D/|n∪m|
M(n∪m) ≤ min(M(n), M(m)) + K_M/|n∪m|
where n and m are two given edge sets, D(n) is the difference between the maximum and minimum gradient directions within a set, and M(n) is the difference between the maximum and minimum gradient intensities. D(n) must lie between 0 and 2π, so it is treated modulo 2π; when n and m satisfy formula (1), the two edge sets are merged into one.
Formula (1) above constrains the gradient directions within a merged set to remain close, while K_D and K_M ensure that sets with only a few elements can still be merged.
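For illustration, the merge rule of formula (1) can be sketched as a union-find over edge pixels; the class name, the constants k_d and k_m, and the per-set bookkeeping below are illustrative assumptions:

    class EdgeClusters:
        """Union-find over edge pixels; each set tracks the span of its
        gradient directions (D) and magnitudes (M) so the merge test in
        formula (1) runs in O(1)."""

        def __init__(self, directions, magnitudes, k_d=100.0, k_m=1200.0):
            n = len(directions)
            self.parent = list(range(n))
            self.size = [1] * n
            self.dir_rng = [[d, d] for d in directions]  # per-set [min, max]
            self.mag_rng = [[m, m] for m in magnitudes]
            self.k_d, self.k_m = k_d, k_m

        def find(self, i):
            while self.parent[i] != i:
                self.parent[i] = self.parent[self.parent[i]]  # path halving
                i = self.parent[i]
            return i

        def try_merge(self, a, b):
            ra, rb = self.find(a), self.find(b)
            if ra == rb:
                return False
            d_new = [min(self.dir_rng[ra][0], self.dir_rng[rb][0]),
                     max(self.dir_rng[ra][1], self.dir_rng[rb][1])]
            m_new = [min(self.mag_rng[ra][0], self.mag_rng[rb][0]),
                     max(self.mag_rng[ra][1], self.mag_rng[rb][1])]
            total = self.size[ra] + self.size[rb]
            d_span = lambda r: self.dir_rng[r][1] - self.dir_rng[r][0]
            m_span = lambda r: self.mag_rng[r][1] - self.mag_rng[r][0]
            # formula (1): D(n∪m) <= min(D(n), D(m)) + K_D/|n∪m|, same for M
            if (d_new[1] - d_new[0] <= min(d_span(ra), d_span(rb)) + self.k_d / total
                    and m_new[1] - m_new[0] <= min(m_span(ra), m_span(rb)) + self.k_m / total):
                self.parent[rb] = ra
                self.size[ra] = total
                self.dir_rng[ra], self.mag_rng[ra] = d_new, m_new
                return True
            return False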
The subsequent steps are:
after the clustering of all pixels in the grayscale image is completed, fitting a line equation using weighted least squares (a sketch of this fit follows the list below);
obtaining the coordinates of the two endpoints of the directed line segment from the fitted line, denoted p1 and p2;
determining the direction of the directed line segment p1p2 according to brightness;
the direction of the directed line segment p1p2 is chosen so that the brightness to the left of the segment is lower than the brightness to the right;
computing black-white edges in the grayscale image to find the contours that make up the label image;
identifying a quadrilateral structure in the captured coded picture from the corner points of the lines;
and obtaining the current pose of the calibration object through matrix transformation, and decoding according to the decoding rule.
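For illustration, a weighted least-squares fit of a directed segment to one pixel cluster might be sketched as follows; the helper name, the use of gradient magnitudes as weights, the (x, y) point convention, and the brightness-sampling offset are assumptions of this sketch:

    import numpy as np

    def fit_directed_segment(points, weights, gray):
        """Weighted least-squares line fit through clustered edge pixels,
        returning endpoints (p1, p2) ordered so the darker side lies on
        the left of p1->p2. points are (x, y); gray is indexed [y, x].
        No bounds checking is done here."""
        pts = np.asarray(points, dtype=np.float64)   # shape (N, 2)
        w = np.asarray(weights, dtype=np.float64)    # e.g. gradient magnitudes
        centroid = (w[:, None] * pts).sum(0) / w.sum()
        centered = pts - centroid
        cov = (w[:, None, None] * np.einsum('ni,nj->nij', centered, centered)).sum(0)
        # principal eigenvector of the weighted covariance = line direction
        eigvals, eigvecs = np.linalg.eigh(cov)
        direction = eigvecs[:, np.argmax(eigvals)]
        t = centered @ direction                     # projections onto the line
        p1 = centroid + t.min() * direction
        p2 = centroid + t.max() * direction
        # orient so the left side of p1->p2 is darker than the right side
        normal = np.array([-(p2 - p1)[1], (p2 - p1)[0]])
        normal /= np.linalg.norm(normal) + 1e-9
        mid = (p1 + p2) / 2
        left = gray[tuple(np.round(mid + 2 * normal).astype(int)[::-1])]
        right = gray[tuple(np.round(mid - 2 * normal).astype(int)[::-1])]
        return (p1, p2) if left < right else (p2, p1)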
The detected quadrilateral generally appears under a projective distortion; to decode correctly and solve for the pose, a homography transformation is applied to the confirmed quadrilateral structure so that the current pose of the calibration object can be decoded and solved correctly. Formula (2) is as follows:

s · [u, v, 1]^T = H · [x_w, y_w, 1]^T

where s represents depth information, [u, v, 1]^T is a point in the label image coordinate system, and [x_w, y_w, 1]^T is a point in the world coordinate system with z_w = 0.

Expanding and rearranging gives formula (3):

u · (h31·x_w + h32·y_w + h33) = h11·x_w + h12·y_w + h13
v · (h31·x_w + h32·y_w + h33) = h21·x_w + h22·y_w + h23
through the above, a matrix solving problem of ah=0 is finally generated, and four point pairs are formed according to the four corner points extracted from the label image coordinate system and the assumed four point correspondence in the world coordinate system; the system of linear equations is then solved using direct linear method (DLT).
Referring to FIG. 3, the dot-lattice coordinates are defined within the determined quadrilateral;
extracting the average value A1 of the pixels on the outermost ring of the lattice in the grayscale image, and the average value A2 of the pixels on the second outermost ring; taking the threshold as the average of A1 and A2, which separates A1 and A2, then traversing the pixel values at all lattice coordinates: values above the threshold are encoded as 0 and values below it as 1; decoding the whole lattice yields a binary string; and the string is matched to the corresponding code ID by table lookup.
As can be seen from FIG. 3, by the design of the two-dimensional code library, all points on the outermost ring of the quadrilateral are black, while the second outermost ring contains both black and white points, so under the same illumination there is a clear threshold boundary between A1 and A2.
Because of the cylindrical design, different positions on the cylinder reflect ambient light at different angles, so brightness can vary across an image captured from a given viewpoint. The black-and-white grid pattern is therefore used to help correct the threshold: the gray values of the black pixels of the grid on each generatrix of the cylindrical surface, together with the gray values of all grid points on the same generatrix, are used to correct the thresholds A1 and A2, yielding more accurate coding information.
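For illustration, one plausible form of such a per-generatrix correction is sketched below; the patent does not give the exact correction formula, so the blending weight alpha and the inputs are assumptions:

    import numpy as np

    def corrected_threshold(base_threshold, grid_black, grid_all, alpha=0.5):
        """Blend the global A1/A2 threshold with local grid statistics taken
        along one generatrix (one image column band) of the cylinder.
        grid_black: gray values of the grid's black pixels on that generatrix;
        grid_all: gray values of all grid points on the same generatrix."""
        local = (np.mean(grid_black) + np.mean(grid_all)) / 2.0
        return alpha * base_threshold + (1 - alpha) * local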
Specifically, decoding proceeds from the first row until the whole lattice is decoded; arranging the codes yields a binary string. Each quadrilateral produces one binary string, which represents the code of the two-dimensional code (i.e., the black-and-white coding pattern with a specific code value in the label) in that state; the string is then matched to the corresponding code ID by table lookup, completing the decoding and match-verification process.
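For illustration, the threshold-and-decode procedure might be sketched as follows; the code_table dictionary mapping bit strings to code IDs and the (x, y) point convention are assumptions:

    import numpy as np

    def decode_lattice(gray, lattice_pts, outer_ring, second_ring, code_table):
        """Threshold from ring means A1/A2, then decode the lattice row by row.
        gray is indexed [y, x]; lattice_pts must be ordered row-major."""
        a1 = np.mean([gray[y, x] for x, y in outer_ring])   # outer ring: black
        a2 = np.mean([gray[y, x] for x, y in second_ring])  # second ring: mixed
        threshold = (a1 + a2) / 2.0
        # above the threshold -> 0, below -> 1, as in the description
        bits = ''.join('0' if gray[y, x] > threshold else '1'
                       for x, y in lattice_pts)
        return code_table.get(bits)  # table lookup -> code ID, or None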
Estimating the pose of the tag (the pose of the calibration object) requires the camera intrinsics, the physical dimensions of the tag, the homography matrix, and the extrinsic matrix (rotation matrix + translation matrix).
The camera intrinsics and the tag's physical dimensions can be calibrated in advance, and the homography matrix is obtained through formulas (2) and (3), so only the rotation and translation matrices remain to be found.
Specifically, the pose of the calibration object's label is calculated with formula (4):

H = s · P · E

where H is the homography matrix, s is a scale factor, P is the camera projection matrix, E is the extrinsic matrix, and f_x and f_y are the camera intrinsics. Expanding formula (4):

[h11 h12 h13]       [f_x  0   0] [r11 r12 t1]
[h21 h22 h23] = s · [ 0  f_y  0] [r21 r22 t2]
[h31 h32 h33]       [ 0   0   1] [r31 r32 t3]

from which the extrinsic matrix of the camera is calculated.
In the formula above, the scale factor s represents the point scaling relationship in the projection process, the camera projection matrix P represents the transformation from the three-dimensional world coordinate system to the two-dimensional image coordinate system, and the extrinsic matrix E consists of the rotation matrix R and the translation matrix T.
The entries R_ij and T_k can be determined from the homography entries h_ij and the camera parameters f_x and f_y. Using the constraint that the column vectors of the rotation matrix have unit norm, the magnitude of s can be computed; the sign of s follows from the constraint that the tag must be in front of the camera, which requires T < 0. Since the rotation matrix R is orthogonal, its missing third column can be solved from the condition that its inner products with the first two columns are 0.
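For illustration, a sketch of recovering R and T from the homography under these constraints; the principal point at the image origin is an assumption consistent with the expansion of formula (4) above:

    import numpy as np

    def pose_from_homography(H, fx, fy):
        """Recover rotation R and translation T from H = s * K * [r1 r2 T]
        with K = diag(fx, fy, 1)."""
        K_inv = np.diag([1.0 / fx, 1.0 / fy, 1.0])
        M = K_inv @ H                       # equals s * [r1 r2 T]
        s = 1.0 / np.linalg.norm(M[:, 0])   # unit-norm rotation column fixes |s|
        if s * M[2, 2] > 0:                 # tag must sit in front of the camera
            s = -s                          # (sign convention per the text: T < 0)
        r1, r2, t = s * M[:, 0], s * M[:, 1], s * M[:, 2]
        r3 = np.cross(r1, r2)  # orthogonal to r1, r2 (the inner-product condition)
        R = np.column_stack([r1, r2, r3])
        return R, t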
At this point, the rotation matrix R and translation matrix T from the world coordinate system to the camera coordinate system, i.e., the extrinsic matrix of the camera, are obtained. Specifically:
obtain the position of the camera in the world coordinate system with formula (5):

P_camera = -R^T · T

and obtain the position of the tag in the camera coordinate system with formula (6):

P_tag = R · [0, 0, 0]^T + T = T

Both positions are computed from the rotation matrix R and translation matrix T from the world coordinate system to the camera coordinate system, completing the positioning.
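For illustration, formulas (5) and (6) reduce to two lines; the function name is illustrative:

    import numpy as np

    def positions(R, T):
        """Formulas (5) and (6): camera origin in world coordinates and
        the tag (world) origin in camera coordinates."""
        cam_in_world = -R.T @ T  # invert x_cam = R x_world + T at x_cam = 0
        tag_in_cam = T           # world origin mapped into the camera frame
        return cam_in_world, tag_in_cam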
A positioning device of an AR surgical navigation system according to the second aspect of the present invention includes:
a calibration object module for obtaining a calibration object of a specific size and shape bearing a coded picture label that can be identified and tracked;
a camera module for capturing coded pictures in different pose states of the calibration object;
a computing module for computing the current pose of the calibration object and the mutual spatial relationships among multiple calibration objects;
and a display module for graphically displaying the current pose of the calibration object and the mutual spatial relationships.
The foregoing has shown and described the basic principles, principal features, and advantages of the invention. Those skilled in the art will understand that the invention is not limited to the embodiments described above; the embodiments and descriptions above only illustrate preferred examples of the invention and do not limit it, and various changes and modifications may be made without departing from the spirit and scope of the invention as claimed. The scope of the invention is defined by the appended claims and their equivalents.

Claims (10)

1. A positioning method for an AR surgical navigation system, comprising:
obtaining a calibration object of a specific size and shape bearing a coded picture label that can be identified and tracked;
capturing coded pictures in different pose states of the calibration object;
computing the current pose of the calibration object and the mutual spatial relationships among multiple calibration objects;
graphically displaying the current pose of the calibration object and the mutual spatial relationships;
the coded picture includes:
a black-and-white coding pattern having a specific code value, and a black-and-white grid pattern having a specific size and arrangement.
2. The positioning method of an AR surgical navigation system according to claim 1, further comprising, after obtaining the calibration object of a specific size and shape bearing a coded picture that can be identified and tracked:
mounting the calibration object on the surgical site and the surgical instruments.
3. The positioning method of an AR surgical navigation system according to claim 2, further comprising a step of correcting the pose of the calibration object:
correcting the pose of the calibration object according to changes in the code value of the black-and-white coding pattern;
filtering out interference in the environment according to the black-and-white grid pattern;
and obtaining the corrected pose.
4. The positioning method of an AR surgical navigation system according to claim 3, further comprising, after capturing the coded pictures in different pose states of the calibration object:
converting the captured image to grayscale to obtain a grayscale image;
calculating the gradient intensity and direction at each pixel of the grayscale image;
clustering the gradient intensities and directions.
5. The positioning method of an AR surgical navigation system according to claim 4, wherein the weight of the edge between two adjacent pixels on the grayscale image is the difference of their gradient directions; all edges are sorted in increasing order of weight, and edges are merged according to their weights; the clustering formula (1) is as follows:

D(n∪m) ≤ min(D(n), D(m)) + K_D/|n∪m|
M(n∪m) ≤ min(M(n), M(m)) + K_M/|n∪m|

where n and m are two given edge sets, D(n) is the difference between the maximum and minimum gradient directions within a set, and M(n) is the difference between the maximum and minimum gradient intensities; D(n) must lie between 0 and 2π, so it is treated modulo 2π; when n and m satisfy formula (1), the two edge sets are merged into one.
6. The positioning method of an AR surgical navigation system according to claim 5, comprising:
after the clustering of all pixels in the grayscale image is completed, fitting a line equation using weighted least squares;
obtaining the coordinates of the two endpoints of the directed line segment from the fitted line, denoted p1 and p2;
determining the direction of the directed line segment p1p2 according to brightness;
computing black-white edges in the grayscale image to find the contours that make up the label image;
identifying a quadrilateral structure in the captured coded picture from the corner points of the lines;
and obtaining the current pose of the calibration object through matrix transformation, and decoding according to the decoding rule.
7. The positioning method of an AR surgical navigation system according to claim 6, wherein a homography transformation is applied to the confirmed quadrilateral structure so that the current pose of the calibration object can be decoded and solved correctly, formula (2) being as follows:

s · [u, v, 1]^T = H · [x_w, y_w, 1]^T

where s represents depth information, [u, v, 1]^T is a point in the label image coordinate system, and [x_w, y_w, 1]^T is a point in the world coordinate system with z_w = 0;

expanding and rearranging gives formula (3):

u · (h31·x_w + h32·y_w + h33) = h11·x_w + h12·y_w + h13
v · (h31·x_w + h32·y_w + h33) = h21·x_w + h22·y_w + h23
forming four point pairs from the four corner points extracted in the label image coordinate system and their assumed corresponding four points in the world coordinate system;
defining the dot-lattice coordinates within the determined quadrilateral;
extracting the average value A1 of the pixels on the outermost ring of the lattice in the grayscale image, and the average value A2 of the pixels on the second outermost ring;
taking the threshold as the average of A1 and A2, which separates A1 and A2, then traversing the pixel values at all lattice coordinates: values above the threshold are encoded as 0 and values below it as 1;
decoding the whole lattice to obtain a binary string;
and matching the string to the corresponding code ID by table lookup.
8. The positioning method of an AR surgical navigation system according to claim 7, wherein the pose of the calibration object's tag is calculated with formula (4):

H = s · P · E

where H is the homography matrix, s is a scale factor, P is the camera projection matrix, E is the extrinsic matrix, and f_x and f_y are the camera intrinsics; expanding formula (4):

[h11 h12 h13]       [f_x  0   0] [r11 r12 t1]
[h21 h22 h23] = s · [ 0  f_y  0] [r21 r22 t2]
[h31 h32 h33]       [ 0   0   1] [r31 r32 t3]

and calculating the extrinsic matrix of the camera therefrom.
9. The positioning method of an AR surgical navigation system according to claim 8, further comprising:
obtaining the position of the camera in the world coordinate system, formula (5):

P_camera = -R^T · T

obtaining the position of the tag in the camera coordinate system, formula (6):

P_tag = R · [0, 0, 0]^T + T = T

and computing both positions from the rotation matrix R and translation matrix T from the world coordinate system to the camera coordinate system, completing the positioning.
10. A positioning device of an AR surgical navigation system, comprising:
a calibration object module for obtaining a calibration object of a specific size and shape bearing a coded picture label that can be identified and tracked;
a camera module for capturing coded pictures in different pose states of the calibration object;
a computing module for computing the current pose of the calibration object and the mutual spatial relationships among multiple calibration objects;
and a display module for graphically displaying the current pose of the calibration object and the mutual spatial relationships.
CN202310072535.9A 2023-02-07 2023-02-07 Positioning device and method of an AR surgical navigation system Pending CN116077184A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310072535.9A CN116077184A (en) Positioning device and method of an AR surgical navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310072535.9A CN116077184A (en) Positioning device and method of an AR surgical navigation system

Publications (1)

Publication Number Publication Date
CN116077184A true CN116077184A (en) 2023-05-09

Family

ID=86204181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310072535.9A Pending CN116077184A (en) Positioning device and method of an AR surgical navigation system

Country Status (1)

Country Link
CN (1) CN116077184A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116687569A (en) * 2023-07-28 2023-09-05 深圳卡尔文科技有限公司 Coded identification operation navigation method, system and storage medium
CN116687569B (en) * 2023-07-28 2023-10-03 深圳卡尔文科技有限公司 Coded identification operation navigation method, system and storage medium
CN116776909A (en) * 2023-08-28 2023-09-19 四川星点网络技术有限公司 Bottle lid two-dimensional code system of tracing to source
CN116776909B (en) * 2023-08-28 2023-11-03 四川星点网络技术有限公司 Bottle lid two-dimensional code system of tracing to source

Similar Documents

Publication Publication Date Title
CN116077184A (en) Positioning device and method of an AR surgical navigation system
CN106372702B (en) Positioning identifier and positioning method thereof
US10452938B2 (en) System and method for pattern detection and camera calibration
CN111145238A (en) Three-dimensional reconstruction method and device of monocular endoscope image and terminal equipment
CN109215016B (en) Identification and positioning method for coding mark
CN104506838A (en) Method, device and system for sensing depth of symbol array surface structured light
CN112132907B (en) Camera calibration method and device, electronic equipment and storage medium
CN109523551B (en) Method and system for acquiring walking posture of robot
CN113712665B (en) Positioning method and device based on positioning marker and computer storage medium
CN108717709A (en) Image processing system and image processing method
WO2013166995A1 (en) Method for decoding matrix-type two-dimensional code
CN109493384B (en) Camera pose estimation method, system, device and storage medium
WO2014199196A1 (en) Pose determination from a pattern of four leds
CN110096920A (en) A kind of high-precision high-speed positioning label and localization method towards visual servo
Gan et al. Multi-dimensional mutual information based robust image registration using maximum distance-gradient-magnitude
Furukawa et al. Fully auto-calibrated active-stereo-based 3d endoscopic system using correspondence estimation with graph convolutional network
CN101339604B (en) Novel mark point graph and its recognition, tracking and positioning algorithm based on visual sense invariance
Gu et al. Matching 3d shapes using 2d conformal representations
CN115609591A (en) 2D Marker-based visual positioning method and system and composite robot
Wang et al. Coarse-to-fine dot array marker detection with accurate edge localization for stereo visual tracking
CN110570354A (en) Strip chessboard calibration plate-based close-range image splicing method
JP2014082678A (en) Marker embedding device, marker detection device, marker embedding method, marker detection method, and program
US20220335649A1 (en) Camera pose determinations with depth
US11620809B2 (en) Robust fiducial marker for flexible surfaces
CN111179347A (en) Positioning method, positioning device and storage medium based on regional characteristics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination