CN116138878A - Laser navigation calibration method - Google Patents

Laser navigation calibration method

Info

Publication number
CN116138878A
CN116138878A CN202211550935.8A
Authority
CN
China
Prior art keywords
detector
laser
image
calibration plate
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211550935.8A
Other languages
Chinese (zh)
Inventor
王光鑫
郭强
王雪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tuodao Medical Technology Co Ltd
Original Assignee
Tuodao Medical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tuodao Medical Technology Co Ltd filed Critical Tuodao Medical Technology Co Ltd
Priority to CN202211550935.8A
Publication of CN116138878A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4429 Constructional features related to the mounting of source units and detector units
    • A61B6/4435 The source unit and the detector unit being coupled by a rigid structure
    • A61B6/4441 The rigid structure being a C-arm or U-arm
    • A61B6/54 Control of apparatus or devices for radiation diagnosis
    • A61B6/58 Testing, adjusting or calibrating thereof
    • A61B6/587 Alignment of source unit to detector unit
    • A61B6/588 Setting distance between source unit and detector unit
    • A61B6/589 Setting distance between source unit and patient
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/372 Details of monitor hardware
    • A61B2090/376 Using X-rays, e.g. fluoroscopy
    • Y02A90/30 Assessment of water resources

Abstract

An embodiment of the present application provides a laser navigation calibration method, comprising the following steps: fixing a calibration plate parallel to a detector at a first position spaced a first distance from the detector; controlling a laser module to obliquely project laser light onto the calibration plate, capturing a first depth image of the calibration plate with a depth camera, and obtaining a first detection image of the calibration plate with the detector; fixing the calibration plate parallel to the detector at a second position spaced a second distance from the detector, capturing a second depth image of the calibration plate with the depth camera, and obtaining a second detection image of the calibration plate with the detector; and calculating the incident angle of the laser module and the position mapping relationship between the laser module and the detector from the first depth image, the second depth image, the first detection image and the second detection image. The embodiment of the present application thereby achieves calibration of the laser module.

Description

Laser navigation calibration method
Technical Field
The application relates to the field of laser navigation, in particular to a laser navigation calibration method.
Background
With the development of technology, more and more emerging technologies are being applied in medical surgery, and minimally invasive surgery performed with robotic arms is gaining popularity for its advantages of small surgical wounds and fast wound recovery. Traditional orthopedic surgery, and spinal surgery in particular, places high demands on surgical experience and medical skill, because the lesions are usually small and the environment around them is complex and sensitive.
Because the lesion in a typical orthopedic operation is small and inconvenient to observe, traditional orthopedic surgery generally requires opening a large incision, leaving the patient with a large wound, a long postoperative rehabilitation period and a high postoperative risk; through the cooperation of a high-precision robotic arm and a navigation system, the operation can instead be completed through a small incision. In the related art, the navigation system performs surgical navigation through the cooperation of an optical tracker and a C-arm machine; this equipment is complex to operate, and both the equipment cost and the operation cost are high, so its adoption rate is not high.
Disclosure of Invention
To solve the problem of orthopedic surgical navigation, the present application provides a laser navigation calibration method for a fluoroscopic imaging apparatus. The fluoroscopic imaging apparatus includes a detector and a laser module, the laser module being mounted on the detector, and a depth camera also being mounted on the detector. The method includes:
fixing a calibration plate in parallel at a first position spaced a first distance from the detector, wherein the calibration plate is provided with a mark which can be detected by the detector;
controlling the laser module to obliquely project laser to the calibration plate, shooting a first depth image of the calibration plate through the depth camera, and obtaining a first detection image of the calibration plate through the detector;
fixing the calibration plate in parallel at a second position spaced a second distance from the detector, the second distance being different from the first distance; capturing a second depth image of the calibration plate with the depth camera, and obtaining a second detection image of the calibration plate with the detector;
calculating an incident angle of the laser module according to a first positional relationship between the mark's center point and the laser projection point in the first depth image, first depth information of the calibration plate, a second positional relationship between the mark's center point and the laser projection point in the second depth image, and second depth information of the calibration plate, thereby obtaining an angle calibration relationship between the laser module and the detector, wherein the mark's center point is obtained from the first detection image or the second detection image;
and obtaining the position mapping relation between the laser module and the detector according to the second position relation, the second depth information and the incident angle.
In some embodiments, the marks are disposed at the corners of the calibration plate.
In some embodiments, four cross lead marks, centrally symmetric about the center point of the calibration plate, are arranged on the calibration plate; these cross lead marks are the marks.
In some embodiments, calculating the incident angle of the laser module according to the first position relation between the marked center point and the laser projection point in the first depth image and the first depth information of the calibration plate, the second position relation between the marked center point and the laser projection point in the second depth image and the second depth information of the calibration plate includes:
fusing the first depth image and the first detection image to obtain a first fused image;
fusing the second depth image and the second detection image to obtain a second fused image;
establishing a coordinate system in the first fusion image and the second fusion image respectively, wherein the origin of coordinates of the coordinate system is set as the center point of the first fusion image or the second fusion image, and the center points of the first fusion image and the second fusion image are the projections of the center point of the detector;
and calculating the incident angle of the laser module through a trigonometric function based on the coordinate system, the first depth information and the second depth information of the calibration plate, the first position relation between the marked center point and the laser projection point in the first depth image and the second position relation between the marked center point and the laser projection point in the second depth image.
In some embodiments, the mark is a cross lead mark; the x-axis of the coordinate system is parallel to one arm of the cross lead mark, and the y-axis of the coordinate system is parallel to the other arm.
In some embodiments, obtaining the positional mapping relationship between the laser module and the detector according to the second positional relationship, the second depth information, and the incident angle includes:
calculating, through the trigonometric relationship and based on the second positional relationship, the second depth information and the incident angle, the distance between the position of the laser module on the detector and the center point of the detector, and determining the distance and the incident angle as the position mapping relationship between the laser module and the detector.
In some embodiments, the depth camera is flush with the face of the detector that faces the calibration plate.
In some embodiments, the fluoroscopic imaging apparatus includes a display electrically connected to the detector for displaying the first detected image and the second detected image obtained by the detector.
In some embodiments, the laser module includes a first laser source and a second laser source distributed on adjacent and perpendicular sides of the detector.
The laser navigation calibration method of the embodiments of the present application has the following beneficial effects:
according to the laser navigation calibration method, the calibration plate and the depth camera can be used for completing the position and angle calibration of the laser module, so that fewer tools are required, and the calibration cost is low; the calibration plate used in the embodiment of the application is provided with the mark which can be detected by the detector, and based on the corresponding relation between the depth image shot by the depth camera and the detection image of the detector, the laser module is calibrated rapidly before operation, and the popularization of laser navigation in the orthopedic operation is facilitated.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
A schematic structural diagram of an X-ray machine is exemplarily shown in fig. 1;
a schematic bottom view of a detector is shown schematically in fig. 2;
a schematic side view of a detector is shown schematically in fig. 3;
a schematic diagram of an X-ray detection scenario is exemplarily shown in fig. 4;
a flow diagram of a laser navigation calibration method is exemplarily shown in fig. 5;
a schematic diagram of a calibration plate is shown schematically in fig. 6;
a schematic of a calibration plate in a first position is shown schematically in fig. 7;
a schematic of a calibration plate in a second position is schematically shown in fig. 8;
a schematic diagram of a first depth image is exemplarily shown in fig. 9;
a schematic diagram of a first detected image is exemplarily shown in fig. 10;
a schematic of a first fused image is shown schematically in fig. 11.
Detailed Description
For purposes of clarity, exemplary implementations of the present application are described below with reference to the accompanying drawings, in which those exemplary implementations are illustrated. It will be apparent that the described implementations are only some, not all, of the examples of the present application.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first," "second," "third," and the like in the description, in the claims and in the above-described figures are used for distinguishing between similar objects or entities and do not necessarily imply a particular order or sequence, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The presently disclosed embodiments will now be described in detail with reference to the drawings, wherein like reference numerals designate identical or corresponding elements in each of the several views.
An embodiment of the present application provides a laser navigation calibration method for a fluoroscopic imaging apparatus; the apparatus may be an X-ray machine. Fig. 1 is a schematic structural diagram of the X-ray machine provided by the embodiment of the present application. As shown in fig. 1, the X-ray machine comprises a head assembly 1, a detector housing 2, a laser module 3 and a detector 4, wherein the laser module 3 is arranged on the detector housing 2 of the detector 4, and the center of the head assembly 1 and the center of the detector 4 lie on the same axis.
Referring to fig. 2, a schematic bottom view of the detector, the laser module 3 may include a first laser source 31 and a second laser source 32. The first laser source 31 and the second laser source 32 are arranged on two adjacent sides of the detector 4, respectively, so that a cross laser line can be projected below the detector 4, and each laser source can reciprocate along the direction shown to adjust the position of the cross laser line.
Referring to fig. 3, a schematic side view of the detector, and taking the first laser source 31 as an example: the laser projection point of the first laser source 31 on an obstacle between the detector 4 and the head assembly 1 is the laser point 11, and the first laser source 31 can rotate about its first rotation pivot 12 in a plane perpendicular to the detector 4 to change its incident angle θ₁. Similarly, the second laser source 32 can rotate about its second rotation pivot 13 in a plane perpendicular to the detector 4 to change its incident angle θ₂.
Based on the X-ray machine shown in figs. 1-3, an X-ray detection scenario can be seen in fig. 4. As shown in fig. 4, the X-ray machine further comprises a display 5 electrically connected to the detector. The measured object 6 is fixed between the head assembly 1 and the detector 4; the laser module 3 projects a cross laser line onto the object 6, and an X-ray image of the object can be obtained by the detector 4. The cross laser line on the object 6 cannot be detected by the detector 4, so it does not appear on the X-ray image.
During an operation on the measured object, the position of the laser point 11 of the cross laser line on the X-ray image is the target execution position of the surgical instrument. To make this easier for the operator to observe on the X-ray image, the cross laser line corresponding to the laser module 3 can be rendered virtually on the X-ray image; to render it, the relative position of the laser module 3 and the detector 4 must first be calibrated.
Referring to fig. 5, a laser navigation calibration method according to an embodiment of the present application, as shown in fig. 5, may include the following steps:
step S101: and fixing a calibration plate in parallel at a first position which is spaced from the detector by a first distance, wherein the calibration plate is provided with a mark which can be detected by the detector.
In some embodiments, referring to fig. 6, a schematic diagram of the calibration plate used in the embodiments of the present application, the calibration plate 13 may be a rectangular plate provided with marks that can be detected by the detector 4, such as cross lead marks. Four cross lead marks 131, centrally symmetric about the center point of the calibration plate 13, can be arranged on the calibration plate 13; the physical length of each cross lead mark 131 is a. The cross lead marks 131 can be detected by the detector 4 and imaged on the X-ray image.
Before performing an operation, in order to calibrate the angle and relative position of the laser module 3 and the detector 4, the calibration plate 13 shown in fig. 6 may be fixed, in place of the measured object 6, at a first position between the detector 4 and the head assembly 1, with the cross lead marks 131 facing the detector 4.
Step S102: controlling the laser module to obliquely project laser light onto the calibration plate, capturing a first depth image of the calibration plate with the depth camera, and obtaining a first detection image of the calibration plate with the detector.
In some embodiments, after fixing the calibration plate 13 in the first position, and referring to fig. 7, the first laser source 31 in the laser module 3 can be controlled to obliquely project laser light onto the calibration plate 13, and the second laser source 32 in the laser module 3 can likewise be controlled to obliquely project laser light onto the calibration plate 13. The incident angle of the first laser source 31 is θ₁ and the incident angle of the second laser source 32 is θ₂; both θ₁ and θ₂ are unknown. The center point of the cross laser line on the calibration plate 13 is p₁.
In some embodiments, a depth camera 15 may also be provided on the X-ray machine. As shown in fig. 7, the depth camera 15 may be arranged on the detector 4, facing the calibration plate 13. The depth camera 15 captures a depth image of the calibration plate 13, denoted the first depth image; the first depth image contains depth information of the calibration plate 13, denoted the first depth information. If the depth camera 15 is mounted below the detector 4, the distance H₁ between the calibration plate 13 and the detector 4 at the first position is obtained as the sum of the first depth information and the depth camera 15's own height; if the depth camera 15 is mounted flush with the detector 4, the first depth information is itself the distance H₁ between the calibration plate 13 and the detector 4 at the first position. The height difference between the rotation pivot 12 of the first laser source 31 and the bottom of the detector 4 is denoted h.
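The two mounting cases for recovering the plate-to-detector distance reduce to a single sum; a minimal sketch (the helper name and parameters are illustrative assumptions, not from the patent):

```python
def plate_to_detector_distance(depth, camera_offset=0.0):
    """Distance H between the calibration plate and the detector.

    depth: depth information read from the depth image;
    camera_offset: the depth camera's own height when it is mounted
    below the detector face (0 when mounted flush with the detector).
    """
    return depth + camera_offset
```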
In some embodiments, a detection image of the calibration plate 13 is also acquired by the detector 4, which is noted as a first detection image.
Step S103: fixing the calibration plate in parallel at a second position spaced a second distance from the detector, the second distance being different from the first distance; capturing a second depth image of the calibration plate with the depth camera, and obtaining a second detection image of the calibration plate with the detector.
In some embodiments, after the first depth image is captured, and referring to fig. 8, the calibration plate 13 is moved in parallel and fixed at the second position between the detector 4 and the head assembly 1, without changing the projection angle of the laser module 3, and a depth image of the calibration plate 13 is captured again by the depth camera 15. The incident angle of the first laser source 31 is still θ₁, and the center point of the cross laser line on the calibration plate 13 is p₂. This depth image is denoted the second depth image; it contains depth information of the calibration plate 13, denoted the second depth information. The distance H₂ between the calibration plate 13 and the detector 4 is obtained from the second depth information.
In some embodiments, a detection image of the calibration plate 13 is also acquired by the detector 4, which detection image is denoted as a second detection image.
Step S104: calculating the incident angle of the laser module according to the first positional relationship between the mark's center point and the laser projection point in the first depth image, the first depth information of the calibration plate, the second positional relationship between the mark's center point and the laser projection point in the second depth image, and the second depth information of the calibration plate, thereby obtaining the angle calibration relationship between the laser module and the detector, wherein the mark's center point is obtained from the first detection image or the second detection image.
In some embodiments, the angle calibration relationship between the laser module and the detector includes the incident angle θ₁ of the first laser source 31 and the incident angle θ₂ of the second laser source 32.
In some embodiments, the first positional relationship between the center point of the cross lead mark 131 and the laser projection point p₁ can be obtained from the first depth image, and the second positional relationship between the center point of the cross lead mark 131 and the laser projection point p₂ can be obtained from the second depth image.
Referring to fig. 9, a schematic diagram of the first depth image obtained by the depth camera when the calibration plate 13 is at the first position: in fig. 9, the center point of the cross laser line 7 projected by the laser module 3 onto the calibration plate 13, i.e. the laser projection point, is p₁. One of the cross lead marks 131, e.g. the one in the upper left corner, is taken as the reference mark. The first-direction distance between the center point of the reference mark and p₁ is denoted L₁, and the second-direction distance between the center point of the reference mark and p₁ is denoted V₁, where the first direction is the reciprocating direction of the first laser source 31 and the second direction is the reciprocating direction of the second laser source 32. The first positional relationship comprises the first-direction and second-direction distances between the center point of the reference mark and p₁.
Referring to fig. 10, a schematic diagram of the first detection image obtained by the detector 4 when the calibration plate 13 is at the first position: in fig. 10, a coordinate system is established with the center of the first detection image as the origin O, the first direction as the x-axis and the second direction as the y-axis, so that the distance between the center point of the reference mark and the origin O along the x-axis is T₁, and the distance along the y-axis is S₁.
In some embodiments, the first depth image and the first detection image may be fused according to the position of the reference mark 131 to obtain the first fused image shown in fig. 11. From the first fused image, the distance x₁ of the laser projection point p₁ from the origin O along the x-axis is: x₁ = T₁ - L₁; and the distance y₁ of the laser projection point p₁ from the origin O along the y-axis is: y₁ = V₁ - S₁.
Similarly, in the second depth image, the first-direction distance between the center point of the reference mark and the laser projection point p₂ is L₂, and the second-direction distance is V₂; the second positional relationship comprises these two distances. In the second detection image, the distance between the center point of the reference mark and the origin O along the x-axis is T₂, and along the y-axis is S₂. After fusing the second depth image and the second detection image, a second fused image is obtained. From the second fused image, the distance x₂ of the laser projection point p₂ from the origin O along the x-axis is: x₂ = T₂ - L₂; and the distance y₂ along the y-axis is: y₂ = V₂ - S₂.
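The fused-image coordinate extraction above amounts to two subtractions; a minimal sketch (the function name is illustrative, not from the patent):

```python
def laser_point_in_detector_frame(T, S, L, V):
    """Laser projection point (x, y) in the detector-centred frame.

    T, S: x- and y-distances of the reference mark's centre from the
    origin O in the detection image; L, V: first- and second-direction
    distances from the mark's centre to the laser point in the depth
    image. Per the derivation: x = T - L, y = V - S.
    """
    return (T - L, V - S)
```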
In some embodiments, once x₁ and x₂ are obtained, the incident angle θ₁ of the first laser source 31 can be obtained based on the trigonometric relationship. As shown in fig. 8, the rotation pivot of the first laser source 31 is denoted as e, the projection point of the laser projection point p₂ on the detector 4 is denoted as d, and the intersection, in the laser projection plane, of the line p₂d with the calibration plate 13 at the first position is denoted as c. Then: side p₁c = (T₁ − L₁) − (T₂ − L₂), and side p₂c = H₂ − H₁. From the trigonometric relationship:

tan θ₁ = p₁c / p₂c = ((T₁ − L₁) − (T₂ − L₂)) / (H₂ − H₁)

θ₁ = arctan( ((T₁ − L₁) − (T₂ − L₂)) / (H₂ − H₁) )
Similarly, from y₁ and y₂, the incident angle θ₂ of the second laser source 32 can be obtained based on the trigonometric relationship.
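As a sketch of this trigonometric step, assuming x₁ = T₁ − L₁ and x₂ = T₂ − L₂ have already been measured from the fused images (the function name and numeric values are illustrative assumptions, not from the patent):

```python
import math

def incident_angle(x1, x2, H1, H2):
    """theta = arctan(p1c / p2c), where
    p1c = (T1 - L1) - (T2 - L2) = x1 - x2  and  p2c = H2 - H1."""
    return math.atan((x1 - x2) / (H2 - H1))

# Hypothetical plate depths H1, H2 (mm); x1, x2 from the fused images
theta1 = incident_angle(x1=38.0, x2=20.0, H1=80.0, H2=98.0)
print(math.degrees(theta1))  # ≈ 45.0 for these illustrative numbers
```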
Step S105: obtaining the position mapping relationship between the laser module and the detector according to the second positional relationship, the second depth information and the incident angle.
In some embodiments, after the incident angle θ₁ of the first laser source 31 is obtained, the position mapping relationship between the laser module and the detector is calculated through the trigonometric relationship.
In some embodiments, the position mapping relationship between the first laser source 31 and the detector may be represented by the position mapping relationship between the rotation pivot 12 and the center point q of the detector 4, or by the position mapping relationship between the rotation pivot 12 and another position on the detector 4, such as one of its corner points. Here, it is represented by the relationship between the rotation pivot 12 and the center point q of the detector 4. As shown in fig. 8, the projection point of the center point q of the detector 4 on the calibration plate 13 at the first position is O₁, and its projection point on the calibration plate 13 at the second position is O₂; in figs. 10-11, O₁ coincides with the origin O. Then, in the right triangle p₂de: p₂d = H₂ + h, and de = (H₂ + h)·tan θ₁. With p₂O₂ = T₂ − L₂, the position mapping relation M₁ between the rotation pivot 12 and the center point q of the detector 4 is:

M₁ = de + (T₂ − L₂) = (H₂ + h)·tan θ₁ + (T₂ − L₂)
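A minimal sketch of this mapping step, using the same symbols as the text (h is the offset appearing in p₂d = H₂ + h per the patent's fig. 8; all numeric values are illustrative assumptions):

```python
import math

def position_mapping(theta, H2, h, x2):
    """M1 = de + (T2 - L2), with de = (H2 + h) * tan(theta) and x2 = T2 - L2."""
    de = (H2 + h) * math.tan(theta)
    return de + x2

# Illustrative numbers: theta1 = 45 deg, H2 = 98 mm, h = 2 mm, x2 = 20 mm
M1 = position_mapping(math.radians(45.0), 98.0, 2.0, 20.0)  # ≈ 120.0
```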
After the incident angle θ₁ of the first laser source 31 is obtained based on the method shown in fig. 5 and the calibration of the position mapping relationship M₁ between the first laser source 31 and the detector 4 is completed, the incident angle θ₂ of the second laser source 32 and the position mapping relationship M₂ between the second laser source 32 and the detector 4 can be obtained by the same method, thereby completing the laser navigation calibration of the X-ray machine.
According to the laser navigation calibration method provided by the embodiments of the application, the position and angle calibration of the laser module can be completed with only a calibration plate and a depth camera, so few tools are required and the calibration cost is low. The calibration plate used in the embodiments carries a mark detectable by the detector, and, based on the trigonometric relationship between the depth images captured by the depth camera and the detection images of the detector, the laser module can be calibrated quickly before an operation, which facilitates the popularization of laser navigation in orthopedic surgery.
Since the foregoing embodiments are described with reference to one another, the embodiments in this specification share identical and similar parts, which are not described again here.
The above embodiments of the present application are not intended to limit the scope of the present application.

Claims (10)

1. A laser navigation calibration method, which is characterized by being used for perspective imaging equipment, wherein the perspective imaging equipment comprises a detector and a laser module, the laser module is installed on the detector, and a depth camera is also installed on the detector, and the method comprises the following steps:
fixing a calibration plate in parallel at a first position spaced a first distance from the detector, wherein the calibration plate is provided with a mark which can be detected by the detector;
controlling the laser module to obliquely project laser to the calibration plate, shooting a first depth image of the calibration plate through the depth camera, and obtaining a first detection image of the calibration plate through the detector;
fixing the calibration plate in parallel at a second position spaced a second distance from the detector, wherein the second distance is different from the first distance; shooting a second depth image of the calibration plate through the depth camera, and obtaining a second detection image of the calibration plate through the detector;
calculating an incident angle of the laser module according to a first positional relationship between the center point of the mark and the laser projection point in the first depth image and first depth information of the calibration plate, and a second positional relationship between the center point of the mark and the laser projection point in the second depth image and second depth information of the calibration plate, to obtain an angle calibration relationship between the laser module and the detector, wherein the center point of the mark is obtained according to the first detection image or the second detection image;
and obtaining the position mapping relation between the laser module and the detector according to the second position relation, the second depth information and the incident angle.
2. The laser navigation calibration method according to claim 1, wherein the mark is provided at a corner of the calibration plate.
3. The laser navigation calibration method according to claim 1, wherein four cross lead marks, centrally symmetric about the center point of the calibration plate, are arranged on the calibration plate, and the cross lead marks are the marks.
4. The laser navigation calibration method according to claim 1, wherein calculating the incident angle of the laser module according to the first positional relationship between the center point of the mark and the laser projection point in the first depth image and the first depth information of the calibration plate, and the second positional relationship between the center point of the mark and the laser projection point in the second depth image and the second depth information of the calibration plate comprises:
fusing the first depth image and the first detection image to obtain a first fused image;
fusing the second depth image and the second detection image to obtain a second fused image;
establishing a coordinate system in the first fusion image and the second fusion image respectively, wherein the origin of coordinates of the coordinate system is set as the center point of the first fusion image or the second fusion image, and the center points of the first fusion image and the second fusion image are the projections of the center point of the detector;
and calculating the incident angle of the laser module through a trigonometric function based on the coordinate system, the first depth information and the second depth information of the calibration plate, the first positional relationship between the center point of the mark and the laser projection point in the first depth image, and the second positional relationship between the center point of the mark and the laser projection point in the second depth image.
5. The method of claim 4, wherein the mark is a cross lead mark, an x-axis of the coordinate system is parallel to one side of the cross lead mark, and a y-axis of the coordinate system is parallel to the other side of the cross lead mark.
6. The method of claim 4, wherein obtaining the positional mapping between the laser module and the detector according to the second positional relationship, the second depth information, and the incident angle comprises:
and calculating the distance between the position of the laser module on the detector and the center point of the detector through a trigonometric function relation based on the second position relation, the second depth information and the incident angle, and determining the distance and the incident angle as the position mapping relation of the laser module and the detector.
7. The laser navigation calibration method according to claim 1, wherein the depth camera is flush with the plane of the detector facing the calibration plate.
8. The laser navigation calibration method according to claim 1, wherein the perspective imaging device comprises a display electrically connected to the detector for displaying the first detection image and the second detection image obtained by the detector.
9. The laser navigation calibration method according to claim 1, wherein the laser module comprises a first laser source and a second laser source, the first laser source and the second laser source being distributed on adjacent and perpendicular sides of the detector.
10. The laser navigation calibration method according to claim 1, wherein the depth camera is flush with the plane of the detector facing the calibration plate.