CN117115401A - AR navigation data processing method, system and storage medium - Google Patents

AR navigation data processing method, system and storage medium

Info

Publication number
CN117115401A
CN117115401A CN202311333906.0A CN202311333906A CN117115401A CN 117115401 A CN117115401 A CN 117115401A CN 202311333906 A CN202311333906 A CN 202311333906A CN 117115401 A CN117115401 A CN 117115401A
Authority
CN
China
Prior art keywords
data
reference plate
side reference
coordinate system
mobile phone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311333906.0A
Other languages
Chinese (zh)
Other versions
CN117115401B (en)
Inventor
梁腾龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Calvin Technology Co ltd
Original Assignee
Shenzhen Calvin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Calvin Technology Co ltd filed Critical Shenzhen Calvin Technology Co ltd
Priority to CN202311333906.0A priority Critical patent/CN117115401B/en
Publication of CN117115401A publication Critical patent/CN117115401A/en
Application granted granted Critical
Publication of CN117115401B publication Critical patent/CN117115401B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C1/00Dental machines for boring or cutting ; General features of dental machines or apparatus, e.g. hand-piece design
    • A61C1/08Machine parts specially adapted for dentistry
    • A61C1/082Positioning or guiding, e.g. of drills
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C1/00Dental machines for boring or cutting ; General features of dental machines or apparatus, e.g. hand-piece design
    • A61C1/08Machine parts specially adapted for dentistry
    • A61C1/082Positioning or guiding, e.g. of drills
    • A61C1/084Positioning or guiding, e.g. of drills of implanting tools
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C8/00Means to be fixed to the jaw-bone for consolidating natural teeth or for fixing dental prostheses thereon; Dental implants; Implanting tools
    • A61C8/0089Implanting tools or instruments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/32Determination of transform parameters for the alignment of images, i.e. image registration using correlation-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75Determining position or orientation of objects or cameras using feature-based methods involving models
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image

Abstract

The invention provides an AR navigation data processing method, system and storage medium, wherein the AR navigation data processing method comprises the following steps: establishing a communication data connection between a data processing terminal and an AR display terminal; acquiring preset target planting scheme data; establishing, through registration and calibration, a matrix conversion relation between any two of a CT coordinate system, an optical positioning instrument coordinate system, a patient side coordinate system and a mobile phone side coordinate system; acquiring real-time pose data of the patient and real-time pose data of the mobile phone; analyzing these to obtain actual planting point position data, actual planting angle data and actual planting depth data; comparing them in sequence with the target data to obtain a point position data deviation, an angle data deviation and a depth data deviation; and displaying the point position navigation parameter, the angle navigation parameter and the depth navigation parameter through the AR display terminal. The doctor can thus concentrate on performing the operation according to the navigation operation details while following the real-time condition of the patient's oral cavity, ensuring an ideal effect of the dental implant treatment.

Description

AR navigation data processing method, system and storage medium
Technical Field
The invention relates to the technical field of medical treatment, in particular to a method, a system and a storage medium for navigation data processing based on AR equipment in the process of performing implantation surgery.
Background
When teeth are missing from a patient's oral cavity, this not only has a negative effect on the facial appearance but also greatly limits the chewing function. Missing teeth are usually treated by oral implantation. Oral implantation refers to repairing the missing teeth in the patient's oral cavity by means of dental implants: after a hole is drilled in the alveolar bone corresponding to the missing tooth, an implant is inserted into the alveolar bone, and an abutment and a dental crown are then fixedly installed; the integral structure formed by the implant, the abutment and the crown replaces the missing tooth and restores appearance, chewing function and the like.
To ensure implantation accuracy, the clinical drilling operation during a dental implant surgery is usually guided by an optical navigation system. The optical navigation system comprises at least an optical positioning instrument, an implantation mobile phone (i.e., a dental implant handpiece), a human body reference plate, a mobile phone reference plate, a data processing device and a display device: the human body reference plate is fixedly arranged on the patient, and the mobile phone reference plate is fixedly arranged on the planting mobile phone; the optical positioning instrument determines the real-time relative position of the patient and the planting mobile phone by capturing the human body reference plate and the mobile phone reference plate, and the drilling operation details are determined by the data processing device and then displayed on the display device for real-time guidance.
During the operation, the doctor must pay attention to the real-time condition of the patient's oral cavity; yet under the navigation data processing mode of the prior art, the doctor also has to pay attention to the navigation operation details on a separate display.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an AR navigation data processing method, system and storage medium that address the above defects in the prior art. By changing the traditional way of displaying navigation data, the doctor can concentrate on performing the operation according to the navigation operation details while following the real-time condition of the patient's oral cavity, which greatly improves the convenience of the implantation operation and ensures an ideal effect of the dental implant treatment.
The technical scheme adopted for solving the technical problems is as follows:
an AR navigation data processing method, comprising the steps of:
S1, establishing a communication data connection between a data processing terminal and an AR display terminal, wherein the AR display terminal is arranged on a spectacle lens;
S2, acquiring preset target planting scheme data, wherein the target planting scheme data comprise target planting point position data, target planting angle data and target planting depth data;
S3, registering the CT, the patient side reference plate and the optical positioning instrument to obtain matrix conversion relations among a CT coordinate system, a patient side coordinate system and an optical positioning instrument coordinate system;
S4, calibrating the patient side reference plate and the mobile phone side reference plate to obtain a matrix conversion relation between a patient side coordinate system and a mobile phone side coordinate system;
S5, establishing a matrix conversion relation between any two of the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and a mobile phone side coordinate system;
S6, acquiring real-time infrared light imaging data of the patient side reference plate and the mobile phone side reference plate, and, according to the matrix conversion relations among the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system, obtaining real-time pose data of the patient from the real-time infrared light imaging data of the patient side reference plate and real-time pose data of the mobile phone from the real-time infrared light imaging data of the mobile phone side reference plate;
S7, analyzing the real-time pose data of the patient and the real-time pose data of the mobile phone to obtain actual planting point position data, actual planting angle data and actual planting depth data;
S8, comparing in sequence: the actual planting point position data with the target planting point position data to obtain a point position data deviation, the actual planting angle data with the target planting angle data to obtain an angle data deviation, and the actual planting depth data with the target planting depth data to obtain a depth data deviation;
S9, transmitting the point position data deviation, the angle data deviation and the depth data deviation to an AR display terminal, and displaying the point position navigation parameter, the angle navigation parameter and the depth navigation parameter through the AR display terminal.
Compared with the prior art, the beneficial effects of the technical scheme are as follows: the target planting data and the actual planting data are acquired in real time and compared in sequence to obtain the point position data deviation, the angle data deviation and the depth data deviation, which are displayed intraoperatively through the AR display terminal; the doctor only needs to switch the eye focus between the patient's oral cavity and the spectacle lens to attend to both the patient's oral cavity and the navigation operation details, which greatly improves the convenience of the planting operation and ensures an ideal effect of the dental implant treatment.
Correspondingly, an AR navigation data processing system, comprising:
the communication connection module is used for establishing communication data connection between the data processing terminal and the AR display terminal, and the AR display terminal is arranged on the spectacle lens;
the scheme acquisition module is used for acquiring preset target planting scheme data, wherein the target planting scheme data comprises target planting point position data, target planting angle data and target planting depth data;
The registration module is used for registering the CT, the patient side reference plate and the optical positioning instrument and acquiring a matrix conversion relation among a CT coordinate system, a patient side coordinate system and the optical positioning instrument coordinate system;
the calibration module is used for calibrating the patient side reference plate and the mobile phone side reference plate to acquire a matrix conversion relation between a patient side coordinate system and a mobile phone side coordinate system;
the matrix operation module is used for establishing a matrix conversion relation between any two of the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system;
the data acquisition module is used for acquiring real-time infrared light imaging data of the patient side reference plate and the mobile phone side reference plate, and, according to the matrix conversion relations among the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system, obtaining real-time pose data of the patient from the real-time infrared light imaging data of the patient side reference plate and real-time pose data of the mobile phone from the real-time infrared light imaging data of the mobile phone side reference plate;
the data analysis module is used for analyzing the real-time pose data of the patient and the real-time pose data of the mobile phone to obtain actual planting point position data, actual planting angle data and actual planting depth data;
The comparison module is used for sequentially comparing the actual planting point position data with the target planting point position data to obtain point position data deviation, comparing the actual planting angle data with the target planting angle data to obtain angle data deviation, and comparing the actual planting depth data with the target planting depth data to obtain depth data deviation;
the data transmission module is used for transmitting the point position data deviation, the angle data deviation and the depth data deviation to the AR display terminal, and displaying the point position navigation parameters, the angle navigation parameters and the depth navigation parameters through the AR display terminal.
Correspondingly, a storage medium stores a computer program comprising program instructions which, when executed by a processor, perform the AR navigation data processing method described above.
Drawings
FIG. 1 is a flow chart of an AR navigation data processing method of the present invention.
FIG. 2 is a schematic diagram of the structure of the AR navigation data processing system of the present invention.
In the drawings, the list of components represented by the respective reference numerals is as follows:
the system comprises a communication connection module 1, a scheme acquisition module 2, a registration module 3, a calibration module 4, a matrix operation module 5, a data acquisition module 6, a data analysis module 7, a comparison module 8 and a data transmission module 9.
Description of the embodiments
In order to make the objects, technical solutions and advantages of the present invention more clear and clear, the present invention will be further described in detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In the description of the present invention, it should be understood that the directions or positional relationships indicated by the terms "center", "upper", "lower", "front", "rear", "left", "right", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the apparatus or component to be referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted" and "connected" are to be construed broadly: a connection may be fixed, detachable or integral; it may be mechanical or electrical; it may be direct, indirect through an intermediate medium, or internal between two components. When an element is referred to as being "mounted" or "disposed" on another element, it can be directly on the other element or intervening elements may be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may be present. The specific meanings of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
When teeth are missing from a patient's oral cavity, this not only has a negative effect on the facial appearance but also greatly limits the chewing function. Missing teeth are usually treated by oral implantation. Oral implantation refers to repairing the missing teeth in the patient's oral cavity by means of dental implants: after a hole is drilled in the alveolar bone corresponding to the missing tooth, an implant is inserted into the alveolar bone, and an abutment and a dental crown are then fixedly installed; the integral structure formed by the implant, the abutment and the crown replaces the missing tooth and restores appearance, chewing function and the like.
To ensure implantation accuracy, the clinical drilling operation during a dental implant surgery is usually guided by an optical navigation system. The optical navigation system comprises at least an optical positioning instrument, an implantation mobile phone (i.e., a dental implant handpiece), a human body reference plate, a mobile phone reference plate, a data processing device and a display device: the human body reference plate is fixedly arranged on the patient, and the mobile phone reference plate is fixedly arranged on the planting mobile phone; the optical positioning instrument determines the real-time relative position of the patient and the planting mobile phone by capturing the human body reference plate and the mobile phone reference plate, and the drilling operation details are determined by the data processing device and then displayed on the display device for real-time guidance.
During the operation, the doctor needs to pay attention to the real-time condition of the patient's oral cavity. In the prior art, an electronic computer is adopted as the data processing device and a display screen is adopted as the display device: data from the optical positioning instrument, the human body reference plate, the mobile phone reference plate and the like are processed by the electronic computer to obtain the navigation operation details, which are then displayed on the display screen. Under this navigation data processing mode, the doctor has to attend to the navigation operation details throughout the operation: on one hand, the real-time condition of the oral cavity must be observed as much as possible; on the other hand, the navigation operation details on the display screen must be observed as promptly as possible so as not to miss the optimal operation path. The doctor is thus forced to continuously switch the viewing direction and cannot keep the line of sight still during the operation, which brings great inconvenience to the procedure and leaves the dental implant treatment effect unguaranteed.
As shown in fig. 1, in order to solve the above-mentioned problem, the present technical solution proposes an AR navigation data processing method, which includes the following steps:
S1, establishing a communication data connection between a data processing terminal and an AR display terminal, wherein the AR display terminal is arranged on a spectacle lens. Before step S1 is implemented, preparation is required at the hardware level: the doctor wears an AR display terminal, such as AR glasses, so that the navigation operation details can be displayed through the spectacle lens during surgery. After wearing, a communication data connection can be established between the data processing terminal and the AR display terminal; the navigation operation details are calculated intraoperatively by the data processing terminal from the acquired real-time data and transmitted over the communication data connection to the AR display terminal for display.
S2, acquiring preset target planting scheme data, wherein the target planting scheme data comprise target planting point position data, target planting angle data and target planting depth data. In step S2, the target planting scheme data can be understood as the most ideal planting scheme: before the operation, the doctor formulates a corresponding planting scheme based on the actual condition of the patient's oral cavity and determines the planting point position data, planting angle data and planting depth data, namely where to drill, in which direction to drill and how deep to drill. Theoretically, the most ideal planting treatment effect can be obtained by following the target planting scheme data.
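For illustration, the target planting scheme data of step S2 can be held in a simple structure; the class and field names below are assumptions for the sketch, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class TargetPlantingScheme:
    """Illustrative container for the preset target planting scheme data
    (names are assumptions, not the patent's)."""
    point: Vec3      # target planting point position, in CT coordinates (mm)
    direction: Vec3  # target planting angle, as a unit drilling-axis vector
    depth: float     # target planting depth along the axis (mm)

# Example plan with placeholder values
plan = TargetPlantingScheme(point=(12.0, -4.5, 30.0),
                            direction=(0.0, 0.0, -1.0),
                            depth=10.0)
```

Keeping the three quantities (point, angle, depth) together mirrors the order in which they are later compared in step S8.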
S3, registering the CT, the patient side reference plate and the optical positioning instrument to obtain the matrix conversion relations among the CT coordinate system, the patient side coordinate system and the optical positioning instrument coordinate system. The main purpose of step S3 is registration. Before registration, the three coordinate systems of the CT, the patient side reference plate and the optical positioning instrument are independent, and a coordinate value in one of them cannot be applied to the other two. Step S3 obtains the matrix conversion relation between the CT coordinate system and the patient side coordinate system, between the patient side coordinate system and the optical positioning instrument coordinate system, and between the CT coordinate system and the optical positioning instrument coordinate system, so that, given the coordinate values of any point in one of these coordinate systems, its coordinate values in the others can be calculated from the matrix conversion relations.
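The patent does not specify its registration algorithm, but a matrix conversion relation between two coordinate systems is commonly computed from paired marker points with a least-squares rigid fit (the Kabsch/SVD method). A minimal illustrative sketch using NumPy:

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t,
    computed by the Kabsch/SVD method.
    src, dst: (N, 3) arrays of paired points (N >= 3, non-collinear)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)           # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Running this once per pair of coordinate systems yields the matrix conversion relations that step S3 requires.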
S4, calibrating the patient side reference plate and the mobile phone side reference plate, and obtaining a matrix conversion relation between the patient side coordinate system and the mobile phone side coordinate system. The main purpose of the step S4 is to perform calibration, and the matrix conversion relationship between the patient side coordinate system and the mobile phone side coordinate system can be obtained through calibration.
S5, establishing a matrix conversion relation between any two of the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system. Based on the matrix conversion relations established in steps S3 and S4, a matrix conversion relation can be established between any two of the four coordinate systems. Once the coordinate values of a point in any one coordinate system are known, its coordinate values in the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system can all be calculated through the corresponding matrix conversion relations.
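The pairwise relations of steps S3 and S4 can be chained into any-pair relations because rigid transforms compose by matrix multiplication and invert cheaply. A pure-Python sketch with 4x4 homogeneous matrices (helper names are illustrative):

```python
def matmul4(A, B):
    """Multiply two 4x4 homogeneous transform matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(T):
    """Invert a rigid 4x4 transform: R -> R^T, t -> -R^T t."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]
    ti = [-sum(Rt[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [Rt[0] + [ti[0]], Rt[1] + [ti[1]], Rt[2] + [ti[2]],
            [0.0, 0.0, 0.0, 1.0]]

# Example of chaining (names assumed): if T_pos_from_ct maps CT coordinates
# to positioner coordinates and T_pat_from_pos maps positioner coordinates
# to patient coordinates, then the CT-to-patient relation is
#   T_pat_from_ct = matmul4(T_pat_from_pos, T_pos_from_ct)
# and the reverse direction is invert_rigid(T_pat_from_ct).
```

With composition and inversion available, any of the four coordinate systems can be reached from any other through at most a few multiplications.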
S6, acquiring real-time infrared light imaging data of the patient side reference plate and the mobile phone side reference plate, and, according to the matrix conversion relations among the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system, obtaining real-time pose data of the patient from the real-time infrared light imaging data of the patient side reference plate and real-time pose data of the mobile phone from the real-time infrared light imaging data of the mobile phone side reference plate. In step S6, during the dental implant surgery, the real-time infrared light imaging data of the patient side reference plate and of the mobile phone side reference plate are acquired through the optical positioning instrument. From these, the real-time coordinate values of the two reference plates relative to the optical positioning instrument coordinate system can be acquired, and the real-time poses of the two reference plates can then be resolved through the matrix conversion relations, yielding the real-time poses of the patient and of the planting mobile phone.
S7, analyzing the real-time pose data of the patient and the real-time pose data of the mobile phone to obtain actual planting point position data, actual planting angle data and actual planting depth data. In step S7, the real-time relative position between the patient and the planting mobile phone is obtained from the real-time pose data of the patient and of the mobile phone. Based on this real-time relative position, the actual planting point position data, actual planting angle data and actual planting depth data at any moment can be known, namely where the planting mobile phone is drilling, in which direction and to what depth.
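The relative-pose analysis of step S7 can be illustrated as follows: expressing the drill tip in the patient side coordinate system gives the actual planting point. A pure-Python sketch, where the 4x4 matrix layout and the function names are assumptions:

```python
def apply(T, p):
    """Apply a rigid 4x4 homogeneous transform to a 3D point."""
    return [sum(T[i][j] * p[j] for j in range(3)) + T[i][3] for i in range(3)]

def invert_rigid(T):
    """Invert a rigid transform: R -> R^T, t -> -R^T t."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]
    ti = [-sum(Rt[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [Rt[0] + [ti[0]], Rt[1] + [ti[1]], Rt[2] + [ti[2]],
            [0.0, 0.0, 0.0, 1.0]]

def actual_planting_point(T_patient, T_phone, tip_local):
    """Drill-tip position expressed in the patient side coordinate system:
    p_patient = inv(T_patient) * T_phone * tip_local, where both poses are
    given in the optical positioning instrument coordinate system."""
    return apply(invert_rigid(T_patient), apply(T_phone, tip_local))
```

The actual planting angle follows the same pattern with the drill-axis direction vector, and the actual planting depth is the advance of the tip along that axis.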
S8, comparing in sequence: the actual planting point position data with the target planting point position data to obtain a point position data deviation, the actual planting angle data with the target planting angle data to obtain an angle data deviation, and the actual planting depth data with the target planting depth data to obtain a depth data deviation. Step S8 is one of the cores of the technical scheme. The target planting point position, angle and depth data were acquired in step S2, and the actual planting point position, angle and depth data were acquired in step S7; step S8 compares these two groups of data. Through the comparison, it is determined first whether the drilling position of the planting mobile phone is correct, then whether its drilling direction is correct, and finally whether its drilling depth is correct; the comparison and analysis proceed step by step during the operation so as to give accurate intraoperative guidance.
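The step-by-step comparison of step S8 can be sketched as a gated check: point position first, then angle, then depth, each against a tolerance. The tolerance values and data layout below are placeholders, not figures from the patent:

```python
import math

def compare_sequentially(actual, target, tol):
    """Compare (point, axis, depth) in order and stop at the first
    out-of-tolerance stage, so the guidance stays step-by-step.
    actual/target: dicts with 'point' (xyz), 'axis' (unit xyz), 'depth'."""
    # 1. point position deviation (Euclidean distance, mm)
    dp = math.dist(actual['point'], target['point'])
    if dp > tol['point']:
        return ('point', dp)
    # 2. angle deviation (degrees between the two drilling axes)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in
                                 zip(actual['axis'], target['axis']))))
    da = math.degrees(math.acos(dot))
    if da > tol['angle']:
        return ('angle', da)
    # 3. depth deviation (mm)
    dd = abs(actual['depth'] - target['depth'])
    if dd > tol['depth']:
        return ('depth', dd)
    return ('ok', 0.0)
```

Returning the first failing stage matches the order described above: position is corrected before direction, and direction before depth.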
S9, transmitting the point position data deviation, the angle data deviation and the depth data deviation to the AR display terminal, and displaying the point position navigation parameter, the angle navigation parameter and the depth navigation parameter through the AR display terminal. The point position navigation parameter, the angle navigation parameter and the depth navigation parameter can be understood as the point position data deviation, the angle data deviation and the depth data deviation respectively, i.e. they intuitively show the doctor how large the current deviation is and how to overcome it. The deviations are transmitted in step S9 over the communication data connection established between the data processing terminal and the AR display terminal in step S1; the doctor then only needs to adjust the planting point and the planting direction and to control the planting depth in sequence according to the comparison results of step S8 to complete the drilling operation under navigation.
Based on the above technical scheme, during the operation, when the doctor needs to observe the patient's oral cavity, the eye focus is adjusted into the oral cavity; when the doctor needs to check the navigation operation details, the eye focus is adjusted onto the lens of the AR glasses, and the line of sight can remain unchanged throughout. By acquiring the target planting data and the actual planting data in real time, comparing them to obtain the point position, angle and depth data deviations, and displaying these deviations intraoperatively through the AR display terminal, the doctor only needs to switch the eye focus between the patient's oral cavity and the spectacle lens to attend to both the oral cavity and the navigation operation details. This greatly improves the convenience of the implant operation and ensures that the dental implant treatment achieves the ideal effect.
Preferably, in order to better establish a communication data connection between the data processing terminal and the AR display terminal, the step S1 specifically includes:
s101, respectively controlling the data processing terminal and the AR display terminal to access to the same local network. In step S101, the control data processing terminal and the AR display terminal are connected to the same local WiFi network, so that the data processing terminal and the AR display terminal are connected through the WiFi network.
S102, acquiring IP address information of the AR display terminal. In step S102, the IP address information of the AR display terminal in the WiFi network is obtained.
S103, the IP address information of the AR display terminal is sent to the data processing terminal, and the data processing terminal is controlled to establish communication data connection with the AR display terminal through the IP address information of the AR display terminal. In step S103, based on the IP address information of the AR display terminal in the WiFi network, the data processing terminal is controlled to establish a communication data connection with the AR display terminal, so far, the communication data connection is established, and the data processing terminal can send real-time data to the AR display terminal through the WiFi network.
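As an illustrative sketch of steps S101–S103, once both terminals are on the same local network and the AR display terminal's IP address is known, the data processing terminal can open a TCP connection to it. The port number and helper name below are assumptions for illustration, not details from the source.

```python
import socket

# Assumed listening port on the AR display terminal (illustrative, not
# specified in the source).
AR_TERMINAL_PORT = 9000

def connect_to_ar_terminal(ar_ip: str, port: int = AR_TERMINAL_PORT,
                           timeout_s: float = 3.0) -> socket.socket:
    """Open a TCP connection to the AR display terminal at the given IP.

    socket.create_connection resolves the address and performs the TCP
    handshake, raising an exception on failure or timeout, so the caller
    knows immediately whether the communication data connection succeeded.
    """
    return socket.create_connection((ar_ip, port), timeout=timeout_s)
```

In this sketch the data processing terminal acts as the client, which matches step S103 sending the AR terminal's IP address to the data processing terminal.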
S104, setting a delay test IP address: setting the IP address information of the AR display terminal as the target IP address at the data processing terminal, and setting the IP address information of the data processing terminal as the target IP address at the AR display terminal. Starting from step S104, a delay test is performed on the communication data connection between the data processing terminal and the AR display terminal. In step S104, on the one hand, the IP address information of the AR display terminal is set as the target IP address at the data processing terminal, and on the other hand, the IP address information of the data processing terminal is set as the target IP address at the AR display terminal.
S105, sending the delay test data packet, controlling the data processing terminal to send the delay test data packet to the AR display terminal, and controlling the AR display terminal to send the delay test data packet to the data processing terminal. In step S105, on the one hand, the data processing terminal is controlled to send a delay test data packet to the AR display terminal, and on the other hand, the AR display terminal is controlled to send a delay test data packet to the data processing terminal.
S106, receiving the delay test data packet, controlling the data processing terminal to receive the delay test data packet from the AR display terminal, and controlling the AR display terminal to receive the delay test data packet from the data processing terminal. In step S106, correspondingly, the data processing terminal is controlled to receive the delay test data packet from the AR display terminal on the one hand, and the AR display terminal is controlled to receive the delay test data packet from the data processing terminal on the other hand.
S107, calculating the network delay from the delay test data packet sending time and the delay test data packet receiving time, and judging whether the network transmission speed is qualified based on a preset network delay threshold value. In step S107, both the data processing terminal and the AR display terminal can calculate the network delay from the sending time and the receiving time of the delay test data packet. If the network delay is too large, the connection is judged unqualified and needs to be re-established; further operation can proceed only after the network delay test passes.
Based on the above technical scheme, the communication data connection is established between the data processing terminal and the AR display terminal, and the delay test guarantees the data transmission quality and speed and keeps the channel unobstructed, so that the navigation operation details are displayed on the AR display terminal quickly and in a timely manner, ensuring an accurate implant operation.
In addition to establishing the communication data connection through a WiFi network, in the present technical scheme the communication data connection between the data processing terminal and the AR display terminal can also be established through Bluetooth. Bluetooth is enabled and paired on the data processing terminal and the AR display terminal respectively, and the delay test is performed after pairing is completed, so that data can be transmitted through Bluetooth. This provides more possibilities for equipment configuration and thus improves the applicability of the technical scheme.
Preferably, in order to better register the CT, the patient side reference plate and the optical positioner, and obtain the matrix conversion relation among the CT coordinate system, the patient side coordinate system and the optical positioner coordinate system, the step S3 specifically includes the following steps:
s301, acquiring coordinate values [ X1, Y1, Z1] of the registration points on the patient side reference plate in the three-dimensional model through the design model. In step S301, a three-dimensional model of the patient-side reference plate is formed at the beginning of its design, and coordinate values [ X1, Y1, Z1] of the alignment points in the three-dimensional model are obtained based on the design model.
S302, coordinate values [ X2, Y2, Z2] of the registration points on the patient side reference plate in the CT image are obtained through CT scanning. In step S302, the patient side reference plate is scanned by the CT apparatus, a CT image of the patient side reference plate may be acquired after scanning, and likewise coordinate values [ X2, Y2, Z2] of the registration points in the CT image may be acquired.
S303, receiving infrared light emitted by the patient side reference plate through the optical positioner to acquire the coordinate values [ X3, Y3, Z3] of the registration point on the patient side reference plate in the optical positioning image. In step S303, the infrared light emitted by the optical positioner is reflected by the patient side reference plate and received again by the optical positioner; this reflected infrared light serves as real-time infrared light imaging data, based on which the coordinate values [ X3, Y3, Z3] of the registration point on the patient side reference plate in the optical positioning image can be obtained. At this point, the coordinate values [ X1, Y1, Z1] of the registration point on the patient side reference plate in the three-dimensional model, the coordinate values [ X2, Y2, Z2] in the CT image, and the coordinate values [ X3, Y3, Z3] in the optical positioning image have all been successfully obtained.
S304, aligning the same registration point in the three-dimensional model and the CT image, and calculating the matrix conversion relation between the patient side coordinate system and the CT coordinate system based on the coordinate values [ X1, Y1, Z1] of the registration point in the three-dimensional model and the coordinate values [ X2, Y2, Z2] in the CT image. In step S304, the matrix conversion relation between the patient side coordinate system and the CT coordinate system is calculated from the coordinate values of the same registration point in the two coordinate systems.
S305, aligning the same registration point in the optical positioning image and the CT image, and calculating the matrix conversion relation between the CT coordinate system and the optical positioner coordinate system based on the coordinate values [ X2, Y2, Z2] of the registration point in the CT image and the coordinate values [ X3, Y3, Z3] in the optical positioning image. In step S305, the matrix conversion relation between the CT coordinate system and the optical positioner coordinate system is calculated from the coordinate values of the same registration point in the two coordinate systems.
Based on the above technology, after coordinate values of the registration points on the patient side reference plate in the three-dimensional model, the CT image and the optical positioning image are respectively obtained, the matrix conversion relation between the patient side coordinate system and the CT coordinate system and the matrix conversion relation between the CT coordinate system and the optical positioning instrument coordinate system can be obtained through simple conversion analysis.
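As a hedged illustration of that conversion analysis, one standard way (not specified in the source) to compute the matrix conversion relation between two coordinate systems from the same registration points is the SVD-based Kabsch/Umeyama least-squares fit, which needs at least three non-collinear points:

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Least-squares rigid transform mapping src points onto dst points.

    src, dst: (N, 3) arrays giving the same registration points expressed
    in two coordinate systems (e.g. patient-side and CT). Returns a 4x4
    homogeneous matrix T such that dst ~= R @ src + t.
    """
    src_c = src - src.mean(axis=0)            # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T
```

The same routine serves steps S304 and S305: feed it the [X1, Y1, Z1]/[X2, Y2, Z2] pairs for patient-side-to-CT, or the [X2, Y2, Z2]/[X3, Y3, Z3] pairs for CT-to-optical-positioner.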
Preferably, in order to better calibrate the patient side reference plate and the mobile phone side reference plate, a matrix conversion relationship between the patient side coordinate system and the mobile phone side coordinate system is obtained, and the step S4 specifically includes:
s401, controlling the patient side reference plate and the mobile phone side reference plate to be in different pose states respectively. In step S401, it is necessary to move the patient side reference plate and the cell phone side reference plate, respectively, so that they are in different pose states, respectively.
S402, acquiring image data of a patient side reference plate in each pose state, and analyzing to obtain two-dimensional coordinates of an image of a calibration point on the patient side reference plate. Step S402 may be understood as photographing the patient side reference plate in a fixed pose state, acquiring image data of the patient side reference plate, and further analyzing two-dimensional coordinates of an image of a calibration point on the patient side reference plate in the image. And the same operation is carried out on each fixed pose state, so that the two-dimensional image coordinates of the marked point on the patient side reference plate in each pose state can be obtained.
S403, matching the image data of the patient side reference plate in a plurality of pose states based on the calibration points on the patient side reference plate, and obtaining the spatial three-dimensional coordinates of the calibration points on the patient side reference plate by combining the camera parameters of the optical positioner in the analysis. In step S403, the camera parameters of the optical positioner are known; the optical positioner collects image data of the patient side reference plate in a plurality of pose states, and the spatial three-dimensional coordinates of the calibration points on the patient side reference plate can then be analyzed from these images based on the camera parameters. It should be noted that this is a conventional use of the optical positioner and is prior art well known to those skilled in the art; the innovation of the present technical scheme does not lie here.
S404, analyzing to obtain pose data of the patient side reference plate according to the two-dimensional coordinates and the three-dimensional coordinates of the image of the marked point on the patient side reference plate. The two-dimensional coordinates of the image of the calibration point are known in step S402, the three-dimensional coordinates of the space of the calibration point are known in step S403, and the pose data of the patient side reference plate can be obtained by analysis according to the image processing technology based on the two-dimensional coordinates and the three-dimensional coordinates of the image of the calibration point.
S405, based on the pose data of the patient side reference plate, selecting a specific calibration identification point on the patient side reference plate as the origin and a specific direction as a coordinate axis, and establishing the patient side coordinate system. Step S405 can be understood as the process of customizing a coordinate system: a specific calibration identification point is selected as the origin according to the structural features of the patient side reference plate, and a specific direction is selected as a coordinate axis based on that origin; once the origin and the coordinate axes are determined, the corresponding patient side coordinate system can be established.
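As a minimal sketch of the coordinate-system customization in step S405, assuming three non-collinear calibration identification points are available, a right-handed frame can be built with one point as the origin and the direction toward a second point as the X axis. The construction below is an illustration under those assumptions, not the patent's prescribed method:

```python
import numpy as np

def build_frame(origin: np.ndarray, x_ref: np.ndarray,
                plane_ref: np.ndarray) -> np.ndarray:
    """Build a right-handed coordinate frame from three calibration points.

    origin: calibration identification point chosen as the frame origin.
    x_ref: a second point defining the +X direction.
    plane_ref: a third point fixing the XY plane (must not be collinear).
    Returns a 4x4 matrix mapping frame coordinates to world coordinates.
    """
    x = x_ref - origin
    x = x / np.linalg.norm(x)                 # unit X axis
    z = np.cross(x, plane_ref - origin)       # normal to the point plane
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)                        # completes the orthonormal triad
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2] = x, y, z    # rotation columns
    T[:3, 3] = origin                         # translation
    return T
```

The same construction applies verbatim to the mobile phone side coordinate system of step S409.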
S406, acquiring image data of the mobile phone side reference plate in each pose state, and analyzing to obtain the image two-dimensional coordinates of the calibration points on the mobile phone side reference plate. Step S406 is similar to step S402, except that step S402 analyzes the image two-dimensional coordinates of the calibration points on the patient side reference plate from the image data of the patient side reference plate, whereas step S406 obtains the image two-dimensional coordinates of the calibration points on the mobile phone side reference plate from the image data of the mobile phone side reference plate. Similarly, steps S407 to S409 correspond to steps S403 to S405 respectively, and are not explained again here.
S407, matching the image data of the mobile phone side reference plate under a plurality of pose states based on the calibration points on the mobile phone side reference plate, and obtaining the space three-dimensional coordinates of the calibration points on the mobile phone side reference plate by combining with the camera parameter analysis of the optical positioner.
S408, analyzing to obtain pose data of the mobile phone side reference plate according to the two-dimensional coordinates and the three-dimensional coordinates of the image of the marked point on the mobile phone side reference plate.
S409, based on pose data of a mobile phone side reference plate, selecting a specific calibration identification point on the mobile phone side reference plate as an origin and a specific direction as a coordinate axis, and establishing a mobile phone side coordinate system.
S410, analyzing to obtain the matrix conversion relation between the patient side coordinate system and the mobile phone side coordinate system. In step S410, it is only necessary to acquire the spatial three-dimensional coordinates of the same calibration identification points in both the patient side coordinate system and the mobile phone side coordinate system; the matrix conversion relation between the two coordinate systems can then be obtained by calculation.
Based on the above technology, after the patient side coordinate system and the mobile phone side coordinate system are respectively established, the patient side reference plate and the mobile phone side reference plate are calibrated through the spatial three-dimensional coordinates of the calibration identification points in the two coordinate systems, and the matrix conversion relation between the patient side coordinate system and the mobile phone side coordinate system is obtained.
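Once the patient side and mobile phone side frames are each known relative to the optical positioner, their mutual matrix conversion relation can also be obtained by composing homogeneous transforms, a standard alternative to refitting from shared points (the frame names below are assumptions for illustration):

```python
import numpy as np

def invert_rigid(T: np.ndarray) -> np.ndarray:
    """Invert a 4x4 rigid (rotation + translation) transform analytically."""
    Ti = np.eye(4)
    Ti[:3, :3] = T[:3, :3].T                  # R^-1 = R^T for rotations
    Ti[:3, 3] = -T[:3, :3].T @ T[:3, 3]
    return Ti

def chain_patient_to_phone(T_opt_patient: np.ndarray,
                           T_opt_phone: np.ndarray) -> np.ndarray:
    """Compose the patient-to-phone relation through the optical-locator frame.

    T_opt_patient maps patient-side coordinates into the optical-positioner
    frame; T_opt_phone does the same for the mobile phone side frame.
    """
    return invert_rigid(T_opt_phone) @ T_opt_patient
```

The same chaining pattern yields the matrix conversion relation between any two of the CT, optical positioner, patient side and mobile phone side coordinate systems, as required by step S5.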
Preferably, the step S8 specifically includes:
s801, obtaining point position data deviation by comparing actual planting point position data with target planting point position data until the point position data deviation is zero;
s802, obtaining angle data deviation by comparing actual planting angle data with target planting angle data until the angle data deviation is zero;
s803, obtaining depth data deviation by comparing the actual planting depth data with the target planting depth data until the depth data deviation is zero.
The focus of step S8 is the sequential comparison. First, the point position data deviation is obtained by comparing the actual planting point position data with the target planting point position data, ensuring that the planting mobile phone drills at the correct position; second, the angle data deviation is obtained by comparing the actual planting angle data with the target planting angle data, ensuring that the planting mobile phone drills at the correct angle; finally, the depth data deviation is obtained by comparing the actual planting depth data with the target planting depth data, with the drilling depth monitored in real time to ensure a reasonable drilling depth for the planting mobile phone.
Based on the above technical scheme, and in accordance with the workflow of a dental implant operation, the drilling point position is determined first, the angle is adjusted after the point position is determined, the drilling depth is monitored in real time after the angle is confirmed correct, and the operation is stopped immediately once the required depth is reached, thereby ensuring that the drilling point position, angle and depth all meet the requirements and achieving a better treatment effect.
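The gated point-then-angle-then-depth comparison of steps S801–S803 can be sketched as follows; the tolerance values and data layout are illustrative assumptions, since the source simply drives each deviation to zero:

```python
import math

def next_guidance(actual: dict, target: dict,
                  tol=(0.1, 0.5, 0.1)) -> tuple:
    """Return the parameter the doctor should correct next, and its deviation.

    actual/target: dicts with 'point' ((x, y, z), mm), 'angle' (degrees)
    and 'depth' (mm). Stages are gated in S801-S803 order: the angle is
    only evaluated once the point deviation is within tolerance, and the
    depth only once the angle is. tol holds the assumed per-stage
    tolerances (point mm, angle deg, depth mm).
    """
    point_dev = math.dist(actual["point"], target["point"])
    if point_dev > tol[0]:
        return ("adjust point", point_dev)
    angle_dev = abs(actual["angle"] - target["angle"])
    if angle_dev > tol[1]:
        return ("adjust angle", angle_dev)
    depth_dev = abs(actual["depth"] - target["depth"])
    if depth_dev > tol[2]:
        return ("adjust depth", depth_dev)
    return ("depth reached - stop", 0.0)
```

Calling this on each frame of real-time pose data yields exactly one instruction at a time, matching the document's stepwise intraoperative guidance.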
Correspondingly, as shown in fig. 2, in order to solve the above-mentioned problem, the present technical solution proposes an AR navigation data processing system, which includes a communication connection module 1, a solution acquisition module 2, a registration module 3, a calibration module 4, a matrix operation module 5, a data acquisition module 6, a data analysis module 7, a comparison module 8, and a data transmission module 9.
Wherein: the communication connection module is used for establishing communication data connection between the data processing terminal and the AR display terminal, and the AR display terminal is arranged on the spectacle lens; the scheme acquisition module is used for acquiring preset target planting scheme data, wherein the target planting scheme data comprises target planting point position data, target planting angle data and target planting depth data; the registration module is used for registering the CT, the patient side reference plate and the optical positioning instrument and acquiring a matrix conversion relation among a CT coordinate system, a patient side coordinate system and the optical positioning instrument coordinate system; the calibration module is used for calibrating the patient side reference plate and the mobile phone side reference plate to acquire a matrix conversion relation between a patient side coordinate system and a mobile phone side coordinate system; the matrix operation module is used for establishing a matrix conversion relation among any two of the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system; the data acquisition module is used for acquiring real-time infrared light imaging data of the patient side reference plate and the mobile phone side reference plate, acquiring real-time pose data of the patient by combining the real-time infrared light imaging data of the patient side reference plate and acquiring real-time pose data of the mobile phone by combining the real-time infrared light imaging data of the mobile phone side reference plate according to the matrix conversion relation among the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system; the data analysis module is used for analyzing the
real-time pose data of the patient and the real-time pose data of the mobile phone to obtain actual planting point position data, actual planting angle data and actual planting depth data; the comparison module is used for sequentially comparing the actual planting point position data with the target planting point position data to obtain point position data deviation, comparing the actual planting angle data with the target planting angle data to obtain angle data deviation, and comparing the actual planting depth data with the target planting depth data to obtain depth data deviation; the data transmission module is used for transmitting the point position data deviation, the angle data deviation and the depth data deviation to the AR display terminal, and displaying the point position navigation parameters, the angle navigation parameters and the depth navigation parameters through the AR display terminal.
Preferably, the communication connection module comprises a network access unit, an IP acquisition unit, a communication connection unit, a delay address setting unit, a delay data transmitting unit, a delay data receiving unit and a delay calculating unit.
Specifically: the network access unit is used for respectively controlling the data processing terminal and the AR display terminal to access the same local network; the IP acquisition unit is used for acquiring the IP address information of the AR display terminal; the communication connection unit is used for sending the IP address information of the AR display terminal to the data processing terminal, and controlling the data processing terminal to establish communication data connection with the AR display terminal through the IP address information of the AR display terminal; the delay address setting unit is used for setting a delay test IP address, setting the IP address information of the AR display terminal as a target IP address at the data processing terminal, and setting the IP address information of the data processing terminal as the target IP address at the AR display terminal; the delay data transmitting unit is used for transmitting the delay test data packet, controlling the data processing terminal to transmit the delay test data packet to the AR display terminal, and controlling the AR display terminal to transmit the delay test data packet to the data processing terminal; the delay data receiving unit is used for receiving the delay test data packet, controlling the data processing terminal to receive the delay test data packet from the AR display terminal, and controlling the AR display terminal to receive the delay test data packet from the data processing terminal; the delay calculating unit is used for calculating network delay through the delay test data packet sending time and the delay test data packet receiving time, and judging whether the network transmission speed is qualified or not based on a preset network delay threshold value.
Preferably, the registration module includes a first coordinate value acquisition unit, a second coordinate value acquisition unit, a third coordinate value acquisition unit, a first matrix operation unit, and a second matrix operation unit. Specifically: a first coordinate value acquisition unit for acquiring coordinate values [ X1, Y1, Z1] of the registration point on the patient-side reference plate in the three-dimensional model by the design model; a second coordinate value acquisition unit for acquiring coordinate values [ X2, Y2, Z2] of the registration point on the patient side reference plate in the CT image by CT scanning; the third coordinate value acquisition unit is used for receiving infrared light emitted by the patient side reference plate through the optical positioner to acquire coordinate values [ X3, Y3, Z3] of the alignment point on the patient side reference plate in the optical positioning image; the first matrix operation unit is used for aligning the three-dimensional model with the same registration point in the CT image, and calculating to obtain a matrix conversion relation between a patient side coordinate system and a CT coordinate system based on coordinate values [ X1, Y1, Z1] of the registration point in the three-dimensional model and coordinate values [ X2, Y2, Z2] in the CT image; and the second matrix operation unit is used for aligning the same registration point in the optical positioning image and the CT image, and calculating to obtain a matrix conversion relation between the CT coordinate system and the coordinate system of the optical positioning instrument based on the coordinate values [ X2, Y2, Z2] of the registration point in the CT image and the coordinate values [ X3, Y3, Z3] in the optical positioning image.
Preferably, the calibration module comprises a pose adjusting unit, a plane coordinate analyzing unit, a space coordinate analyzing unit, a first pose analyzing unit, a first coordinate system establishing unit, a real-time two-dimensional coordinate acquiring unit, a real-time three-dimensional coordinate acquiring unit, a second pose analyzing unit, a second coordinate system establishing unit and a calibration unit.
Specifically: the pose adjusting unit is used for controlling the patient side reference plate and the mobile phone side reference plate to be in different pose states respectively; the plane coordinate analysis unit is used for acquiring image data of the patient side reference plate in each pose state and analyzing to obtain two-dimensional coordinates of an image of a calibration point on the patient side reference plate; the space coordinate analysis unit is used for matching the image data of the patient side reference plate in a plurality of pose states based on the calibration points on the patient side reference plate, and combining the camera parameter analysis of the optical positioning instrument to obtain the space three-dimensional coordinates of the calibration points on the patient side reference plate; the first pose analysis unit is used for analyzing and obtaining pose data of the patient side reference plate according to the two-dimensional coordinates and the three-dimensional coordinates of the image of the marked point on the patient side reference plate; the first coordinate system establishing unit is used for establishing a patient side coordinate system by selecting a specific calibration identification point on the patient side reference plate as an origin and selecting a specific direction as a coordinate axis based on pose data of the patient side reference plate; the real-time two-dimensional coordinate acquisition unit is used for acquiring image data of the mobile phone side reference plate in each pose state and analyzing to obtain two-dimensional coordinates of an image of a calibration point on the mobile phone side reference plate; the real-time three-dimensional coordinate acquisition unit is used for matching the image data of the mobile phone side reference plate in a plurality of pose states based on the calibration points on the mobile phone side reference plate, and obtaining the spatial three-dimensional coordinates of the 
calibration points on the mobile phone side reference plate by combining the camera parameter analysis of the optical positioning instrument; the second pose analysis unit is used for analyzing pose data of the mobile phone side reference plate according to the two-dimensional coordinates and the three-dimensional coordinates of the image of the marked point on the mobile phone side reference plate; the second coordinate system establishing unit is used for establishing a mobile phone side coordinate system by selecting a specific calibration identification point on the mobile phone side reference plate as an origin and selecting a specific direction as a coordinate axis based on pose data of the mobile phone side reference plate; and the calibration unit is used for analyzing and obtaining a matrix conversion relation between the patient side coordinate system and the mobile phone side coordinate system.
Preferably, the comparison module comprises a first deviation comparison unit, a second deviation comparison unit and a third deviation comparison unit.
Specifically, the first deviation comparison unit is used for obtaining point location data deviation by comparing actual planting point location data with target planting point location data until the point location data deviation is zero; the second deviation comparison unit is used for obtaining angle data deviation by comparing the actual planting angle data with the target planting angle data until the angle data deviation is zero; and the third deviation comparison unit is used for obtaining the depth data deviation by comparing the actual planting depth data with the target planting depth data until the depth data deviation is zero.
Correspondingly, the present technical scheme further proposes a storage medium storing a computer program, the computer program comprising program instructions which, when executed by a processor, perform the AR navigation data processing method as described above.
It is to be understood that the invention is not limited in its application to the examples described above, but is capable of modification and variation in light of the above teachings by those skilled in the art, and that all such modifications and variations are intended to be included within the scope of the appended claims.

Claims (10)

1. An AR navigation data processing method, comprising the steps of:
s1, establishing communication data connection between a data processing terminal and an AR display terminal, wherein the AR display terminal is arranged on an eyeglass;
s2, acquiring preset target planting scheme data, wherein the target planting scheme data comprise target planting point position data, target planting angle data and target planting depth data;
s3, registering the CT, the patient side reference plate and the optical positioning instrument to obtain a matrix conversion relation among a CT coordinate system, a patient side coordinate system and an optical positioning instrument coordinate system;
s4, calibrating the patient side reference plate and the mobile phone side reference plate to obtain a matrix conversion relation between a patient side coordinate system and a mobile phone side coordinate system;
s5, establishing a matrix conversion relation among any two of a CT coordinate system, an optical positioning instrument coordinate system, a patient side coordinate system and a mobile phone side coordinate system;
s6, acquiring real-time infrared light imaging data of the patient side reference plate and the mobile phone side reference plate, acquiring real-time pose data of a patient by combining the real-time infrared light imaging data of the patient side reference plate and acquiring the real-time pose data of a mobile phone by combining the real-time infrared light imaging data of the mobile phone side reference plate according to a matrix conversion relation among a CT coordinate system, an optical positioning instrument coordinate system, a patient side coordinate system and a mobile phone side coordinate system;
S7, analyzing based on the real-time pose data of the patient and the real-time pose data of the mobile phone to obtain actual planting point position data, actual planting angle data and actual planting depth data;
s8, sequentially comparing actual planting point position data with target planting point position data according to the sequence to obtain point position data deviation, comparing actual planting angle data with target planting angle data to obtain angle data deviation, and comparing actual planting depth data with target planting depth data to obtain depth data deviation;
s9, transmitting the point position data deviation, the angle data deviation and the depth data deviation to an AR display terminal, and displaying the point position navigation parameter, the angle navigation parameter and the depth navigation parameter through the AR display terminal.
2. The AR navigation data processing method according to claim 1, wherein the step S1 specifically includes:
S101, controlling the data processing terminal and the AR display terminal to access the same local network;
S102, acquiring the IP address information of the AR display terminal;
S103, sending the IP address information of the AR display terminal to the data processing terminal, and controlling the data processing terminal to establish a communication data connection with the AR display terminal through the IP address information of the AR display terminal;
S104, setting the delay test IP addresses: setting the IP address information of the AR display terminal as the target IP address at the data processing terminal, and setting the IP address information of the data processing terminal as the target IP address at the AR display terminal;
S105, sending delay test data packets: controlling the data processing terminal to send a delay test data packet to the AR display terminal, and controlling the AR display terminal to send a delay test data packet to the data processing terminal;
S106, receiving the delay test data packets: controlling the data processing terminal to receive the delay test data packet from the AR display terminal, and controlling the AR display terminal to receive the delay test data packet from the data processing terminal;
S107, calculating the network delay from the sending time and the receiving time of each delay test data packet, and judging whether the network transmission speed is qualified based on a preset network delay threshold value.
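The delay test of claim 2 amounts to a round-trip time measurement between the two terminals. A minimal Python sketch under stated assumptions (UDP transport, an echoing peer, and the one-way-delay-is-half-the-RTT heuristic; function names and the 50 ms default threshold are illustrative, not from the patent):

```python
import socket
import time

def measure_rtt(target_ip, port, payload=b"delay-test", timeout=1.0):
    """Send one test packet to an echoing peer and return the round-trip time in seconds."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        start = time.perf_counter()
        sock.sendto(payload, (target_ip, port))
        sock.recvfrom(1024)  # wait for the echoed packet
        return time.perf_counter() - start

def link_is_acceptable(rtt_seconds, threshold_seconds=0.05):
    """Judge the link against a preset delay threshold (one-way delay ~ RTT / 2)."""
    return rtt_seconds / 2.0 <= threshold_seconds
```

In the claimed scheme both terminals send and receive, so each side can run the measurement independently and compare against its own preset threshold.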
3. The AR navigation data processing method according to claim 1, wherein the step S3 specifically includes the steps of:
S301, acquiring, from the design model, coordinate values [X1, Y1, Z1] of a registration point on the patient side reference plate in the three-dimensional model;
S302, acquiring, by CT scanning, coordinate values [X2, Y2, Z2] of the registration point on the patient side reference plate in the CT image;
S303, receiving, through the optical positioning instrument, the infrared light emitted by the patient side reference plate to obtain coordinate values [X3, Y3, Z3] of the registration point on the patient side reference plate in the optical positioning image;
S304, aligning the same registration point in the three-dimensional model and the CT image, and calculating the matrix conversion relation between the patient side coordinate system and the CT coordinate system based on the coordinate values [X1, Y1, Z1] of the registration point in the three-dimensional model and the coordinate values [X2, Y2, Z2] in the CT image;
S305, aligning the same registration point in the optical positioning image and the CT image, and calculating the matrix conversion relation between the CT coordinate system and the optical positioning instrument coordinate system based on the coordinate values [X2, Y2, Z2] of the registration point in the CT image and the coordinate values [X3, Y3, Z3] in the optical positioning image.
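The claim does not name the fitting method used in the alignment steps. One standard way to compute a rigid matrix conversion relation from paired registration points such as [X1, Y1, Z1] and [X2, Y2, Z2] is the SVD-based Kabsch algorithm, sketched here as an illustration (not the patented implementation):

```python
import numpy as np

def rigid_registration(src, dst):
    """Estimate rotation R and translation t such that dst ~= R @ src + t,
    from corresponding 3D registration points (Kabsch/SVD method)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                       # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

At least three non-collinear registration points are needed for a unique solution; in practice more points are used and the residual error is checked before the transform is accepted.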
4. The AR navigation data processing method according to claim 1, wherein the step S4 specifically includes:
S401, controlling the patient side reference plate and the mobile phone side reference plate to take a plurality of different pose states respectively;
S402, acquiring image data of the patient side reference plate in each pose state, and analyzing it to obtain the two-dimensional image coordinates of the calibration points on the patient side reference plate;
S403, matching the image data of the patient side reference plate in the plurality of pose states based on the calibration points on the patient side reference plate, and obtaining the spatial three-dimensional coordinates of the calibration points on the patient side reference plate by combining the camera parameters of the optical positioning instrument;
S404, analyzing the two-dimensional image coordinates and the three-dimensional coordinates of the calibration points on the patient side reference plate to obtain the pose data of the patient side reference plate;
S405, based on the pose data of the patient side reference plate, selecting a specific calibration identification point on the patient side reference plate as the origin and specific directions as the coordinate axes, and establishing the patient side coordinate system;
S406, acquiring image data of the mobile phone side reference plate in each pose state, and analyzing it to obtain the two-dimensional image coordinates of the calibration points on the mobile phone side reference plate;
S407, matching the image data of the mobile phone side reference plate in the plurality of pose states based on the calibration points on the mobile phone side reference plate, and obtaining the spatial three-dimensional coordinates of the calibration points on the mobile phone side reference plate by combining the camera parameters of the optical positioning instrument;
S408, analyzing the two-dimensional image coordinates and the three-dimensional coordinates of the calibration points on the mobile phone side reference plate to obtain the pose data of the mobile phone side reference plate;
S409, based on the pose data of the mobile phone side reference plate, selecting a specific calibration identification point on the mobile phone side reference plate as the origin and specific directions as the coordinate axes, and establishing the mobile phone side coordinate system;
S410, analyzing to obtain the matrix conversion relation between the patient side coordinate system and the mobile phone side coordinate system.
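Establishing a reference-plate coordinate system from calibration identification points, and then relating the two plates, can be sketched as follows. The three-marker construction (one origin point, one point fixing the x-axis, one point fixing the marker plane) and the function names are assumptions for illustration; the patent only specifies that a specific point serves as the origin and specific directions as the axes:

```python
import numpy as np

def build_frame(origin_pt, x_pt, plane_pt):
    """Build an orthonormal reference frame from three marker points:
    origin at origin_pt, x-axis toward x_pt, z-axis normal to the marker plane.
    Returns a 4x4 matrix mapping frame coordinates to world coordinates."""
    origin = np.asarray(origin_pt, float)
    x_axis = np.asarray(x_pt, float) - origin
    x_axis /= np.linalg.norm(x_axis)
    in_plane = np.asarray(plane_pt, float) - origin
    z_axis = np.cross(x_axis, in_plane)
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)          # completes a right-handed frame
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x_axis, y_axis, z_axis, origin
    return T

def relative_transform(T_patient, T_phone):
    """Matrix conversion relation taking mobile-phone-side coordinates
    into patient-side coordinates (both frames given in world coordinates)."""
    return np.linalg.inv(T_patient) @ T_phone
```

Once both frames are expressed in the optical positioning instrument's world coordinates, `relative_transform` gives the patient-side-to-mobile-phone-side relation sought in the final calibration step.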
5. The AR navigation data processing method according to claim 1, wherein the step S8 specifically includes:
S801, repeatedly comparing the actual implant point position data with the target implant point position data to obtain the point position data deviation, until the point position data deviation is zero;
S802, repeatedly comparing the actual implant angle data with the target implant angle data to obtain the angle data deviation, until the angle data deviation is zero;
S803, repeatedly comparing the actual implant depth data with the target implant depth data to obtain the depth data deviation, until the depth data deviation is zero.
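The deviation comparison of claim 5 reduces to simple vector arithmetic. A sketch, assuming the point position is a 3-vector, the implant axis a unit direction vector, and the depth a scalar; the dict field names are hypothetical:

```python
import numpy as np

def implant_deviation(actual, target):
    """Return (point deviation, angle deviation in degrees, depth deviation)
    between actual and target implant parameters, each given as a dict with
    'point' (3-vector), 'axis' (unit direction) and 'depth' (scalar)."""
    point_dev = float(np.linalg.norm(np.subtract(actual["point"], target["point"])))
    cos_a = np.clip(np.dot(actual["axis"], target["axis"]), -1.0, 1.0)
    angle_dev = float(np.degrees(np.arccos(cos_a)))
    depth_dev = float(actual["depth"] - target["depth"])
    return point_dev, angle_dev, depth_dev
```

Each deviation is recomputed on every pose update and forwarded to the AR display terminal; navigation ends for a parameter when its deviation reaches zero (in practice, falls below a tolerance).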
6. An AR navigation data processing system, comprising:
the communication connection module is used for establishing communication data connection between the data processing terminal and the AR display terminal, and the AR display terminal is arranged on the spectacle lens;
the scheme acquisition module is used for acquiring preset target implant scheme data, wherein the target implant scheme data includes target implant point position data, target implant angle data and target implant depth data;
the registration module is used for registering the CT, the patient side reference plate and the optical positioning instrument and acquiring a matrix conversion relation among a CT coordinate system, a patient side coordinate system and the optical positioning instrument coordinate system;
the calibration module is used for calibrating the patient side reference plate and the mobile phone side reference plate to acquire the matrix conversion relation between the patient side coordinate system and the mobile phone side coordinate system;
the matrix operation module is used for establishing a matrix conversion relation between any two of the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system;
the data acquisition module is used for acquiring real-time infrared light imaging data of the patient side reference plate and the mobile phone side reference plate, and, according to the matrix conversion relations among the CT coordinate system, the optical positioning instrument coordinate system, the patient side coordinate system and the mobile phone side coordinate system, acquiring the real-time pose data of the patient from the real-time infrared light imaging data of the patient side reference plate and the real-time pose data of the mobile phone from the real-time infrared light imaging data of the mobile phone side reference plate;
the data analysis module is used for analyzing the real-time pose data of the patient and the real-time pose data of the mobile phone to obtain the actual implant point position data, the actual implant angle data and the actual implant depth data;
the comparison module is used for comparing, in sequence, the actual implant point position data with the target implant point position data to obtain the point position data deviation, the actual implant angle data with the target implant angle data to obtain the angle data deviation, and the actual implant depth data with the target implant depth data to obtain the depth data deviation;
the data transmission module is used for transmitting the point position data deviation, the angle data deviation and the depth data deviation to the AR display terminal, and displaying the point position navigation parameters, the angle navigation parameters and the depth navigation parameters through the AR display terminal.
7. The AR navigation data processing system of claim 6, wherein the communication connection module comprises:
the network access unit is used for respectively controlling the data processing terminal and the AR display terminal to access the same local network;
the IP acquisition unit is used for acquiring the IP address information of the AR display terminal;
the communication connection unit is used for sending the IP address information of the AR display terminal to the data processing terminal, and controlling the data processing terminal to establish communication data connection with the AR display terminal through the IP address information of the AR display terminal;
the delay address setting unit is used for setting a delay test IP address, setting the IP address information of the AR display terminal as a target IP address at the data processing terminal, and setting the IP address information of the data processing terminal as the target IP address at the AR display terminal;
the delay data transmitting unit is used for transmitting the delay test data packet, controlling the data processing terminal to transmit the delay test data packet to the AR display terminal, and controlling the AR display terminal to transmit the delay test data packet to the data processing terminal;
the delay data receiving unit is used for receiving the delay test data packet, controlling the data processing terminal to receive the delay test data packet from the AR display terminal, and controlling the AR display terminal to receive the delay test data packet from the data processing terminal;
the delay calculating unit is used for calculating network delay through the delay test data packet sending time and the delay test data packet receiving time, and judging whether the network transmission speed is qualified or not based on a preset network delay threshold value.
8. The AR navigation data processing system of claim 6, wherein the registration module comprises:
a first coordinate value acquisition unit for acquiring coordinate values [ X1, Y1, Z1] of the registration point on the patient-side reference plate in the three-dimensional model by the design model;
a second coordinate value acquisition unit for acquiring coordinate values [ X2, Y2, Z2] of the registration point on the patient side reference plate in the CT image by CT scanning;
the third coordinate value acquisition unit is used for receiving, through the optical positioning instrument, the infrared light emitted by the patient side reference plate to acquire coordinate values [X3, Y3, Z3] of the registration point on the patient side reference plate in the optical positioning image;
the first matrix operation unit is used for aligning the three-dimensional model with the same registration point in the CT image, and calculating to obtain a matrix conversion relation between a patient side coordinate system and a CT coordinate system based on coordinate values [ X1, Y1, Z1] of the registration point in the three-dimensional model and coordinate values [ X2, Y2, Z2] in the CT image;
the second matrix operation unit is used for aligning the same registration point in the optical positioning image and the CT image, and calculating the matrix conversion relation between the CT coordinate system and the optical positioning instrument coordinate system based on the coordinate values [X2, Y2, Z2] of the registration point in the CT image and the coordinate values [X3, Y3, Z3] in the optical positioning image.
9. The AR navigation data processing system of claim 6, wherein the calibration module comprises:
the pose adjusting unit is used for controlling the patient side reference plate and the mobile phone side reference plate to be in different pose states respectively;
the plane coordinate analysis unit is used for acquiring image data of the patient side reference plate in each pose state and analyzing to obtain two-dimensional coordinates of an image of a calibration point on the patient side reference plate;
the space coordinate analysis unit is used for matching the image data of the patient side reference plate in a plurality of pose states based on the calibration points on the patient side reference plate, and combining the camera parameter analysis of the optical positioning instrument to obtain the space three-dimensional coordinates of the calibration points on the patient side reference plate;
the first pose analysis unit is used for analyzing the two-dimensional image coordinates and the three-dimensional coordinates of the calibration points on the patient side reference plate to obtain the pose data of the patient side reference plate;
the first coordinate system establishing unit is used for establishing the patient side coordinate system by selecting a specific calibration identification point on the patient side reference plate as the origin and specific directions as the coordinate axes, based on the pose data of the patient side reference plate;
the real-time two-dimensional coordinate acquisition unit is used for acquiring image data of the mobile phone side reference plate in each pose state and analyzing to obtain two-dimensional coordinates of an image of a calibration point on the mobile phone side reference plate;
the real-time three-dimensional coordinate acquisition unit is used for matching the image data of the mobile phone side reference plate in a plurality of pose states based on the calibration points on the mobile phone side reference plate, and obtaining the spatial three-dimensional coordinates of the calibration points on the mobile phone side reference plate by combining the camera parameter analysis of the optical positioning instrument;
the second pose analysis unit is used for analyzing the two-dimensional image coordinates and the three-dimensional coordinates of the calibration points on the mobile phone side reference plate to obtain the pose data of the mobile phone side reference plate;
the second coordinate system establishing unit is used for establishing a mobile phone side coordinate system by selecting a specific calibration identification point on the mobile phone side reference plate as an origin and selecting a specific direction as a coordinate axis based on pose data of the mobile phone side reference plate;
and the calibration unit is used for analyzing and obtaining a matrix conversion relation between the patient side coordinate system and the mobile phone side coordinate system.
10. A storage medium storing a computer program comprising program instructions which, when executed by a processor, perform the AR navigation data processing method of any one of claims 1-5.
CN202311333906.0A 2023-10-16 2023-10-16 AR navigation data processing method, system and storage medium Active CN117115401B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311333906.0A CN117115401B (en) 2023-10-16 2023-10-16 AR navigation data processing method, system and storage medium

Publications (2)

Publication Number Publication Date
CN117115401A true CN117115401A (en) 2023-11-24
CN117115401B CN117115401B (en) 2024-02-06

Family

ID=88813085

Country Status (1)

Country Link
CN (1) CN117115401B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010259497A (en) * 2009-04-30 2010-11-18 Osaka Univ Surgery navigation system using retina projection type head-mounted display device and method of superimposing simulation images
JP2015100437A (en) * 2013-11-22 2015-06-04 京セラメディカル株式会社 Navigation system for surgical operation
US20200321099A1 (en) * 2019-04-04 2020-10-08 Centerline Biomedical, Inc. Registration of Spatial Tracking System with Augmented Reality Display
CN114711962A (en) * 2022-04-18 2022-07-08 北京恩维世医疗科技有限公司 Augmented reality operation planning navigation system and method
US20220370147A1 (en) * 2021-05-21 2022-11-24 Stryker European Operations Limited Technique of Providing User Guidance For Obtaining A Registration Between Patient Image Data And A Surgical Tracking System
WO2023014667A1 (en) * 2021-08-02 2023-02-09 Hes Ip Holdings, Llc Augmented reality system for real space navigation and surgical system using the same
CN115824250A (en) * 2022-10-14 2023-03-21 大连海事大学 Intelligent ship augmented reality navigation information display method
CN116747039A (en) * 2023-08-17 2023-09-15 深圳卡尔文科技有限公司 Planting robot pose adjustment method, system and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HAI Siqiong; LIN Yanping; WANG Chengtao; YAN Yan: "Research on Data Communication and Real-time Tracking Methods in Surgical Navigation Systems", Journal of Biomedical Engineering, no. 03 *

Also Published As

Publication number Publication date
CN117115401B (en) 2024-02-06

Similar Documents

Publication Publication Date Title
CN100562285C (en) A kind of digitized synoptophore
KR102039281B1 (en) Oral Endoscopy System and Test Method
US11839437B2 (en) Surgical instrument mounted display system
US20160000514A1 (en) Surgical vision and sensor system
CN110324535A (en) A kind of fixed focal length camera focus adjuster and method of adjustment
WO2020024638A1 (en) Surgical navigation device
JP7242646B2 (en) surgical instrument mounted display system
TWI825891B (en) Augmented reality system for real space navigation and surgical system using the same
EP4274506A1 (en) Registration degradation correction for surgical navigation procedures
EP3984016A1 (en) Systems and methods for superimposing virtual image on real-time image
CN117115401B (en) AR navigation data processing method, system and storage medium
KR20100058031A (en) The system for capturing 2d facial image
Cutolo et al. The role of camera convergence in stereoscopic video see-through augmented reality displays
CN201223382Y (en) Synoptophore
CN113040909A (en) Optical tracking system and method based on near-infrared three-eye stereo vision
US20230310098A1 (en) Surgery assistance device
CN216561196U (en) Dental operation microscope
CN113242710A (en) Method and apparatus for determining a refractive characteristic of an eye of a subject
CN113925441B (en) Imaging method and imaging system based on endoscope
JPWO2019050049A1 (en) Ophthalmic equipment, management methods, and management equipment
CN110638524B (en) Tumor puncture real-time simulation system based on VR glasses
KR20040097477A (en) Head mounted computer interfacing device and method using eye-gaze direction
CN209059467U (en) A kind of operation navigation device
CN113081273A (en) Punching auxiliary system and surgical robot system
CN117379179B (en) External physical point navigation registration method, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant