CN109498156A - Head surgery navigation method based on three-dimensional scanning
- Publication number: CN109498156A
- Application number: CN201710824895.4A
- Authority: CN (China)
- Legal status: Pending (the status listed is an assumption and is not a legal conclusion)
Classifications
- A — HUMAN NECESSITIES
- A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00 — Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20 — Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046 — Tracking techniques
- A61B2034/2055 — Optical tracking systems
Abstract
The present invention relates to the field of medicine, and in particular discloses a head surgery navigation method based on three-dimensional scanning. The method comprises the following steps: 1) performing a preoperative CT scan of the patient's head; 2) installing the correction reference frames, the reference frame and the stereoscopic vision equipment; 3) acquiring three-dimensional models of the patient's head, the reference frame and all correction reference frames with a three-dimensional scanning device and fitting their characteristic points; 4) acquiring the spatial coordinates of the reference frame and all correction reference frames with stereoscopic vision; 5) correcting the visual space error through an error correction model; 6) performing point cloud registration between the patient's head CT model and the head model constructed by the three-dimensional scanning device; 7) the stereoscopic vision equipment acquires the target reference frame coordinate system in real time, and the navigation software displays the position of the instrument in CT space in real time. The method avoids navigation deviation caused by marker offset between the preoperative scan and the operation and improves the accuracy of surgical navigation.
Description
Technical Field
The invention relates to the field of medicine, and particularly discloses a head surgery navigation method based on three-dimensional scanning.
Background
Computer Assisted Surgery (CAS) is an emerging technology that uses the high-speed information processing capability of computers, combined with advanced imaging devices (CT, MRI, etc.) and spatial localization methods, to provide navigational support to surgeons, making surgery safer and more accurate, improving surgical outcomes and shortening rehabilitation. The basic principle of CAS is to measure the position of a surgical instrument relative to the surgical target in real time with an external tracking device and then display this position on a medical image or map, so that the surgeon can clearly see the current position of the instrument and make surgical judgments and decisions.
CT navigation is the earliest developed and technically most mature surgical navigation method. The typical workflow is to acquire a CT image of the surgical area before the operation, establish the relationship between the patient's actual anatomy and the preoperative CT image during the operation, and thereby provide rich two-dimensional and three-dimensional navigation information for the surgeon to plan and operate. The core problem of CT navigation is how to establish the coordinate transformation between the CT image and the actual object, a process called registration.
In existing CT navigation systems, registration methods fall into two categories: the first uses marker points that are visible under CT; the second uses a point cloud of a characteristic region of human tissue. The first method requires more preparatory work before the operation, and the second is more complicated to carry out during the operation.
Disclosure of Invention
In view of the above problems, an object of the present invention is to provide a head surgery navigation method based on three-dimensional scanning. It is a non-invasive, high-precision positioning and navigation method that uses a three-dimensional scanning device to acquire, without contact, a point cloud of the head skin surface for CT registration, and uses the spatial model relationships constructed by the three-dimensional scanning device to correct the spatial model relationships constructed by stereoscopic vision.
In order to achieve the above purpose, the head surgery navigation method based on three-dimensional scanning provided by the invention comprises the following steps:
S1, performing a preoperative CT scan of the patient's head;
S2, after the patient is placed in the surgical position, fixedly mounting one or more correction reference frames near the patient's head, mounting a reference frame on the patient's head, mounting stereoscopic vision equipment, and mounting a target reference frame on a surgical instrument;
S3, acquiring three-dimensional models of the patient's head, the reference frame and all correction reference frames with a three-dimensional scanning device, fitting characteristic points of the reference frame and all correction reference frames in the three-dimensional model, constructing the spatial relationship among the patient's head, all correction reference frames and the reference frame in the three-dimensional model coordinate system, and inputting the model relationship matrix into the navigation software;
S4, acquiring the spatial coordinates of the reference frame and all correction reference frames with the stereoscopic vision equipment, and establishing the spatial relationship between all correction reference frames and the reference frame in the visual coordinate system;
S5, correcting the visual space error through the error correction model, and constructing the navigation space under stereoscopic vision;
S6, performing point cloud registration between the patient's head CT model and the head model constructed by the three-dimensional scanning device, and unifying the spatial relationship between the patient's head CT model and the reference frame under the visual space coordinate system;
S7, the stereoscopic vision equipment acquires the spatial coordinates of the target reference frame and the reference frame in real time, and the navigation software displays the position of the instrument in CT space in real time (a sketch of the resulting coordinate-transform chain is given after these steps).
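Taken together, steps S3-S7 amount to chaining rigid transforms between the scan-model, reference-frame, vision and CT coordinate systems. The following minimal Python sketch (hypothetical names and 4x4 homogeneous NumPy matrices, an illustration of the idea rather than the patented implementation) shows how an instrument tip could be mapped into CT space once the individual transforms have been estimated:

```python
import numpy as np

def compose(*transforms):
    """Compose 4x4 homogeneous transforms, applied right to left."""
    out = np.eye(4)
    for t in transforms:
        out = out @ t
    return out

def to_homogeneous(p):
    """3-vector -> homogeneous 4-vector."""
    return np.append(np.asarray(p, dtype=float), 1.0)

# Hypothetical 4x4 transforms (T_a_b maps points from space b to space a):
#   T_ct_scan    : scan-model space      -> CT space              (step S6 registration)
#   T_scan_ref   : reference-frame space -> scan-model space      (step S3 feature fitting)
#   T_ref_vision : vision space          -> reference-frame space (steps S4-S5, corrected)
#   T_vision_tool: tool (target frame)   -> vision space          (step S7 real-time tracking)
def instrument_tip_in_ct(tip_in_tool,
                         T_ct_scan, T_scan_ref, T_ref_vision, T_vision_tool):
    T_ct_tool = compose(T_ct_scan, T_scan_ref, T_ref_vision, T_vision_tool)
    return (T_ct_tool @ to_homogeneous(tip_in_tool))[:3]
```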
According to the invention, the correction reference frames can be removed after completion of step S4.
According to the invention, the reference frame is maintained in a relatively fixed positional relationship with the head of the patient and is located at a position that does not interfere with the surgical procedure.
According to the invention, the reference frame, the correction reference frames and the target reference frame have three-dimensional features, including but not limited to corner points, spherical surfaces and cones, and all bear marks that can be identified by stereoscopic vision.
According to the invention, the three-dimensional scanning equipment can scan the space outline of the object and acquire the space coordinates of the surface of the object.
According to the invention, the stereoscopic vision device acquires three-dimensional geometric information of the object based on the parallax principle.
According to the invention, in step S2 the reference frame and the correction reference frames are within the spatial range of the stereoscopic vision equipment.
According to the invention, step S6 takes the skin surface in the CT data and the three-dimensionally scanned skin surface as registration markers.
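As an illustration of how a skin point cloud might be obtained from the CT data for this registration step (the patent itself does not prescribe a particular extraction method), one common approach is to extract the air/soft-tissue isosurface, for example with marching cubes:

```python
import numpy as np
from skimage import measure  # scikit-image

def skin_points_from_ct(volume_hu, spacing_mm, level_hu=-300.0):
    """Extract the air/skin isosurface of a CT volume as a point cloud.

    volume_hu  -- 3-D array of Hounsfield units, ordered (z, y, x)
    spacing_mm -- voxel spacing in millimetres, e.g. (1.0, 0.5, 0.5)
    level_hu   -- iso-level near the air / soft-tissue boundary
    """
    verts, _, _, _ = measure.marching_cubes(volume_hu, level=level_hu,
                                            spacing=spacing_mm)
    return verts  # (N, 3) surface points in millimetres
```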
According to the present invention, the error correction model adopted in step S5 is as follows:

The coordinate matrix of the reference frame is $M_0$ and the coordinate matrices of the correction reference frames are $M_1, \dots, M_n$; from the coordinates of the marker sphere centers in their respective reference frames, the coordinates of all marker sphere centers expressed in the reference frame space form a point set $\{P_i\}$.

In the three-dimensional model, each marker sphere is represented by a point cloud $\{Q_i = (a_i, b_i, c_i)\}$ $(i = 1, \dots, N)$ distributed over the sphere surface.

The transformation from the model space $\{Q_i\}$ to the reference space $\{P_i\}$ is a rigid transformation matrix $M$; using its translation and rotation components, $M$ can be represented by the six-parameter set $(t_x, t_y, t_z, r_x, r_y, r_z)$, i.e. $M$ is composed of the rotation $R(r_x, r_y, r_z)$ and the translation $t = (t_x, t_y, t_z)$.

The transformation matrix $M$ is the one that minimizes the overall error between the two point sets, i.e. that makes $\sigma^2 = \sum_i \| P_i - M Q_i \|^2$ minimal.
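For reference, the rigid transform minimizing $\sigma^2 = \sum_i \| P_i - M Q_i \|^2$ has a well-known closed-form least-squares solution via the singular value decomposition (the Kabsch/Umeyama method). The sketch below is a generic illustration of that step, not the patent's specific implementation; it assumes the point sets have already been put into one-to-one correspondence (for example, matched marker sphere centers):

```python
import numpy as np

def fit_rigid_transform(Q, P):
    """Least-squares rigid transform M (4x4) such that P ~= M Q.

    Q, P -- (N, 3) arrays of corresponding points (model space and
    reference space). Returns the homogeneous matrix minimizing
    sum_i ||P_i - M Q_i||^2 (Kabsch / SVD method).
    """
    Q = np.asarray(Q, dtype=float)
    P = np.asarray(P, dtype=float)
    q_mean, p_mean = Q.mean(axis=0), P.mean(axis=0)
    H = (Q - q_mean).T @ (P - p_mean)      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = p_mean - R @ q_mean
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, t
    return M
```

In practice the model-side points would typically be sphere centers fitted to the scanned point clouds $\{Q_i\}$ rather than the raw surface points.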
Compared with the prior art, the invention has the beneficial effects that:
the head operation navigation method based on three-dimensional scanning provided by the invention does not need to add marks on a patient before an operation, is convenient for CT scanning before the operation, avoids navigation deviation caused by mark offset between the operation and the operation, and can improve the accuracy of operation navigation by a space correction method.
Drawings
FIG. 1 is a schematic diagram of a system structure of a head surgery navigation method based on three-dimensional scanning
201 - reference frame, 202 - correction reference frame, 203 - optical positioning system (stereoscopic vision equipment), 204 - three-dimensional scanning device
Detailed Description
The following describes embodiments of the present invention with reference to the drawings.
Referring to fig. 1, one embodiment of the head surgery navigation method based on three-dimensional scanning of the present invention comprises the following steps:
S1, performing a preoperative CT scan of the patient's head;
S2, after the patient is placed in the surgical position, fixedly mounting one or more correction reference frames 202 near the patient's head, mounting a reference frame 201 on the patient's head, mounting stereoscopic vision equipment 203, and mounting a target reference frame 205 on a surgical instrument;
S3, acquiring three-dimensional models of the patient's head, the reference frame and all correction reference frames with the three-dimensional scanning device 204, fitting characteristic points of the reference frame and all correction reference frames in the three-dimensional model, constructing the spatial relationship among the patient's head, all correction reference frames 202 and the reference frame 201 in the three-dimensional model coordinate system, and inputting the model relationship matrix into the navigation software;
S4, acquiring the spatial coordinates of the reference frame 201 and all correction reference frames 202 with the stereoscopic vision equipment 203, and establishing the spatial relationship between all correction reference frames 202 and the reference frame 201 in the visual coordinate system;
S5, correcting the visual space error through the error correction model, and constructing the navigation space under stereoscopic vision;
S6, performing point cloud registration between the patient's head CT model and the head model constructed by the three-dimensional scanning device 204, and unifying the spatial relationship between the patient's head CT model and the reference frame under the visual space coordinate system;
S7, the stereoscopic vision equipment 203 acquires the spatial coordinates of the target reference frame and the reference frame in real time, and the navigation software displays the position of the instrument in CT space in real time.
The reference frame 201 is the reference for the entire surgical navigation; it should always maintain a relatively fixed positional relationship with the patient's head and be located where it does not interfere with the surgical operation. Possible mounting positions for the reference frame 201 include the operating table and the head frame.
The correction reference frame 202 should be as close to the surgical field as possible. In one preferred embodiment, the correction reference frame is tied to the forehead or the temporal side with a strap; in another preferred embodiment, it is adhered to the skin surface with medical tape. The mounting position of the correction reference frame 202 must be chosen so that distinctive skin areas, such as the nose, lips and ears, are not deformed.
The stereoscopic vision equipment acquires three-dimensional geometric information of an object based on the parallax principle; NDI Polaris or other optical devices can be used.
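As a simple illustration of the parallax principle (a generic textbook relation, not the internal algorithm of NDI Polaris or any particular device), depth can be recovered from the disparity between two rectified camera views:

```python
import numpy as np

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Rectified stereo pair: depth Z = f * B / d.

    focal_px     -- focal length in pixels
    baseline_m   -- distance between the two camera centres in metres
    disparity_px -- horizontal pixel offset of the same point between
                    the left and right images
    """
    return focal_px * baseline_m / np.asarray(disparity_px, dtype=float)

# Example: f = 1500 px, baseline = 0.3 m, disparity = 300 px  ->  Z = 1.5 m
print(depth_from_disparity(1500.0, 0.3, 300.0))
```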
In step S6, the skin surface in the CT data and the three-dimensionally scanned skin surface are used as registration markers; the registration algorithm may be an Iterative Closest Point (ICP) algorithm.
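A minimal sketch of the ICP idea, alternating nearest-neighbour matching with a rigid least-squares fit, is given below. It reuses the fit_rigid_transform helper shown in the error-correction section above and is a generic illustration rather than the navigation system's actual registration code:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=50, tol=1e-6):
    """Align source (N, 3) points to target (M, 3) points.

    Returns the accumulated 4x4 rigid transform. Each iteration matches
    every source point to its nearest target point and re-estimates the
    transform with fit_rigid_transform (defined earlier).
    """
    src = np.asarray(source, dtype=float)
    tree = cKDTree(np.asarray(target, dtype=float))
    M_total = np.eye(4)
    prev_err = np.inf
    for _ in range(iterations):
        dists, idx = tree.query(src)
        M = fit_rigid_transform(src, tree.data[idx])
        src = (M[:3, :3] @ src.T).T + M[:3, 3]
        M_total = M @ M_total
        err = dists.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return M_total
```

Mature implementations (for example the point-to-point ICP available in libraries such as Open3D) add outlier rejection and better initialization, which matters for skin-surface registration.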
The three-dimensional scanning device can scan the spatial contour of an object and acquire the spatial coordinates of its surface; such devices include but are not limited to structured light scanners and three-dimensional laser scanners.
The above is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes will occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (8)
1. A head surgery navigation method based on three-dimensional scanning is characterized by comprising the following steps:
S1, performing a preoperative CT scan of the patient's head;
S2, after the patient is placed in the surgical position, fixedly mounting one or more correction reference frames near the patient's head, mounting a reference frame on the patient's head, mounting stereoscopic vision equipment, and mounting a target reference frame on a surgical instrument;
S3, acquiring three-dimensional models of the patient's head, the reference frame and all correction reference frames with a three-dimensional scanning device, fitting characteristic points of the reference frame and all correction reference frames in the three-dimensional model, constructing the spatial relationship among the patient's head, all correction reference frames and the reference frame in the three-dimensional model coordinate system, and inputting the model relationship matrix into the navigation software;
S4, acquiring the spatial coordinates of the reference frame and all correction reference frames with the stereoscopic vision equipment, and establishing the spatial relationship between all correction reference frames and the reference frame in the visual coordinate system;
S5, correcting the visual space error through the error correction model, and constructing the navigation space under stereoscopic vision;
S6, performing point cloud registration between the patient's head CT model and the head model constructed by the three-dimensional scanning device, and unifying the spatial relationship between the patient's head CT model and the reference frame under the visual space coordinate system;
S7, the stereoscopic vision equipment acquires the spatial coordinates of the target reference frame and the reference frame in real time, and the navigation software displays the position of the instrument in CT space in real time.
2. The head surgery navigation method based on three-dimensional scanning according to claim 1, characterized in that the error correction model adopted in step S5 is as follows:
the coordinate matrix of the reference frame is $M_0$ and the coordinate matrices of the correction reference frames are $M_1, \dots, M_n$; from the coordinates of the marker sphere centers in their respective reference frames, the coordinates of all marker sphere centers expressed in the reference frame space form a point set $\{P_i\}$; in the three-dimensional model, each marker sphere is represented by a point cloud $\{Q_i = (a_i, b_i, c_i)\}$ $(i = 1, \dots, N)$ distributed over the sphere surface; the transformation from the model space $\{Q_i\}$ to the reference space $\{P_i\}$ is a rigid transformation matrix $M$, represented by its translation and rotation components as the six-parameter set $(t_x, t_y, t_z, r_x, r_y, r_z)$;
the transformation matrix $M$ is the one that minimizes the overall error between the two point sets, i.e. that makes $\sigma^2 = \sum_i \| P_i - M Q_i \|^2$ minimal.
3. The head surgery navigation method based on three-dimensional scanning according to claim 1,
the correction reference frames can be removed after completion of step S4;
the reference frame is maintained in a relatively fixed positional relationship with the patient's head and is positioned at a location that does not interfere with the surgical procedure.
4. The head surgery navigation method based on three-dimensional scanning according to claim 1, characterized in that the reference frame, the correction reference frames and the target reference frame:
have three-dimensional features;
and bear marks that can be recognized by stereoscopic vision.
5. The head surgery navigation method based on three-dimensional scanning according to claim 1, characterized in that the three-dimensional scanning device scans the spatial contour of the object and acquires the spatial coordinates of the object surface.
6. The head surgery navigation method based on three-dimensional scanning according to claim 1, characterized in that the stereoscopic vision equipment acquires three-dimensional geometric information of the object based on the parallax principle.
7. The head surgery navigation method based on three-dimensional scanning according to claim 1, characterized in that in step S2 the reference frame and the correction reference frames are within the spatial range of the stereoscopic vision equipment.
8. The head surgery navigation method based on three-dimensional scanning according to claim 1, characterized in that step S6 takes the skin in the CT data and the three-dimensionally scanned skin as registration markers.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710824895.4A CN109498156A (en) | 2017-09-14 | 2017-09-14 | Head surgery navigation method based on three-dimensional scanning |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710824895.4A CN109498156A (en) | 2017-09-14 | 2017-09-14 | Head surgery navigation method based on three-dimensional scanning |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109498156A true CN109498156A (en) | 2019-03-22 |
Family
ID=65744317
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710824895.4A Pending CN109498156A (en) | 2017-09-14 | 2017-09-14 | Head surgery navigation method based on three-dimensional scanning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109498156A (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110310328B (en) * | 2019-07-22 | 2021-04-30 | 雅客智慧(北京)科技有限公司 | Mixed reality operation registration method and device |
CN110310328A (en) * | 2019-07-22 | 2019-10-08 | Mixed reality operation registration method and device |
CN110680470A (en) * | 2019-10-08 | 2020-01-14 | 山东大学 | Laser guide positioning device and method of automatic tumor puncture machine |
CN110680470B (en) * | 2019-10-08 | 2021-04-23 | 山东大学 | Laser guide positioning device of automatic tumor puncture machine |
CN111658065A (en) * | 2020-05-12 | 2020-09-15 | 北京航空航天大学 | Digital guide system for mandible cutting operation |
CN114463482A (en) * | 2020-11-09 | 2022-05-10 | 北京理工大学 | Calibration model and method of optical tracking three-dimensional scanner and surgical navigation system thereof |
CN112618014A (en) * | 2020-12-14 | 2021-04-09 | 吴頔 | Non-contact intracranial puncture positioning navigation |
CN113143463A (en) * | 2021-03-16 | 2021-07-23 | 上海交通大学 | Operation navigation device, system, calibration method, medium and electronic equipment |
CN113262048A (en) * | 2021-04-25 | 2021-08-17 | 深影医疗科技(深圳)有限公司 | Spatial registration method and device, terminal equipment and intraoperative navigation system |
CN113262048B (en) * | 2021-04-25 | 2022-06-24 | 深影医疗科技(深圳)有限公司 | Spatial registration method and device, terminal equipment and intraoperative navigation system |
CN113349931A (en) * | 2021-06-18 | 2021-09-07 | 云南微乐数字医疗科技有限公司 | Focus registration method of high-precision surgical navigation system |
CN113349931B (en) * | 2021-06-18 | 2024-06-04 | 云南微乐数字医疗科技有限公司 | Focus registration method for high-precision operation navigation system |
CN114129265A (en) * | 2021-12-24 | 2022-03-04 | 南方医科大学深圳医院 | Orthopedic surgery navigation method with multiple reference frames |
CN116628786A (en) * | 2023-07-26 | 2023-08-22 | 中南大学 | Manufacturing method of special-shaped three-dimensional marking ball |
CN116628786B (en) * | 2023-07-26 | 2023-10-10 | 中南大学 | Manufacturing method of special-shaped three-dimensional marking ball |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20190322 |