CN111167020A - Tumor internal irradiation transplanting method and optical guiding device thereof - Google Patents
- Publication number
- CN111167020A (application number CN201911417061.7A)
- Authority
- CN
- China
- Prior art keywords
- needle
- light
- human body
- tumor
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1001—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy using radiation sources introduced into or applied onto the body; brachytherapy
- A61N5/1007—Arrangements or means for the introduction of sources into the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1001—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy using radiation sources introduced into or applied onto the body; brachytherapy
- A61N5/1007—Arrangements or means for the introduction of sources into the body
- A61N2005/1012—Templates or grids for guiding the introduction of sources
Abstract
The embodiment of the invention discloses a tumor internal irradiation implantation method and an optical guiding device thereof, wherein the method comprises the following steps: 1. constructing a human body 3D image to display the patient's outer contour, the tumor and the surrounding organs; 2. setting the insertion position, insertion angle and distribution of the insertion needles, and displaying virtual insertion needles on the human body 3D image; 3. detecting the real position of each insertion needle, and displaying the real insertion needles, the tumor and the surrounding organs on the human body 3D image, so as to show a real-time image of the spatial relationship between the needles and the tumor; 4. the doctor operates the insertion needle according to experience and inserts it into the tumor according to the virtual insertion position on the human body 3D image and the real-time image of the needle-tumor spatial relationship. Because the real-time position of the insertion needle is displayed in the same human body 3D image as the patient's tumor, normal organs and outer contour, the doctor is guided while operating the needle, and the accuracy of the insertion process is improved.
Description
Technical Field
The embodiment of the invention relates to the technical field of internal irradiation treatment, in particular to an internal irradiation implantation method for tumors and an optical guiding device thereof.
Background
Internal radiation therapy (brachytherapy) is a form of radiation therapy in which a small radioactive source is placed inside the human body, within or close to the surface of the tumor, so that the tumor is irradiated directly by the radioactive rays. Its essential characteristic is that the source can be brought as close as possible to the tumor tissue for continuous irradiation, delivering an effective killing dose to the tumor while the surrounding normal tissue receives only a low dose.
Internal radiation therapy is commonly used in regions with natural orifices, for example in cervical cancer and esophageal cancer. In internal irradiation therapy, placing the radioactive source at an accurate position is the key to ensuring the treatment effect; a reasonable spatial distribution of the source is the prerequisite for the tumor receiving the prescribed dose of radiation.
The current method of application is to insert a hollow insertion needle into the tumor and then introduce the radiation source into the interior of the tumor along the needle. Usually, 4 to 6 or more insertion needles must be inserted into one tumor. The angle and depth of each needle determine the spatial distribution of the radioactive source within the tumor, so the angles and depths of the needles and the spatial relationship among the needles determine the tumor treatment effect.
At present the needles are inserted blindly. The doctor judges the size and shape of the tumor and its relation to the surrounding normal organs from CT images, then inserts the needle by hand into the tumor through the vagina or another natural passage. Because the result depends entirely on the doctor's experience, improper or incorrect insertion positions occur easily, requiring adjustment and re-insertion and increasing the harm to the patient.
Disclosure of Invention
Therefore, the embodiment of the invention provides an intratumoral irradiation implantation method and an optical guiding device thereof, aiming to solve the prior-art problems that insertion relies on the doctor's experience, insertion positions are easily improper or incorrect, adjustment and re-insertion are needed, and the harm to the patient is increased.
In order to achieve the above object, the embodiments of the present invention provide the following technical solutions:
according to a first aspect of embodiments of the present invention, a method for intratumoral irradiation implantation comprises the steps of:
S100, constructing a human body 3D image to display the patient's outer contour, the tumor and the surrounding organs;
S200, setting the insertion position, insertion angle and distribution of the insertion needles, and displaying virtual insertion needles on the human body 3D image;
S300, detecting the real position of the insertion needle, and displaying the real insertion needle, the tumor and the surrounding organs on the human body 3D image, so as to display a real-time image of the spatial relationship between the needle and the tumor;
S400, the doctor operates the insertion needle according to experience and inserts it into the tumor according to the virtual insertion position on the human body 3D image and the real-time image of the needle-tumor spatial relationship.
Further, the specific construction steps of the human body 3D image are as follows:
S101, scanning the patient with CT (computed tomography), uploading the CT images to an upper computer, and reconstructing in the upper computer images of the human body outline, the tumor and the normal organs around the tumor in the front-back, left-right and up-down directions of the human body;
S102, photographing the patient with a camera in the front-back, left-right and up-down directions of the human body, and uploading the photographs to the upper computer;
S103, integrating the scanning and photographing results in the upper computer to construct a 3D image of the patient's tumor, normal organs and human body outline, and displaying the 3D image on a display screen.
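Steps S101 to S103 can be sketched in code. The following is a minimal numpy sketch assuming a hypothetical labelled CT volume; the label values, the `build_scene` name and the `voxel_size_mm` parameter are all illustrative assumptions, not part of the patent:

```python
import numpy as np

def build_scene(ct_labels, voxel_size_mm):
    """Assemble a simple 3D scene from a labelled CT volume.

    Hypothetical label convention: 0 = background, 1 = body outline,
    2 = tumor, 3 = organ-at-risk. Each structure is returned as a point
    cloud in millimetres together with its centroid, which is the kind
    of geometry a display screen or planning system would consume.
    """
    scene = {}
    for name, label in [("body", 1), ("tumor", 2), ("organ", 3)]:
        idx = np.argwhere(ct_labels == label)   # (N, 3) voxel indices
        if idx.size:
            pts = idx * voxel_size_mm           # scale voxels to mm
            scene[name] = {"points": pts, "centroid": pts.mean(axis=0)}
    return scene
```

In a real system the photographs from S102 would additionally be registered to this CT-derived geometry; that registration step is omitted here.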
Further, the specific process of the insertion needle position detection is as follows:
S301, arranging light-reflecting points on the insertion needle;
S302, inserting the insertion needle into the patient;
S303, arranging a light source externally, the light source emitting light;
S304, the light-reflecting points on the insertion needle reflecting the light;
S305, detecting the light reflected by the light-reflecting points on the insertion needle, and determining the spatial position and angle of the insertion needle in the patient's body based on the externally detected light.
Further, the light source is an LED or an LD, the light is infrared or near-infrared light with a wavelength in the range of 605 nm to 1500 nm, and at least two reflective points are provided.
Further, the light detection process is as follows: two cameras are arranged externally at an angle to each other; the light emitted by the light source is reflected by the reflective points and, after being filtered by filters connected to the cameras, is collected by the cameras.
Furthermore, the reflected-light information detected by the cameras is uploaded to the upper computer, and the spatial position and angle of the insertion needle are reconstructed by the upper computer.
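The reconstruction described above — two cameras at different angles observing the same reflective point — is classical stereo triangulation. A minimal numpy sketch under assumed camera geometry; the `triangulate` helper and the projection-matrix setup are illustrative, not taken from the patent:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one reflective marker seen by two cameras.

    P1, P2: 3x4 projection matrices of the two external cameras.
    uv1, uv2: pixel coordinates (u, v) of the marker in each image.
    Returns the marker's 3D position in the shared world frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each camera view contributes two linear constraints on the
    # homogeneous world point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A @ X = 0: the solution is the right singular vector
    # belonging to the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With two markers triangulated this way, the needle's direction follows from the vector between them, which is why the method requires at least two reflective points and two cameras.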
According to a second aspect of the embodiments of the present invention, an optical guiding device for intratumoral irradiation implantation comprises an insertion needle and an external detection device. The insertion needle comprises a needle head to be inserted into the tumor and a needle tail that remains outside the tumor; two reflective points are arranged on the needle tail, one at the junction of the needle tail and the needle head and the other at the end of the needle tail;
the external detection device comprises a light source and at least two night-vision cameras; the light emitted by the light source is reflected by the reflective points and then captured and collected by the night-vision cameras; the external detection device is connected with an imaging device, which receives the light information measured by the external detection device and displays a visual image of the positions of the reflective points on the insertion needle inside the patient.
Further, the light source is an LED or LD, and the emitted light is infrared or near-infrared light with a wavelength in the range of 650 nm to 1500 nm.
Furthermore, the night-vision cameras are connected with a filter, the filter being a narrow-band interference filter; the reflected light first passes through the filter and is then collected by the night-vision camera.
According to the invention, a human body 3D image of the patient's tumor, normal organs and body outline is constructed and displayed, and the real-time position of the insertion needle is also displayed in this 3D image. This guides the doctor in operating the insertion needle, improves the accuracy of the insertion process, avoids re-insertion, and reduces harm to the patient.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It should be apparent that the drawings in the following description are merely exemplary, and other drawings can be derived from them by those of ordinary skill in the art without inventive effort.
The structures, ratios, sizes and the like shown in this specification are used only to match the content disclosed in the specification, for understanding and reading by those skilled in the art, and are not intended to limit the conditions under which the invention can be implemented; they therefore carry no essential technical significance. Any structural modification, change of ratio or adjustment of size that does not affect the effects and objectives achievable by the invention shall still fall within the scope covered by the technical content disclosed herein.
FIG. 1 is a schematic flow chart of a method for intratumoral irradiation implantation according to an embodiment of the present invention;
FIG. 2 is a schematic view of the positional relationship between the insertion needle and the light source during needle-position detection according to the embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an optical guiding device for intratumoral irradiation implantation according to an embodiment of the present invention;
FIG. 4 is a schematic structural view of an insertion needle according to an embodiment of the present invention.
In the figure:
1-insertion needle; 2-external detection device; 3-needle head; 4-needle tail; 5-reflective point; 6-light source; 7-night-vision camera; 8-imaging device; 9-filter.
Detailed Description
The present invention is described below in terms of particular embodiments, and other advantages and features of the invention will become apparent to those skilled in the art from the following disclosure. It is to be understood that the described embodiments are merely exemplary of the invention and are not intended to limit it to the particular forms disclosed. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
As shown in fig. 1 and 2, an embodiment of the present invention provides an intratumoral irradiation implantation method, including:
Firstly, the patient is scanned with CT (computed tomography) and the CT images are uploaded to an upper computer, in which images of the human body outline, the tumor and the normal organs around the tumor are reconstructed in the front-back, left-right and up-down directions of the human body. The patient is then photographed with a camera in the same three directions, and the photographs are uploaded to the upper computer. The upper computer integrates the scanning and photographing results, constructs a 3D image of the patient's tumor, normal organs and body outline, and displays the constructed human body 3D image on a display screen so as to show the outer contour;
secondly, the insertion position, insertion angle and distribution of the insertion needles are set, and virtual insertion needles are displayed on the human body 3D image;
thirdly, the real position of the insertion needle is detected. At least two reflective points are arranged on the insertion needle, and the needle is inserted into the patient. A light source arranged externally emits infrared light toward the body region, and two cameras are arranged externally; the light emitted by the light source is reflected by the reflective points and collected by the cameras, each of which is connected with a filter so that the reflected light is filtered before being collected. The cameras upload the detected reflection information to the upper computer, which reconstructs the spatial position and angle of the insertion needle and displays a real-time image of the spatial relationship between the needle and the tumor on the human body 3D image;
and fourthly, the doctor operates the insertion needle according to experience and inserts it into the tumor according to the virtual insertion position on the human body 3D image and the real-time image of the needle-tumor spatial relationship.
In the above implementation steps, the light source is an LED or LD and the emitted light is infrared or near-infrared light with a wavelength in the range of 605 nm to 1500 nm. Light in this wavelength range avoids interference from visible light, and a night-vision camera can be used for observation since it detects infrared light well.
Before the needle is inserted in this embodiment, the patient is scanned by CT in the front-back, left-right and up-down directions of the body, so that a 3D image of the tumor and the surrounding organs inside the patient can be formed. During this process the outline of the body is also photographed by a camera, again in the three directions, and all results are integrated by image-integration software into a human body 3D image. This image displays the patient's outer contour, the tumor and the surrounding organs, providing image guidance for the subsequent procedure.
The human body 3D image formed in this process is imported into the commercial radiotherapy planning system used by the hospital. The system analyses and processes the 3D image, sets the position and angle at which each insertion needle enters the tumor and the distribution of the needles over the tumor, constructs virtual needle images on the 3D image, and displays the final needle distribution, providing guidance for the doctor's operation.
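The patent delegates the placement of the virtual needles to a commercial radiotherapy planning system and does not specify the algorithm. As a purely illustrative stand-in, a toy geometric placement might distribute the 4 to 6 parallel needles evenly on a circle around the tumor centroid; `plan_virtual_needles` and all its parameters are invented for illustration:

```python
import numpy as np

def plan_virtual_needles(tumor_centroid, tumor_radius_mm, n_needles=6):
    """Evenly distribute n parallel virtual needle positions on a circle
    at half the tumor radius around the centroid (a simple geometric
    placeholder for the planning system's real optimisation)."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_needles, endpoint=False)
    r = 0.5 * tumor_radius_mm
    # Offsets in the axial (x-y) plane; needles share the z direction.
    offsets = np.stack([r * np.cos(angles),
                        r * np.sin(angles),
                        np.zeros(n_needles)], axis=1)
    return tumor_centroid + offsets
```

A real planning system would instead optimise the positions against the prescribed dose distribution and the organs at risk.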
In the present embodiment, since the insertion needle is provided with reflective points whose spatial positions can be observed by a dedicated detector (night-vision cameras), the position of the needle remains known even after it has been inserted into the human body. There are at least two reflective points and at least two night-vision cameras observing them from different angles, so the spatial position and angle of the reflective points can be obtained. These are displayed on the human body 3D image to show the real-time spatial relationship between the needle and the tumor, guiding the doctor's operation.
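Once the two marker positions are known, the needle's pose follows from simple vector geometry. A sketch under the assumption that the needle-head length is known, so that the tip (which sits inside the body and carries no marker) can be extrapolated along the axis; `needle_pose` and its parameter names are illustrative, not from the patent:

```python
import numpy as np

def needle_pose(tail_marker, junction_marker, head_length_mm):
    """Needle pose from its two reflective points on the needle tail.

    tail_marker:     3D position of the marker at the end of the tail.
    junction_marker: 3D position of the marker at the tail/head junction.
    head_length_mm:  assumed known length of the needle head, used to
                     extrapolate the unmarked tip along the needle axis.
    Returns (tip position, unit axis, insertion angle in degrees
    relative to the z axis).
    """
    axis = junction_marker - tail_marker
    axis = axis / np.linalg.norm(axis)              # unit needle direction
    tip = junction_marker + head_length_mm * axis   # extrapolated tip
    angle = np.degrees(np.arccos(abs(axis[2])))     # tilt from z axis
    return tip, axis, angle
```

This is why the patent places one marker at the tail/head junction and one at the end of the tail: the pair fixes both the needle's line of action and a reference point on it.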
Finally, drawing on past experience, the doctor inserts the needle into the tumor according to the virtual insertion position on the human body 3D image and the real-time image of the needle-tumor spatial relationship, completing the irradiation implantation of the tumor.
According to this embodiment, a human body 3D image of the patient's tumor, normal organs and body outline is constructed and displayed, and the real-time position of the insertion needle is displayed in the same image, guiding the doctor in operating the needle, improving the accuracy of the insertion process, avoiding re-insertion, and reducing harm to the patient.
As shown in fig. 3 and 4, the present embodiment further provides an optical guiding device for intratumoral irradiation implantation, comprising an insertion needle 1 and an external detection device 2. The insertion needle 1 comprises a needle head 3 to be inserted into the tumor and a needle tail 4 that remains outside the tumor. Two reflective points 5 are arranged on the needle tail 4, one at the junction of the needle tail 4 and the needle head 3 and the other at the end of the needle tail 4. The external detection device 2 comprises a light source 6 and at least two night-vision cameras 7. The light source 6 is an LED or an LD; the light it emits is infrared or near-infrared, emitted in pulses, with a wavelength ranging from 650 nm to 1500 nm. The light emitted by the light source 6 is reflected by the reflective points 5 and captured by the night-vision cameras 7. Each night-vision camera 7 is connected with a filter 9, which is a narrow-band interference filter; the reflected light passes through the filter 9 before being collected by the camera. The external detection device 2 is connected with an imaging device 8, which receives the light information measured by the external detection device and displays a visual image of the positions of the reflective points 5 on the insertion needle 1 inside the patient.
In this embodiment, after the needle is inserted into the patient's body, the light source 6 emits light toward the body. The light is reflected by the reflective points 5 and collected by the night-vision cameras 7, which are cameras capable of receiving infrared light. Each camera 7 is connected with the filter 9, which eliminates interference from light in other wavebands, and the detected reflection information is transmitted by the cameras 7 to the imaging device 8.
Because there are at least two reflective points 5, at least two night-vision cameras 7 observe them from different angles, so the spatial position and angle of the reflective points 5, i.e. the position and angle of the insertion needle 1, can be obtained. This pose is then combined with the previously prepared human body 3D image showing the patient's tumor, normal organs and outer contour, and the two are integrated so that the positional relationship between the needle 1, the tumor and the surrounding organs can be displayed in real time. This guides the doctor in operating the needle, improves the accuracy of the insertion process, avoids re-insertion, and reduces harm to the patient.
Although the invention has been described in detail above with reference to a general description and specific examples, it will be apparent to one skilled in the art that modifications or improvements may be made thereto based on the invention. Accordingly, such modifications and improvements are intended to be within the scope of the invention as claimed.
Claims (9)
1. An intratumoral irradiation implantation method, comprising the steps of:
S100, constructing a human body 3D image to display the patient's outer contour, the tumor and the surrounding organs;
S200, setting the insertion position, insertion angle and distribution of the insertion needles, and displaying virtual insertion needles on the human body 3D image;
S300, detecting the real position of the insertion needle, and displaying the real insertion needle, the tumor and the surrounding organs on the human body 3D image, so as to display a real-time image of the spatial relationship between the needle and the tumor;
S400, the doctor operates the insertion needle according to experience and inserts it into the tumor according to the virtual insertion position on the human body 3D image and the real-time image of the needle-tumor spatial relationship.
2. The intratumoral irradiation implantation method according to claim 1, characterized in that: the specific construction steps of the human body 3D image are as follows:
S101, scanning the patient with CT (computed tomography), uploading the CT images to an upper computer, and reconstructing in the upper computer images of the human body outline, the tumor and the normal organs around the tumor in the front-back, left-right and up-down directions of the human body;
S102, photographing the patient with a camera in the front-back, left-right and up-down directions of the human body, and uploading the photographs to the upper computer;
S103, integrating the scanning and photographing results in the upper computer to construct a 3D image of the patient's tumor, normal organs and human body outline, and displaying the 3D image on a display screen.
3. The intratumoral irradiation implantation method according to claim 1, characterized in that: the specific process of the insertion needle position detection is as follows:
S301, arranging light-reflecting points on the insertion needle;
S302, inserting the insertion needle into the patient;
S303, arranging a light source externally, the light source emitting light;
S304, the light-reflecting points on the insertion needle reflecting the light;
S305, detecting the light reflected by the light-reflecting points on the insertion needle, and determining the spatial position and angle of the insertion needle in the patient's body based on the externally detected light.
4. The intratumoral irradiation implantation method according to claim 3, characterized in that: the light source is an LED or an LD, the light is infrared light or near-infrared light, the wavelength of the light is light wave in the range of 605nm to 1500nm, and the number of the reflecting points is at least two.
5. The intratumoral irradiation implantation method according to claim 3, wherein the light detection process comprises: two cameras are arranged externally at an angle to each other; the light emitted by the light source is reflected by the reflective points and, after being filtered by filters connected to the cameras, is collected by the cameras.
6. The intratumoral irradiation implantation method according to claim 5, wherein the reflected light information detected by the camera is uploaded to the upper computer, and the spatial position and angle of the implantation needle are reconstructed by the upper computer.
7. An optical guiding device for intratumoral irradiation implantation, characterized by comprising an implantation needle (1) and an external detection device (2), wherein the implantation needle (1) comprises a needle head (3) for inserting into a tumor and a needle tail (4) left outside the tumor, two light reflecting points (5) are arranged on the needle tail (4), one of the light reflecting points (5) is arranged at the boundary of the needle tail (4) and the needle head (3), and the other light reflecting point (5) is arranged at the tail part of the needle tail (4);
the external detection device (2) comprises a light source (6) and at least two night vision cameras (7); the light emitted by the light source (6) is reflected by the reflective points (5) and then captured and collected by the night vision cameras (7); the external detection device (2) is connected with an imaging device (8), the imaging device (8) being configured to receive the light information measured by the external detection device and to display a visual image of the positions of the reflective points (5) on the implantation needle (1) inside the patient.
8. An optical guiding device for intratumoral irradiation implantation according to claim 7, characterized in that said light source (6) is an LED or LD, the light emitted by said light source (6) being infrared or near infrared light and having a wavelength in the range 650nm to 1500 nm.
9. An optical guiding device for intratumoral irradiation implantation according to claim 7, characterized in that a filter (9) is connected to the night vision camera (7), the filter (9) being a narrow-band interference filter, the reflected light passing through the filter (9) before being collected by the night vision camera (7).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911417061.7A CN111167020A (en) | 2019-12-31 | 2019-12-31 | Tumor internal irradiation transplanting method and optical guiding device thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911417061.7A CN111167020A (en) | 2019-12-31 | 2019-12-31 | Tumor internal irradiation transplanting method and optical guiding device thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111167020A true CN111167020A (en) | 2020-05-19 |
Family
ID=70647422
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911417061.7A Pending CN111167020A (en) | 2019-12-31 | 2019-12-31 | Tumor internal irradiation transplanting method and optical guiding device thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111167020A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN203195768U (en) * | 2013-03-15 | 2013-09-18 | 应瑛 | Surgical guidance system |
EP2716252B1 (en) * | 2012-10-05 | 2015-12-23 | Diego Dall'Alba | System and method for guiding the manual insertion of a needle into the body of a patient |
CN106139423A (en) * | 2016-08-04 | 2016-11-23 | 梁月强 | Camera-based image-guided seed implantation system |
WO2017017556A1 (en) * | 2015-07-28 | 2017-02-02 | Koninklijke Philips N.V. | Workflow of needle tip identification for biopsy documentation |
CN106937884A (en) * | 2017-04-14 | 2017-07-11 | 苏州影睿光学科技有限公司 | Surgical guidance system based on near-infrared imaging |
CN106999250A (en) * | 2014-12-17 | 2017-08-01 | 库卡罗伯特有限公司 | System for robot-assisted medical treatment |
CN206508013U (en) * | 2016-08-17 | 2017-09-22 | 北京柏惠维康医疗机器人科技有限公司 | Orientable puncture device |
CN108969100A (en) * | 2017-05-31 | 2018-12-11 | 格罗伯斯医疗有限公司 | Surgical robot system |
Similar Documents
Publication | Title |
---|---|
CN110996836B | System and method for processing orthodontic appliances by optical coherence tomography |
CA2761844C | Quantitative endoscopy |
JP4316876B2 | Three-dimensional planning target volume |
JP2022062209A | Intraoral scanner with dental diagnostics capabilities |
US9330490B2 | Methods and systems for visualization of 3D parametric data during 2D imaging |
CN107529968A | Device for observing the interior of a cavity |
US20130162775A1 | Apparatus and method for endoscopic 3D data collection |
CN108430306A | System and method for rapid examination of vasculature and particle flow using laser speckle contrast imaging |
CN107949337A | System and method for guiding tissue cutting |
CN110251047A | Quantitative three-dimensional imaging and printing of surgical implants |
WO2010090673A1 | Method and apparatus for depth-resolved fluorescence, chromophore, and oximetry imaging for lesion identification during surgery |
JP2008522761A | Systems and methods for normalized fluorescence or bioluminescence imaging |
CA2872094A1 | Videographic display of real-time medical treatment |
CA2923461A1 | System and method for light based lung visualization |
Robb | 3-D visualization in biomedical applications |
KR102009558B1 | Goggle system for image-guided surgery |
JP4350226B2 | Three-dimensional image processing device |
JPH11104072A | Medical support system |
CN105208909A | Method and device for stereoscopic depiction of image data |
CN110720985A | Multi-modality guided surgical navigation method and system |
CN111167020A | Tumor internal irradiation transplanting method and optical guiding device thereof |
CN104188628A | Three-dimensional optical molecular image navigation system and three-dimensional optical molecular image navigation method |
KR20190069751A | Point-based registration apparatus and method using multiple candidate points |
KR101977650B1 | Medical image processing apparatus using augmented reality and medical image processing method using the same |
KR102208577B1 | Medical image processing apparatus and medical image processing method for surgical navigator |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200519 |