CN106420057B - PET-fluorescence bimodal intra-operative navigation imaging system and imaging method thereof - Google Patents


Info

Publication number
CN106420057B
CN106420057B (application CN201611046753.1A)
Authority
CN
China
Prior art keywords
imaging
pet
dimensional
fluorescent
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611046753.1A
Other languages
Chinese (zh)
Other versions
CN106420057A (en)
Inventor
翟晓晖
谢肇恒
于泽宽
周坤
李素莹
田涧
曾海宁
Current Assignee
Beijing Arrays Medical Imaging Corp
Original Assignee
Beijing Arrays Medical Imaging Corp
Priority date
Filing date
Publication date
Application filed by Beijing Arrays Medical Imaging Corp
Priority to CN201611046753.1A
Publication of CN106420057A
Application granted
Publication of CN106420057B
Legal status: Active


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059: Measuring using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0071: Measuring using light, by measuring fluorescence emission
    • A61B 6/00: Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/02: Arrangements for diagnosis sequentially in different planes; stereoscopic radiation diagnosis
    • A61B 6/03: Computed tomography [CT]
    • A61B 6/037: Emission tomography
    • A61B 6/48: Diagnostic techniques
    • A61B 6/482: Diagnostic techniques involving multiple energy imaging
    • A61B 6/52: Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5211: Devices involving processing of medical diagnostic data
    • A61B 6/5229: Devices combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B 6/5247: Devices combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound


Abstract

The invention discloses a PET-fluorescence bimodal intra-operative navigation imaging system and an imaging method thereof. A spatial registration device forms a three-dimensional surface contour image of the imaging sample, and a PET imaging device acquires a three-dimensional PET image of the sample's internal structure. Registration fusion of the three-dimensional surface contour image with the three-dimensional PET image yields a three-dimensional fused image of the imaging target containing both the surface contour and the internal structure. A fluorescence imaging device acquires a two-dimensional fluorescence image of the sample in real time; the three-dimensional fused image is projected onto the two-dimensional fluorescence image along the normal axis of the fluorescence imaging device's imaging plane, so that the depth of the imaging target is displayed on the two-dimensional fluorescence image. Applied to tumor surgery, the method fuses the three-dimensional PET image and the two-dimensional fluorescence image in a computer, making the two imaging modes complementary: tumors at any depth are accurately located, more precise tumor position information is provided for the patient, and R0 tumor resection is achieved.

Description

PET-fluorescence bimodal intra-operative navigation imaging system and imaging method thereof
Technical Field
The invention relates to the field of biomedical imaging, in particular to a PET-fluorescence bimodal intraoperative navigation imaging system and an imaging method thereof.
Background
Currently, surgical treatment is the primary treatment for most tumors: about 90% of tumors use surgery as a diagnostic and staging tool, and patients' postoperative survival and quality of life are closely related to how thorough the surgical resection is. As early as 2003, Professor John V. Frangioni of Harvard Medical School pointed out, in a paper published in J. Biomed. Opt., that cancer surgery lacked imaging equipment suited to the surgeon. Even now, most surgeons can only make intraoperative judgments from a tumor's appearance, its histopathological features and personal experience, and cannot objectively and accurately assess the tumor's size, its boundary, or whether it has been completely removed.
Roger Y. Tsien (Qian Yongjian), winner of the 2008 Nobel Prize in Chemistry, used fluorescence-microscope guidance to excise tumor tissue from mice and proposed the concept of optical molecular imaging surgical navigation; the related research was published in PNAS, the Proceedings of the National Academy of Sciences, in 2010. As the related technology developed further, a joint German-Dutch research team built an intraoperative fluorescence imaging system suitable for the clinic, which was first applied in 2011 to tumor resection surgery in ovarian cancer patients in the Netherlands. A team at the Institute of Automation of the Chinese Academy of Sciences began independently developing an intraoperative optical molecular imaging surgical navigation system in 2012; the system detects the position of a tumor focus during surgery with high sensitivity and precision, objectively outlines the focus boundary, and effectively detects residual tumor after intraoperative resection. In 2013, Professor John V. Frangioni and surgeon Alexander L. Vahrmeijer of Leiden University Medical Center in the Netherlands reported in Nat. Rev. Clin. Oncol. that near-infrared fluorescence imaging techniques provide effective assistance to clinicians. Using this novel intraoperative fluorescence imaging technology to help tumor surgeons carry out precise surgical excision of tumor foci has become an important means of intraoperative tumor treatment.
In summary, fluorescence imaging is an effective imaging means for intra-operative navigation. However, single-mode fluorescence imaging is limited by the strong scattering and absorption of photons in tissue: signals from deep within the tissue barely penetrate to the surface, so sensitivity to deep tumors is low and only superficial tumors can be detected effectively. Meanwhile, fluorescent agents usable in surgery are difficult to modify chemically and suffer from poor targeting. Shallow imaging depth and poor targeting have thus become the key factors limiting fluorescence imaging.
Currently, surgical navigation systems based on positron emission computed tomography (Positron Emission Computed Tomography, PET) are under investigation. Most PET surgical-navigation research focuses on quickly fusing a preoperative PET image with an intraoperative CT image to achieve on-site PET/CT registration; however, this technique is limited by positional changes of the patient's surgical site during the operation, so it can guide only the early stage of surgery and cannot provide real-time intraoperative guidance. In addition, scientists in the Netherlands and Switzerland have used gamma detectors to count positron (beta+) emitting nuclides as a guide for real-time tumor resection during surgery, but because this technique localizes the tumor site poorly, its intraoperative assistance to the surgeon is limited, which has restricted its further adoption.
PET imaging is not affected by tissue depth and targets tumors well, so it can effectively compensate for the limitations of fluorescence imaging in surgical tumor excision. PET-fluorescence bimodal navigation remedies the respective imaging deficiencies of PET and fluorescence and is a promising imaging means for intra-operative navigation.
Disclosure of Invention
Fluorescence imaging offers high sensitivity and fast imaging, but its shallow imaging depth makes three-dimensional localization difficult; PET imaging has deep detection depth, supports three-dimensional localization, and targets tumor tissue well. The two techniques are complementary, and the invention therefore provides an imaging system combining PET imaging and fluorescence imaging for tumor surgical navigation.
The invention aims to provide a PET-fluorescence bimodal navigation imaging system.
The PET-fluorescence bimodal navigation imaging system of the invention comprises: a PET imaging device, a fluorescence imaging device, a spatial registration device, a mechanical control frame, an imaging bed, a computer and a display device. The imaging sample is placed on the imaging bed, which is mounted on the mechanical control frame; the PET imaging device, the fluorescence imaging device and the spatial registration device are each mounted on the mechanical control frame and face the imaging sample; the PET imaging device, the fluorescence imaging device, the spatial registration device, the mechanical control frame and the display device are each connected to the computer through data lines. The spatial registration device uses several cameras at different angles and forms a three-dimensional surface contour image of the imaging sample according to the binocular vision principle; the PET imaging device acquires a three-dimensional PET image of the imaging sample's internal structure. According to the pre-calibrated spatial registration relation between the spatial registration device and the PET imaging device, the three-dimensional surface contour image and the three-dimensional PET image are registered and fused to obtain a three-dimensional fused image of the imaging target that contains both the surface contour and the internal structure. The fluorescence imaging device acquires a two-dimensional fluorescence image of the imaging sample in real time and displays it on the display device. The three-dimensional fused image is projected onto the two-dimensional fluorescence image along the normal axis of the fluorescence imaging device's imaging plane, yielding the depth of the imaging target on the two-dimensional fluorescence image; by changing the imaging angle of the fluorescence imaging device, the depth of the imaging target below the imaging sample surface is obtained in real time at different angles.
The mechanical control frame is fixed to the ground and provides lifting, rotation, translation and other motions for the PET imaging device, fluorescence imaging device, spatial registration device and imaging bed connected to it. The mechanical control frame comprises: a fixing frame, a PET connecting arm, a displacement device, a fluorescence mechanical arm and a positioning-camera mounting frame. The fixing frame is fixed to the ground. The imaging bed is mounted on the fixing frame through the displacement device; the displacement device is connected to the computer, which controls the lifting and translation of the imaging bed through it. The PET imaging device is mounted on the fixing frame through the PET connecting arm, which is connected to the computer; during imaging the PET connecting arm rotates in-plane about the imaging sample as its central axis, and after imaging the PET imaging device rotates outward away from the imaging bed. The fluorescence imaging device is mounted on the fixing frame through the fluorescence mechanical arm, which is connected to the computer; the computer moves the fluorescence imaging device in real time through the arm and obtains, in real time, the position of the origin of the fluorescence imaging device's coordinate system and the angle of the normal axis of its imaging plane. The spatial registration device is fixed relative to the fixing frame through the positioning-camera mounting frame.
The PET imaging device comprises a PET detector and a data acquisition system. At least one pair of PET detectors is arranged symmetrically about the imaging sample; the PET detectors are connected to the data acquisition system, which is connected to the computer. The PET detector captures gamma photons emitted from within the imaging sample, converts the optical signals into electrical signals, and passes them to the data acquisition system, whose data are transmitted over a data line to the computer for centralized processing. The PET detector has a flat-plate structure or an irregular flat-plate structure, for example semicircular, arc-shaped or L-shaped.
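The patent does not describe the event-pairing logic of the data acquisition system; in PET generally, the two gamma photons from one annihilation are identified by timestamps from opposed detectors that fall within a short coincidence window. A minimal sketch under that assumption (the window value and event format are illustrative, not from the patent):

```python
def coincidences(events_a, events_b, window_ns=6.0):
    """Pair single gamma events from two opposed flat-panel detectors whose
    timestamps fall within a coincidence window.

    events_a, events_b: time-sorted lists of (timestamp_ns, crystal_id).
    Returns a list of ((t_a, id_a), (t_b, id_b)) candidate coincidence pairs.
    """
    pairs, j = [], 0
    for t_a, id_a in events_a:
        # Advance past detector-B events that are too early to match t_a.
        while j < len(events_b) and events_b[j][0] < t_a - window_ns:
            j += 1
        # Collect every B event inside [t_a - window, t_a + window].
        k = j
        while k < len(events_b) and events_b[k][0] <= t_a + window_ns:
            pairs.append(((t_a, id_a), events_b[k]))
            k += 1
    return pairs

events_a = [(0.0, 1), (100.0, 2)]
events_b = [(3.0, 7), (50.0, 8), (104.0, 9)]
pairs = coincidences(events_a, events_b)
print(len(pairs))  # 2
```

Each pair defines a line of response between the two crystals, from which the three-dimensional PET image is reconstructed.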
The fluorescence imaging device comprises: a fluorescence-signal acquisition camera, a white-light-signal acquisition camera, an illumination light source, an acquisition lens, a filter set, a lens set, a light-splitting device and a light-shielding box. The fluorescence-signal acquisition camera, white-light-signal acquisition camera, filter set, lens set and light-splitting device are arranged inside the light-shielding box. An illumination light source and an excitation light source, which excites the fluorescent signal, are fixed around the light-shielding box. The acquisition lens is mounted outside the light-shielding box, in front of the filter set, and is aimed at the imaging sample; it collects data from the imaging sample in real time, the filter set and lens set remove signal noise, and the signal is passed to the light-splitting device. The fluorescence-signal acquisition camera and the white-light-signal acquisition camera are mounted perpendicular to each other and aligned with the light-splitting device. The light-splitting device divides the optical signal into two parts: one part passes through a fluorescence filter and is collected by the fluorescence-signal acquisition camera to give the fluorescence data; the other part passes through a white-light filter and is collected by the white-light-signal acquisition camera to give the natural-light data. The fluorescence data and natural-light data are transmitted over a data line to the computer for fusion, yielding in real time a two-dimensional fluorescence image with a natural-light background.
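The patent does not specify how the computer fuses the fluorescence and natural-light data into one image; a common presentation is a thresholded pseudo-color overlay of the fluorescence channel on the white-light frame. A sketch under that assumption (threshold, blend weight and the green color choice are all illustrative):

```python
import numpy as np

def fuse_fluorescence(white, fluo, threshold=0.1, alpha=0.6):
    """Overlay a normalized fluorescence frame on the white-light frame
    as a green pseudo-color layer.

    white: (H, W, 3) float RGB image in [0, 1].
    fluo:  (H, W) float fluorescence intensity.
    """
    # Normalize fluorescence to [0, 1].
    f = (fluo - fluo.min()) / (fluo.max() - fluo.min() + 1e-12)
    mask = f > threshold                 # suppress background noise
    overlay = white.copy()
    green = np.zeros_like(white)
    green[..., 1] = f                    # map intensity to the green channel
    # Alpha-blend only where the fluorescence signal is significant.
    overlay[mask] = (1 - alpha) * white[mask] + alpha * green[mask]
    return overlay
```

The result is the "two-dimensional fluorescence image with a natural-light background" that the display device shows in real time.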
The spatial registration device comprises several positioning cameras at different angles, fixed relative to the fixing frame through the positioning-camera mounting frame. The positioning cameras at different angles collect two-dimensional surface images of the imaging sample from different directions; the acquired data are transmitted over a data line to the computer, where the three-dimensional surface contour of the imaging sample is reconstructed using the binocular vision principle.
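The binocular reconstruction above amounts to triangulating each surface point from two calibrated positioning cameras. The patent does not name an algorithm; a standard choice is linear (DLT) triangulation. A minimal sketch with illustrative camera parameters (focal length, principal point and baseline are assumptions):

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one surface point seen by two cameras.

    P1, P2: 3x4 camera projection matrices; uv1, uv2: pixel coordinates.
    Returns the 3-D point in world coordinates.
    """
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)      # null space of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

# Illustrative stereo rig: identical intrinsics, 10 cm baseline along x.
K = np.array([[1000, 0, 500], [0, 1000, 500], [0, 0, 1]], float)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
C2 = np.array([0.1, 0.0, 0.0])
P2 = K @ np.hstack([np.eye(3), -C2[:, None]])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_true = np.array([0.2, 0.3, 2.0])   # a point on the sample surface (meters)
X_hat = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

Running this over matched pixels in the two views yields the point cloud that forms the three-dimensional surface contour image.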
Another object of the present invention is to provide a PET-fluorescence bimodal navigation imaging method.
The PET-fluorescence bimodal navigation imaging method provided by the invention comprises the following steps:
1) Coordinate system matching:
matching the coordinate system of the PET imaging device with the coordinate system of the spatial registration device, determining that the position relationship is unchanged, and obtaining a calibrated spatial registration relationship between the spatial registration device and the PET imaging device;
2) The space registration device adopts a plurality of cameras with different angles, and forms a three-dimensional surface contour image of an imaging sample according to a binocular vision principle; meanwhile, the PET imaging device acquires a three-dimensional PET image of the internal structure of an imaging sample;
3) According to the spatial registration relation between the spatial registration device calibrated in advance and the PET imaging device, performing image registration fusion on the three-dimensional surface contour image and the three-dimensional PET image to obtain a three-dimensional fusion image of an imaging target, wherein the three-dimensional fusion image comprises a surface contour and an internal structure;
4) The fluorescence imaging device acquires a two-dimensional fluorescence image of the imaging sample in real time and displays it on the display device; the relation of the fluorescence imaging device's coordinate system to the PET imaging device's coordinate system, and the angle of the normal axis of the fluorescence imaging device's imaging plane in the PET coordinate system, are obtained in real time;
5) The three-dimensional fused image is projected onto the two-dimensional fluorescence image along the normal axis of the fluorescence imaging device's imaging plane, and the depth of the imaging target is displayed on the two-dimensional fluorescence image; by changing the imaging angle of the fluorescence imaging device, the depth of the imaging target below the imaging sample surface is obtained in real time at different angles.
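The projection in step 5 can be sketched as follows: a point from the three-dimensional fused image (e.g. a PET-detected tumor centroid) is projected into the fluorescence image plane, and its depth below the surface is measured along the camera's normal axis. The camera intrinsics K, all coordinates, and the frame construction are illustrative assumptions; the patent gives no numerical details:

```python
import numpy as np

def project_with_depth(target, surface_point, cam_origin, cam_normal, K):
    """Project a 3-D target into the fluorescence image and report its
    depth below the sample surface along the camera's normal axis.

    target, surface_point, cam_origin: 3-vectors in the PET frame.
    cam_normal: unit normal of the fluorescence imaging plane.
    K: 3x3 intrinsics of the fluorescence camera (assumed known).
    """
    n = cam_normal / np.linalg.norm(cam_normal)
    # Build a camera frame whose z-axis is the imaging-plane normal.
    up = np.array([0.0, 1.0, 0.0])
    if abs(n @ up) > 0.9:                       # avoid a degenerate 'up'
        up = np.array([1.0, 0.0, 0.0])
    x_axis = np.cross(up, n); x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(n, x_axis)
    R = np.vstack([x_axis, y_axis, n])          # world-to-camera rotation
    p_cam = R @ (target - cam_origin)
    uv = (K @ p_cam)[:2] / p_cam[2]             # pixel position in the image
    depth = n @ (target - surface_point)        # signed depth along the normal
    return uv, depth

# Illustrative case: camera at the origin looking along +z.
K = np.array([[1000, 0, 256], [0, 1000, 256], [0, 0, 1]], float)
uv, depth = project_with_depth(
    target=np.array([0.0, 0.0, 0.05]),          # tumor centroid
    surface_point=np.array([0.0, 0.0, 0.03]),   # sample surface on the axis
    cam_origin=np.zeros(3),
    cam_normal=np.array([0.0, 0.0, 1.0]),
    K=K,
)
# A target on the optical axis projects to the principal point (256, 256),
# 0.02 m below the surface.
```

Repeating this as the fluorescence mechanical arm moves gives the depth of the target below the sample surface at each viewing angle.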
In step 2), several positioning cameras at different angles collect two-dimensional surface images of the imaging sample from several different directions; the acquired data are transmitted over a data line to the computer, where the three-dimensional surface contour of the imaging sample is reconstructed using the binocular vision principle. At least one pair of PET detectors is arranged symmetrically about the imaging sample; the PET detectors are connected to the data acquisition system, which is connected to the computer. The PET detector captures gamma photons emitted from within the imaging sample, converts the optical signals into electrical signals, and passes them to the data acquisition system, whose data are transmitted over a data line to the computer for centralized processing.
In step 4), an illumination light source and an excitation light source, which excites the fluorescent signal, are fixed around the light-shielding box. The acquisition lens is mounted outside the light-shielding box, in front of the filter set, and is aimed at the imaging sample; it collects data from the imaging sample in real time, the filter set and lens set remove signal noise, and the signal is passed to the light-splitting device. The fluorescence-signal acquisition camera and the white-light-signal acquisition camera are mounted perpendicular to each other and aligned with the light-splitting device. The light-splitting device divides the optical signal into two parts: one part passes through a fluorescence filter and is collected by the fluorescence-signal acquisition camera to give the fluorescence data; the other part passes through a white-light filter and is collected by the white-light-signal acquisition camera to give the natural-light data. The fluorescence data and natural-light data are transmitted over a data line to the computer for fusion, yielding in real time a two-dimensional fluorescence image with a natural-light background.
The imaging system is applied to tumor resection surgery: the imaging sample is the surgical area and the imaging target is the tumor. A two-dimensional fluorescence image obtained by fluorescence imaging provides real-time imaging of superficial tumors in the surgical area, so superficial tumor tissue can be excised accurately. Deep tumors in the surgical area are imaged by three-dimensional PET imaging. The spatial registration device forms a three-dimensional surface contour image, which is registered and fused with the three-dimensional PET image to give a three-dimensional fused image of the surgical area containing both the surface contour and the internal structure. The three-dimensional fused image is projected onto the two-dimensional fluorescence image along the normal axis of the fluorescence imaging device's imaging plane, displaying in real time, on the two-dimensional fluorescence image, the depth of the tumor below the surface of the surgical area at different angles, and thereby guiding the direction and depth of the next surgical resection. As tissue is dissected, deep tumors become exposed at the superficial surface, and fluorescence again guides precise resection. Cycling in this way, PET imaging can be repeated as the operation requires. The two modes thus complement each other during surgery, helping the surgeon locate the focus accurately, reducing the wounded area and improving the therapeutic effect of cancer surgery.
The invention has the advantages that:
(1) A unified mechanical control system controls the PET imaging device and the fluorescence imaging device, giving convenient operation and high freedom of movement, so imaging can be targeted specifically at the surgical site;
(2) The PET imaging device adopts a flat panel detector suitable for rapid intraoperative imaging, and the spatial position of the detector is freely adjusted according to the focus area;
(3) Fluorescence imaging images and localizes superficial tumors, enabling their precise excision;
(4) PET imaging precisely localizes and images deep tumors, identifying deep tumors that the human eye cannot resolve;
(5) Based on the space registration device, the three-dimensional PET image and the two-dimensional fluorescence image are fused in a computer, so that complementation of two imaging modes is realized, and tumors with any depth are accurately positioned;
(6) Combining the three-dimensional fused image with the two-dimensional fluorescence image for real-time image guidance during surgery provides more accurate tumor position information for the patient and achieves R0 tumor resection.
Drawings
FIG. 1 is a schematic diagram of one embodiment of the PET-fluorescence bimodal navigation imaging system of the present invention;
FIG. 2 is a schematic diagram of the motion states of the PET imaging device in one embodiment of the PET-fluorescence bimodal navigation imaging system of the present invention, wherein (a) shows the imaging state, (b) the rotated-away state, and (c) the intra-operative state;
FIG. 3 is a schematic diagram of the PET imaging device of one embodiment of the PET-fluorescence bimodal navigation imaging system of the present invention;
FIG. 4 is a schematic diagram of the fluorescence imaging device of one embodiment of the PET-fluorescence bimodal navigation imaging system of the present invention;
FIG. 5 is a schematic view of the interior of the light-shielding box of the fluorescence imaging device of one embodiment of the PET-fluorescence bimodal navigation imaging system of the present invention;
FIG. 6 is a schematic diagram of matching the coordinate system of the PET imaging device with the coordinate system of the spatial registration device in one embodiment of the PET-fluorescence bimodal navigation imaging method of the present invention, wherein (a) shows the placement of feature registration points on the imaging bed, (b) the three-dimensional PET image used for matching, and (c) the three-dimensional surface contour image of the spatial registration device used for matching.
Detailed Description
The invention will be further illustrated by way of example with reference to the accompanying drawings.
As shown in fig. 1, the PET-fluorescence bimodal navigation imaging system of the present embodiment comprises: a PET imaging device 2, a fluorescence imaging device 3, a spatial registration device 4, a mechanical control frame 1, an imaging bed 5, a computer 6 and a display device 7. The imaging sample is placed on the imaging bed 5, which is mounted on the mechanical control frame 1; the PET imaging device 2, fluorescence imaging device 3 and spatial registration device 4 are each mounted on the mechanical control frame 1 and face the imaging sample; the PET imaging device 2, fluorescence imaging device 3, spatial registration device 4, mechanical control frame 1 and display device 7 are each connected to the computer 6 through data lines.
As shown in fig. 1, the mechanical control frame comprises: a fixing frame 11, a PET connecting arm 12, a displacement device 13, a fluorescence mechanical arm 14 and a positioning-camera mounting frame. The fixing frame 11 is fixed to the ground. The imaging bed 5 is mounted on the fixing frame 11 through the displacement device 13, which is connected to the computer 6 and controls the lifting and translation of the imaging bed. The PET imaging device is mounted on the fixing frame 11 through the PET connecting arm 12, which is connected to the computer 6; during imaging the PET connecting arm rotates in-plane about the imaging sample as its central axis, and after imaging it rotates out of plane away from the imaging bed. The fluorescence imaging device is mounted on the fixing frame by the fluorescence mechanical arm 14, which is connected to the computer 6.
As shown in fig. 2, during PET imaging the PET imaging device 2, under the control of the PET connecting arm 12, positions its opposed PET detectors 21 at a suitable imaging position and imaging angle; after PET imaging is finished, the PET imaging device 2 is rotated away from the surgical position, as shown in fig. 2(a) and (b), to avoid interfering with the operation on the patient. The PET connecting arm 12 is a C-arm.
In the embodiment, the PET-fluorescence bimodal navigation imaging system is applied to navigation imaging in operation, and the process comprises the following four steps:
1) Before the operation, the patient is placed horizontally on the examination bed; when a suspected small lesion is encountered during the operation, a radioactive tracer and a fluorescent contrast agent are injected into the patient;
2) After the agents enter the patient's circulation, the PET connecting arm is rotated above the patient, as shown in fig. 2(a) and (b), and rotational acquisition of the radioactivity image begins about the patient's lying direction as the central axis; the acquisition trajectory may be circular or elliptical, the data are processed in real time by the computer, and the patient serves as the imaging sample;
3) After the acquisition is completed, the PET connecting arm is moved away; guided by the display device 7, the surgeon is assisted in finding the suspected deep lesion in the target organ and guided to cut into the organ to the designated depth;
4) The fluorescence-signal acquisition camera images the incised tissue in real time, guiding the surgeon to resect the small lesions one by one and to clear the lymph nodes, achieving precise R0 tumor resection.
As shown in fig. 3, the PET imaging device 2 comprises a PET detector 21 and a data acquisition system 22. A pair of PET detectors 21 is arranged symmetrically; the PET detectors 21 are connected to the data acquisition system 22, which is connected to the computer 6. The PET detector 21 has an irregular flat-plate structure.
As shown in figs. 4 and 5, the fluorescence imaging device 3 comprises: a fluorescence-signal acquisition camera 34, a white-light-signal acquisition camera 35, an illumination light source 33, an acquisition lens 32, a filter set 36, a lens set 37, a light-splitting device 38 and a light-shielding box 31. The fluorescence-signal acquisition camera 34, white-light-signal acquisition camera 35, filter set 36, lens set 37 and light-splitting device 38 are arranged inside the light-shielding box 31. The illumination light source 33 and an excitation light source, which excites the fluorescent signal, are fixed around the light-shielding box 31. The acquisition lens 32 collects data from the imaging sample in real time; the filter set 36 and lens set 37 remove signal noise, and the signal is passed to the light-splitting device 38. The fluorescence-signal acquisition camera 34 and the white-light-signal acquisition camera 35 are mounted perpendicular to each other and aligned with the light-splitting device 38, which uses a refractive mirror. The light-splitting device 38 divides the optical signal into two parts: one part passes through the fluorescence filter and is collected by the fluorescence-signal acquisition camera 34 to give the fluorescence data; the other part passes through the white-light filter and is collected by the white-light-signal acquisition camera 35 to give the natural-light data. The fluorescence data and natural-light data are transmitted over a data line to the computer 6 for fusion, yielding in real time a two-dimensional fluorescence image with a natural-light background.
In this embodiment, the spatial registration device 4 is mounted on the PET detector 21 by the positioning camera mounting frame, so that its position is fixed relative to the frame. The spatial registration device comprises two (binocular) positioning cameras set at different angles.
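The binocular vision principle that the two positioning cameras rely on reduces, for a rectified camera pair, to standard triangulation: a surface point's depth is Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the horizontal disparity between the two views. The sketch below illustrates only this relation; the numeric focal length, baseline and disparity values are hypothetical.

```python
def stereo_depth(disparity_px, focal_px, baseline_mm):
    """Depth Z (in mm) of a surface point seen by a rectified binocular
    camera pair: Z = f * B / d. `disparity_px` is the horizontal pixel
    offset of the point between the two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_mm / disparity_px

# Hypothetical setup: 1400 px focal length, 70 mm baseline, 20 px disparity
z = stereo_depth(20, 1400, 70)  # -> 4900.0 mm
```

Repeating this over all matched pixel pairs produces the dense three-dimensional surface contour of the imaging sample described above.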
As shown in fig. 6, the coordinate system of the PET imaging device is matched with that of the spatial registration device. Weakly radioactive feature registration points 51 are arranged on the imaging bed 5. Because the resolutions of the PET detector 21 and the positioning cameras of the spatial registration device 4 differ, the three-dimensional PET image shown in fig. 6(b) and the three-dimensional surface contour image from the spatial registration device 4 shown in fig. 6(c) must be rigidly transformed during registration to achieve matched fusion of the bimodal images. If the spatial positions of the feature points in the three-dimensional PET image are x = (x1, x2, x3, x4) and their positions in the three-dimensional surface contour image are y = (y1, y2, y3, y4), the rotation matrix R and the translation matrix T can be solved from the rigid-body transformation formula y = Rx + T; with R and T obtained, the three-dimensional surface contour image and the PET image can be brought into registered fusion.
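The patent states only that R and T are solved from y = Rx + T over the four feature points; it does not name a solver. A standard least-squares solution is the SVD-based Kabsch algorithm, sketched below under that assumption. The function name and the numeric point coordinates are illustrative, not from the disclosure.

```python
import numpy as np

def solve_rigid_transform(x, y):
    """Estimate rotation R and translation T such that y_i = R @ x_i + T,
    given corresponding 3-D feature points x, y of shape (N, 3).
    Least-squares solution via SVD (Kabsch algorithm)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    cx, cy = x.mean(axis=0), y.mean(axis=0)   # centroids of each point set
    H = (x - cx).T @ (y - cy)                 # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = cy - R @ cx
    return R, T

# Four feature registration points in the PET frame (hypothetical values)
x = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
# The same points in the surface-contour frame: rotated 90 deg about z, shifted
theta = np.pi / 2
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([10.0, -5.0, 2.0])
y = x @ R_true.T + T_true

R, T = solve_rigid_transform(x, y)
assert np.allclose(R, R_true) and np.allclose(T, T_true)
```

Once R and T are recovered, applying them to every voxel of the surface contour image brings it into the PET coordinate frame, which is the "registered fusion" the text describes.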
Finally, it should be noted that the purpose of the disclosed embodiments is to aid in further understanding of the invention, but those skilled in the art will appreciate that: various alternatives and modifications are possible without departing from the spirit and scope of the invention and the appended claims. Therefore, the invention should not be limited to the disclosed embodiments, but rather the scope of the invention is defined by the appended claims.

Claims (10)

1. A PET-fluorescent bimodal navigator imaging system, the imaging system comprising: PET imaging device, fluorescence imaging device, space registration device, mechanical control frame, imaging bed, computer and display device; wherein the imaging sample is placed on an imaging bed, the imaging bed being arranged on the mechanical control frame; the PET imaging device, the fluorescence imaging device and the space registration device are respectively arranged on the mechanical control frame and face the imaging sample; the PET imaging device, the fluorescence imaging device, the spatial registration device, the mechanical control frame and the display device are respectively connected to the computer through data lines; the space registration device adopts a plurality of cameras with different angles, and forms a three-dimensional surface contour image of an imaging sample according to a binocular vision principle; the PET imaging device acquires a three-dimensional PET image of an internal structure of an imaging sample; according to the spatial registration relation between the spatial registration device calibrated in advance and the PET imaging device, performing image registration fusion on the three-dimensional surface contour image and the three-dimensional PET image to obtain a three-dimensional fusion image of an imaging target, wherein the three-dimensional fusion image comprises a surface contour and an internal structure; the fluorescence imaging device acquires a two-dimensional fluorescence image of the imaging sample in real time and displays the two-dimensional fluorescence image on the display device; projecting the three-dimensional fusion image onto a two-dimensional fluorescent image along a normal axis of an imaging surface of a fluorescent imaging device, and obtaining depth information of an imaging target on the two-dimensional fluorescent image; the imaging angle of the fluorescent imaging device is changed, and the depth information of the imaging target from the imaging sample surface under different angles is obtained in real time.
2. The imaging system of claim 1, wherein the mechanical control mount comprises: the device comprises a fixing frame, a PET connecting arm, a displacement device, a fluorescent mechanical arm and a positioning camera mounting frame; wherein the fixing frame is fixed on the bottom surface; the imaging bed is arranged on the fixing frame through a displacement device, the displacement device is connected to a computer, and the computer controls the lifting and the translation of the imaging bed through the displacement device; the PET imaging device is arranged on the fixing frame through a PET connecting arm, the PET connecting arm is connected to the computer, the PET connecting arm rotates in plane by taking an imaging sample as a central shaft during imaging, and the PET imaging device rotates out of plane to leave an imaging bed after imaging; the fluorescent imaging device is arranged on the fixed frame through a fluorescent mechanical arm, the fluorescent mechanical arm is connected to a computer, the computer controls the fluorescent imaging device to move in real time through the fluorescent mechanical arm, and the position of the origin of a coordinate system of the fluorescent imaging device and the angle of the normal axis of the imaging surface are obtained in real time; the spatial registration device is fixed in relative position with the fixing frame through the positioning camera mounting frame.
3. The imaging system of claim 1, wherein the PET imaging device comprises: a PET detector and a data acquisition system; wherein, at least one pair of PET detectors is symmetrically arranged about the imaging sample, the PET detectors are connected to a data acquisition system, and the data acquisition system is connected to a computer; the PET detector collects gamma photon signals in an imaging sample body, converts the optical signals into electric signals and transmits the electric signals to the data acquisition system, and data of the data acquisition system are transmitted to the computer for centralized processing through a data line.
4. The imaging system of claim 3, wherein the PET detector is shaped as a flat panel structure or an irregular flat panel structure; the irregular flat plate structure adopts a semicircular shape, an arc shape or an L shape.
5. The imaging system of claim 1, wherein the fluorescence imaging device comprises: the device comprises a fluorescent signal acquisition camera, a white light signal acquisition camera, an illumination light source, an acquisition lens, a filter set, a lens set, a light splitting device and a light shielding box; the fluorescent signal acquisition camera, the white light signal acquisition camera, the optical filter set, the lens set and the light splitting device are arranged in the light shielding box; an illumination light source is fixed around the shading box, and an excitation light source for providing fluorescent signals is provided; the collecting lens is arranged outside the light shielding box and in front of the optical filter set, the collecting lens is aligned with the imaging sample, data of the imaging sample are collected in real time, signal noise is removed through the optical filter set and the lens set, and signals are transmitted to the light splitting device; the fluorescent signal acquisition camera and the white light signal acquisition camera are vertically arranged and aligned to the light splitting device; the optical signal is divided into two parts by the light splitting device, and one part of the signal passes through the fluorescent filter and is collected by the fluorescent signal collecting camera to obtain fluorescent data; the other part of signals pass through a white light filter and are collected by a white light signal collection camera to obtain natural light data; and transmitting the fluorescence data and the natural light data to a computer through a data line for fusion, and obtaining a two-dimensional fluorescence image with a natural light background in real time.
6. The imaging system of claim 1, wherein the spatial registration means comprises a plurality of differently angled positioning cameras fixed in relative position to the mount by a positioning camera mount; the positioning cameras with different angles collect two-dimensional surface images of the imaging sample in different directions; the acquired data are transmitted to a computer through a data line, and a three-dimensional surface profile set of an imaging sample is obtained through reconstruction by utilizing the binocular vision principle.
7. A PET-fluorescent bimodal navigator imaging method, characterized in that it comprises the steps of:
1) Coordinate system matching:
matching the coordinate system of the PET imaging device with the coordinate system of the spatial registration device, determining that the position relationship is unchanged, and obtaining a calibrated spatial registration relationship between the spatial registration device and the PET imaging device;
2) The space registration device adopts a plurality of cameras with different angles, and forms a three-dimensional surface contour image of an imaging sample according to a binocular vision principle; meanwhile, the PET imaging device acquires a three-dimensional PET image of the internal structure of an imaging sample;
3) According to the spatial registration relation between the spatial registration device calibrated in advance and the PET imaging device, performing image registration fusion on the three-dimensional surface contour image and the three-dimensional PET image to obtain a three-dimensional fusion image of an imaging target, wherein the three-dimensional fusion image comprises a surface contour and an internal structure;
4) The fluorescence imaging device acquires a two-dimensional fluorescence image of the imaging sample in real time and displays the two-dimensional fluorescence image on the display device; and obtaining the relation of the coordinate system of the fluorescent imaging device relative to the coordinate system of the PET imaging device and the angle of the normal axis of the imaging surface of the fluorescent imaging device under the coordinate system of the PET imaging device in real time;
5) Projecting the three-dimensional fusion image onto a two-dimensional fluorescent image along a normal axis of an imaging surface of a fluorescent imaging device, and displaying depth information of an imaging target on the two-dimensional fluorescent image; the imaging angle of the fluorescent imaging device is changed, and the depth information of the imaging target from the imaging sample surface under different angles is obtained in real time.
8. The imaging method of claim 7, wherein in step 2), a plurality of different-angle positioning cameras acquire two-dimensional surface images of the imaging sample in a plurality of different directions; the acquired data are transmitted to a computer through a data line, and a three-dimensional surface profile set of an imaging sample is obtained through reconstruction by utilizing the binocular vision principle.
9. The imaging method of claim 7, wherein in step 2) at least one pair of PET detectors are symmetrically arranged about the imaged sample, the PET detectors being connected to a data acquisition system, the data acquisition system being connected to a computer; the PET detector collects gamma photon signals in an imaging sample body, converts the optical signals into electric signals and transmits the electric signals to the data acquisition system, and data of the data acquisition system are transmitted to the computer for centralized processing through a data line.
10. The imaging method of claim 7, wherein in step 4), illumination sources are fixed around the light shielding box, excitation sources providing fluorescent signals; the collecting lens is arranged outside the light shielding box and in front of the optical filter set, the collecting lens is aligned with the imaging sample, data of the imaging sample are collected in real time, signal noise is removed through the optical filter set and the lens set, and signals are transmitted to the light splitting device; the fluorescent signal acquisition camera and the white light signal acquisition camera are vertically arranged and aligned with the light splitting device; the optical signal is divided into two parts by the light splitting device, and one part of the signal passes through the fluorescent filter and is collected by the fluorescent signal collecting camera to obtain fluorescent data; the other part of signals pass through a white light filter and are collected by a white light signal collection camera to obtain natural light data; and transmitting the fluorescence data and the natural light data to a computer through a data line for fusion, and obtaining a two-dimensional fluorescence image with a natural light background in real time.
CN201611046753.1A 2016-11-23 2016-11-23 PET-fluorescence bimodal intra-operative navigation imaging system and imaging method thereof Active CN106420057B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611046753.1A CN106420057B (en) 2016-11-23 2016-11-23 PET-fluorescence bimodal intra-operative navigation imaging system and imaging method thereof

Publications (2)

Publication Number Publication Date
CN106420057A CN106420057A (en) 2017-02-22
CN106420057B true CN106420057B (en) 2023-09-08

Family

ID=58219282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611046753.1A Active CN106420057B (en) 2016-11-23 2016-11-23 PET-fluorescence bimodal intra-operative navigation imaging system and imaging method thereof

Country Status (1)

Country Link
CN (1) CN106420057B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10751133B2 (en) * 2017-03-31 2020-08-25 Koninklijke Philips N.V. Markerless robot tracking systems, controllers and methods
CN109009201B (en) * 2018-08-31 2023-12-29 北京锐视康科技发展有限公司 Flat PET limited angle sub-image positioning system and positioning method thereof
CN110720985A (en) * 2019-11-13 2020-01-24 安徽领航智睿科技有限公司 Multi-mode guided surgical navigation method and system
CN113952033B (en) * 2021-12-21 2022-04-19 广东欧谱曼迪科技有限公司 Double-source endoscopic surgery navigation system and method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102076259A (en) * 2008-04-26 2011-05-25 直观外科手术操作公司 Augmented stereoscopic visualization for a surgical robot
CN102488493A (en) * 2011-11-15 2012-06-13 西安电子科技大学 Small animal living body multi-mode molecule imaging system and imaging method
CN103610471A (en) * 2013-12-16 2014-03-05 中国科学院自动化研究所 Optical multi-modal imaging system and method
CN103815928A (en) * 2014-03-18 2014-05-28 北京大学 Image registration device and method for multi-model imaging system
CN104000617A (en) * 2014-04-18 2014-08-27 西安电子科技大学 Multi-modal in-vivo imaging system for small animals and small animal imaging method
WO2016127173A1 (en) * 2015-02-06 2016-08-11 The University Of Akron Optical imaging system and methods thereof
WO2016130424A1 (en) * 2015-02-09 2016-08-18 The Arizona Board Of Regents Of Regents On Behalf Of The University Of Arizona Augmented stereoscopic microscopy

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005036322A1 (en) * 2005-07-29 2007-02-15 Siemens Ag Intraoperative registration method for intraoperative image data sets, involves spatial calibration of optical three-dimensional sensor system with intraoperative imaging modality
US20080300478A1 (en) * 2007-05-30 2008-12-04 General Electric Company System and method for displaying real-time state of imaged anatomy during a surgical procedure
EP2501320A4 (en) * 2009-11-19 2014-03-26 Univ Johns Hopkins Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Shuangquan Liu et al. "A Dual Modality System for Simultaneous Fluorescence and PET Imaging of Small Animals." IEEE Transactions on Nuclear Science, 2011, vol. 58, no. 1, pp. 51-57. *



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant