CN104305957B - Head-mounted molecular image navigation system


Info

Publication number
CN104305957B
CN104305957B (application CN201410433156.9A)
Authority
CN
China
Prior art keywords
image
module
light source
registration
infrared fluorescent
Prior art date
Legal status
Active
Application number
CN201410433156.9A
Other languages
Chinese (zh)
Other versions
CN104305957A (en)
Inventor
田捷
迟崇巍
杨鑫
Current Assignee
Institute of Automation of Chinese Academy of Science
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science
Priority to CN201410433156.9A
Publication of CN104305957A
Application granted
Publication of CN104305957B
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES; A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B: DIAGNOSIS; SURGERY; IDENTIFICATION; A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0035: Imaging apparatus adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
    • A61B 5/0037: Performing a preliminary scan, e.g. a prescan for identifying a region of interest
    • A61B 5/0075: Measuring for diagnostic purposes using light, by spectroscopy, e.g. Raman spectroscopy, infrared absorption spectroscopy
    • A61B 5/6803: Sensor mounted on worn items; head-worn items, e.g. helmets, masks, headphones or goggles
    • A61B 5/742: Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B 5/7405: Details of notification to user or communication with user or patient; user input means using sound

Abstract

A head-mounted molecular image navigation system, comprising: a multispectral light source module for illuminating a detection region with visible and near-infrared light; a signal acquisition module for acquiring near-infrared fluorescence images and visible-light images of the imaged object; a head-mounted support module for carrying the multispectral light source module and the signal acquisition module and for adjusting their illumination of the detection region; and an image processing module for fusing the acquired near-infrared fluorescence images with the visible-light images and outputting the fused image. Embodiments of the present invention make the equipment markedly more flexible to use in imaging applications and extend the application space of optical molecular image navigation.

Description

Head-mounted molecular image navigation system
Technical field
The present invention relates to an imaging system, and in particular to a head-mounted molecular image navigation system.
Background technology
As a new method and means of non-invasive visualization, molecular imaging reflects, at the molecular level, changes in the physiological molecules of an organism and the resulting changes in overall function caused by molecular regulation. Studying the vital activity of genes, biomacromolecules and cells in vivo on the molecular scale is therefore an important technology, and basic research on molecular engineering, tomographic imaging, optical imaging and in vivo bioluminescence imaging has become one of the focuses, and difficulties, of molecular imaging research.
Molecular imaging equipment combines traditional medical imaging technology with modern molecular biology, making it possible to observe physiological or pathological changes at the cellular and molecular level, with the advantages of non-invasive, real-time, in vivo, highly specific, highly sensitive and high-resolution imaging. On the one hand, molecular imaging can greatly accelerate drug development and shorten pre-clinical study time, provide more accurate diagnosis, match treatment plans to a patient's gene profile, and help pharmaceutical companies develop personalized medicines; on the other hand, it can be applied in the biomedical field to achieve quantitative in vivo analysis, image-guided navigation, molecular typing and similar goals. However, systems built in this way are relatively complex, and their ease of handling and comfort still need further improvement.
The present invention therefore proposes a head-mounted molecular image navigation system that detects molecular images of in vivo targets by multispectral excitation, broadening its range of applications.
Summary of the invention
The invention provides a head-mounted molecular image navigation system, comprising:
a multispectral light source module for illuminating a detection region with visible and near-infrared light;
a signal acquisition module for acquiring near-infrared fluorescence images and visible-light images of the imaged object;
a head-mounted support module for carrying the multispectral light source module and the signal acquisition module and for adjusting their illumination of the detection region;
an image processing module for fusing the acquired near-infrared fluorescence images with the visible-light images and outputting the fused image.
Embodiments of the invention have the following technical effects:
1. Molecular image navigation and molecular imaging are realized in a head-worn form, improving convenience while retaining full functionality.
2. Projection imaging lets the operator pre-judge the imaging range, adding a human-machine interaction function.
3. Speech recognition frees the operator's hands while using the system, allowing precise control of the head-mounted molecular image navigation system.
4. The threshold-decomposition feature extraction method significantly improves the signal-to-background ratio, helping the operator perform accurate real-time image-guided operations.
Brief description of the drawings
Fig. 1 is a schematic structural view of the head-mounted support module according to an embodiment of the present invention;
Fig. 2 is a block diagram of the head-mounted molecular image navigation system according to an embodiment of the present invention;
Fig. 3 is a flow chart of the image processing method of the head-mounted molecular image navigation system according to an embodiment of the present invention.
Detailed description of the invention
To make the object, technical solutions and advantages of the present invention clearer, the present invention is described in more detail below with reference to specific embodiments and the accompanying drawings.
The embodiments of the present invention are based on excitation fluorescence imaging in molecular imaging and provide a head-mounted molecular image navigation system.
Fig. 1 is a schematic structural view of the head-mounted support module according to an embodiment of the present invention. Fig. 2 is a block diagram of the head-mounted molecular image navigation system according to an embodiment of the present invention. As shown in Fig. 2, the head-mounted molecular image navigation system may include: a multispectral light source module 110 for providing light in multiple spectral bands to illuminate the detected object; an optical signal acquisition module 120 for real-time acquisition of excitation fluorescence images and visible-light images of the detected object; a head-mounted support module 130 for adjusting wearing comfort for the operator and ensuring safe and effective imaging; and an image processing module 140 for performing image segmentation, feature extraction, image registration and related processing, fusing the visible-light and fluorescence images, and outputting the fused image.
The operation of the multispectral light source module 110, the optical signal acquisition module 120, the head-mounted support module 130 and the image processing module 140 is described in detail next.
The multispectral light source module 110 may include a cold light source 111, a near-infrared laser 112 and a light source coupler 113. The cold light source 111 emits visible light toward the detected object; a first bandpass filter may be placed on it so as to pass visible light with wavelengths of 400-650 nm. The near-infrared laser 112 is configured to emit near-infrared light with a center wavelength of, for example, 785 nm, and the excitation light can be led out through an optical fiber. Those skilled in the art will understand that embodiments of the present invention are not limited to this implementation; other ways known in the art of emitting visible and near-infrared light may also be used. When the detection region is excited, a spectral separation method lets a single optical fiber carry the output of the cold light source 111 and the near-infrared laser 112 simultaneously. Specifically, the light emitted by the visible and near-infrared light sources is coupled at the light exit port, where the light source coupler 113 is arranged. The light source coupler 113 may be a diverging lens that turns a straight point-like source into a cone beam, enlarging the illuminated area so that the excitation light illuminates the detection region uniformly. For example, an optical lens may be arranged at the exit port of the near-infrared laser 112 and reverse-coupled to the laser output, giving the source a larger divergence angle. One end of the optical fiber may be mechanically fixed to the optical lens, and the other end of the fiber is connected to the head-mounted support module 130.
The optical signal acquisition module 120 may include a camera 121, a lens 122 and a coordinate projector 123. The camera 121 is configured to acquire both the near-infrared fluorescence signal and the visible-light signal; during acquisition the background is illuminated by the cold light source. For example, the reference parameters for near-infrared signal acquisition may be set as follows: quantum efficiency above 30% at 800 nm, frame rate above 30 fps, and pixel size (that is, the size of the camera 121's smallest photosensitive unit) above 5 microns. Preferably, a second bandpass filter is placed between the camera 121 and the lens 122 so as to pass near-infrared light of 810-870 nm. When the camera 121 operates, the coordinate projector 123 can project a circular outline onto the detection region (not shown); this outline marks the maximum extent of the field of view, letting the operator see the system's detection region and, at the same time, the excitation range of the multispectral light source module 110.
As shown in Fig. 1, the head-mounted support module 130 may include a head-mounted support frame 131, which carries the light source module 110 and the signal acquisition module 120. Preferably, the head-mounted support module 130 may also include a speech recognition and control module 132, which may comprise a microphone, a speech recognition unit and a control unit (not shown), so that the operator can control the multispectral light source module 110, the coordinate projector 123 and other modules by voice. Speech recognition technology well known in the art may be used to implement the speech recognition and control module 132.
The visible-light and near-infrared fluorescence images of the detected object from the optical signal acquisition module 120 are input to the image processing module 140 in real time. The image processing module 140 is implemented on a back-end computer; acquisition and light source control can also be operated manually from the back end. The image processing module 140 first pre-processes the input near-infrared fluorescence image in order to obtain the feature distribution of the fluorescence image according to its fluorescence specificity. Pre-processing may include noise removal, feature extraction and dead-pixel compensation; the visible-light image may, of course, also be pre-processed by methods well known in the art. Threshold segmentation can be used for feature extraction on the input near-infrared fluorescence image: for pixels of the near-infrared fluorescence image whose ratio of image gray value G to background noise gray value Gn is higher than 1.5, the gray value of the pixel is multiplied by 2, and for pixels with G/Gn lower than 1.5 the gray value is divided by 2. This threshold segmentation enhances the feature points. Regions of interest whose gray values exceed a predetermined threshold can then be converted into pseudo-color images by gray-to-pseudo-color mapping algorithms well known in the art, further marking the positions of feature points and feature regions so that the operator can carry out the procedure under image guidance. The output of the image processing module 140 is the fused image; the general-purpose computer provides display and projection interfaces so that the operator can display the output image. The visual signal can also be fed back to the head-mounted system, where a mirror projection screen makes the fused image visible to the wearer.
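As an illustration of this pre-processing step, a minimal Python sketch is given below; the NumPy/OpenCV implementation, the median-based background-noise estimate, the JET color map and the ROI threshold of 128 are illustrative assumptions, while the 1.5 ratio threshold and the multiply-by-2/divide-by-2 scaling follow the description.

```python
import numpy as np
import cv2  # OpenCV, used here only for the pseudo-color mapping


def enhance_fluorescence(fluo, background_noise=None, ratio_threshold=1.5):
    """Threshold-based feature enhancement of a near-infrared fluorescence image.

    Pixels whose gray value G exceeds ratio_threshold times the background noise
    level Gn are doubled, the rest are halved, raising the signal-to-background ratio.
    """
    fluo = fluo.astype(np.float32)
    if background_noise is None:
        # Illustrative noise estimate: median gray value of the frame (an assumption).
        background_noise = float(np.median(fluo)) + 1e-6
    ratio = fluo / background_noise
    enhanced = np.where(ratio > ratio_threshold, fluo * 2.0, fluo / 2.0)
    return np.clip(enhanced, 0, 255).astype(np.uint8)


def pseudo_color_roi(enhanced, roi_threshold=128):
    """Map regions of interest above a gray threshold to a pseudo-color image."""
    colored = cv2.applyColorMap(enhanced, cv2.COLORMAP_JET)
    colored[enhanced < roi_threshold] = 0  # keep pseudo-color only inside the ROI
    return colored
```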
The optical-characteristic distribution obtained for the fluorescence image is then used to fuse the fluorescence image with the input visible-light image, and the fused result image is output. Specifically, fusing the fluorescence image with the visible-light image includes registering the fluorescence image to the visible-light image using its optical-characteristic distribution. This registration is described in detail below.
The optical-characteristic distribution of the fluorescence image carries the fluorescence specificity, while the visible-light image is a high-resolution structural image; the image registration of this embodiment makes use of both properties. During registration, morphological theory can be used to adjust the energy function minimized over the fluorescence optical-characteristic distribution so that its shape approaches that of the imaged tissue. Registration can be performed with the following formula (1):
E(U) = \|\Delta_d U\|^2 + \beta \sum_{i=1}^{n} \|U_i - W_i\|^2        (1)
In formula (1), Δ_d is the discrete Laplace operator and U is the position vector; n surface points are selected as principal marker points, p_i and a_i are the corresponding imaging-surface marker points, and W_i = (p_i - a_i) is the displacement vector. Minimizing E(U) yields the vector U_p, which gives the position after surface deformation.
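For concreteness, a minimal NumPy sketch of evaluating and minimizing the energy in formula (1) follows; the given graph-Laplacian matrix L and the plain gradient-descent solver are illustrative assumptions rather than the solver prescribed by the embodiment.

```python
import numpy as np


def registration_energy(U, L, W, beta):
    """Energy of formula (1): ||L @ U||^2 + beta * sum_i ||U_i - W_i||^2.

    U : (n, 3) candidate displacement field at the n marker points
    L : (n, n) discrete Laplacian over the marker-point mesh (assumed given)
    W : (n, 3) displacement vectors W_i = p_i - a_i between matched markers
    """
    smoothness = np.sum((L @ U) ** 2)
    data_term = beta * np.sum((U - W) ** 2)
    return smoothness + data_term


def minimize_energy(L, W, beta, steps=500, lr=1e-2):
    """Plain gradient descent on E(U); the solver choice is illustrative."""
    U = np.zeros_like(W, dtype=float)
    for _ in range(steps):
        grad = 2.0 * (L.T @ (L @ U)) + 2.0 * beta * (U - W)
        U -= lr * grad
    return U
```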
To obtain an accurate, high-resolution fused image, the image-overlap measure shown in formula (2) below is used during registration as the criterion for evaluating registration quality.
\frac{2|A \cap B|}{|A| + |B|}        (2)
Here A is the normalized gray-value matrix of the visible-light image and B is the normalized gray-value matrix of the fluorescence image. The closer the result is to 1, the better the registration.
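A minimal sketch of this overlap measure (a Dice-style coefficient) might look as follows; treating |A ∩ B| as the sum of element-wise minima of the normalized gray-value matrices is an interpretation made for the example, not something spelled out above.

```python
import numpy as np


def registration_overlap(visible, fluorescence):
    """Dice-style overlap of formula (2); values close to 1 indicate good registration."""
    A = visible.astype(np.float64)
    B = fluorescence.astype(np.float64)
    A /= A.max() + 1e-12   # normalize gray values to [0, 1]
    B /= B.max() + 1e-12
    intersection = np.minimum(A, B).sum()  # element-wise overlap (an interpretation)
    return 2.0 * intersection / (A.sum() + B.sum() + 1e-12)
```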
Fig. 3 shows a flow chart of the image processing method according to an embodiment of the present invention. As shown in Fig. 3, in step 301 the pre-processed visible-light image sequence and fluorescence image sequence are checked for spatial motion in order to filter out unmatched, slightly displaced frames, yielding the visible-light image sequence M1 and the fluorescence image sequence M2.
Optionally, in step 303 an image pyramid P1 is formed from the high-resolution visible-light image sequence M1 obtained in step 301, reducing the data volume and improving the real-time performance of the image processing. Specifically, a Gaussian pyramid is used to down-sample the images, generating pyramid layer i+1 from layer i: layer i is first convolved with a Gaussian kernel, and then all even rows and columns are removed, so each newly obtained image is one quarter the size of the previous level. To go back up a level, the image is first expanded to twice its size in each dimension, with the new (even) rows filled with zeros, and then convolved with the specified filter (in fact a filter scaled by two in each dimension) to estimate approximate values for the "missing" pixels. Applying this procedure repeatedly to the input images produces the whole pyramid.
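A minimal sketch of building such a Gaussian pyramid with OpenCV is shown below; the number of levels is an illustrative parameter.

```python
import cv2


def build_gaussian_pyramid(image, levels=3):
    """Repeatedly down-sample with pyrDown (Gaussian blur, then drop even rows
    and columns), so each level has roughly a quarter of the previous pixels."""
    pyramid = [image]
    for _ in range(levels):
        pyramid.append(cv2.pyrDown(pyramid[-1]))
    return pyramid
```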
In step 305, gradient edge detection, for example with the Roberts operator, is applied to the image pyramid P1 and to the fluorescence image sequence M2, giving image edges E1 and E2 respectively. Where sufficient processing power is available, step 303 can of course be skipped and edge detection applied directly to the visible-light image sequence M1 and the fluorescence image sequence M2.
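For reference, a minimal Roberts-cross edge detector is sketched below; the SciPy convolution is an implementation convenience, not a requirement of the method.

```python
import numpy as np
from scipy.ndimage import convolve


def roberts_edges(image):
    """Gradient-magnitude edge map using the 2x2 Roberts cross kernels."""
    img = image.astype(np.float64)
    gx = convolve(img, np.array([[1.0, 0.0], [0.0, -1.0]]))
    gy = convolve(img, np.array([[0.0, 1.0], [-1.0, 0.0]]))
    return np.hypot(gx, gy)
```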
In step 307, saliency-based sparse sampling is applied to the image edges E1 and E2. The same method can be used for both; here a compressed-sensing sparse sampling technique samples E1 and E2, giving the sampled outputs S1 and S2 respectively.
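One simple reading of saliency-based sparse sampling, namely keeping only the strongest edge responses as sample points, is sketched below; the compressed-sensing formulation mentioned above is not reproduced here, and the keep_fraction parameter is an illustrative assumption.

```python
import numpy as np


def sparse_sample_edges(edge_map, keep_fraction=0.05):
    """Keep only the most salient edge responses as a sparse point set.

    Returns (coords, values): pixel coordinates and magnitudes of the strongest
    keep_fraction of edge responses.
    """
    flat = edge_map.ravel()
    k = max(1, int(keep_fraction * flat.size))
    idx = np.argpartition(flat, -k)[-k:]  # indices of the k largest responses
    coords = np.column_stack(np.unravel_index(idx, edge_map.shape))
    return coords, flat[idx]
```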
In step 308, registration is performed on the sampled outputs S1 and S2 obtained in step 307. Besides registration using formulas (1) and (2) above, point cloud registration can also be used to further optimize the registration result. For details of point cloud registration see "Xue Yaohong et al., Point Cloud Data Registration and Surface Subdivision, National Defense Industry Press, 2011"; it is not repeated here.
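One common form of point cloud registration is the iterative closest point (ICP) algorithm; a minimal rigid-ICP sketch over two point sets is given below as an illustration, with the cited reference covering more elaborate variants.

```python
import numpy as np
from scipy.spatial import cKDTree


def icp(source, target, iterations=30):
    """Rigid ICP: repeatedly match nearest neighbors and solve for R, t by SVD."""
    src = source.astype(np.float64).copy()
    tgt = target.astype(np.float64)
    tree = cKDTree(tgt)
    dim = src.shape[1]
    R_total, t_total = np.eye(dim), np.zeros(dim)
    for _ in range(iterations):
        _, nn = tree.query(src)            # closest target point for each source point
        matched = tgt[nn]
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:           # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```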
Preferably, the image processing method of the present invention may also include step 309, in which the point cloud registration result is checked for algorithmic convergence to ensure that the computation is stable and reliable.
Preferably, the processing of steps 301, 303, 305 and 309 can be performed by a compact image GPU or FPGA, while the more capable central processing unit (CPU) performs registration step 308, further optimizing system performance while reducing the required hardware size.
The specific embodiments described above explain the object, technical solutions and beneficial effects of the present invention in detail. It should be understood that the foregoing is only a specific embodiment of the present invention and does not limit it; any modification, equivalent substitution or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (10)

1. A head-mounted molecular image navigation system, comprising:
a multispectral light source module for illuminating a detection region with visible and near-infrared light;
a signal acquisition module for acquiring near-infrared fluorescence images and visible-light images of the imaged object;
a head-mounted support module for carrying the multispectral light source module and the signal acquisition module and for adjusting the illumination range of the multispectral light source module over the detection region;
an image processing module for fusing the acquired near-infrared fluorescence images with the visible-light images and outputting the fused image;
wherein the image processing module's fusion of the acquired near-infrared fluorescence image and visible-light image comprises obtaining the optical-characteristic distribution of the near-infrared fluorescence image by minimizing the following energy function:
E(U) = \|\Delta_d U\|^2 + \beta \sum_{i=1}^{n} \|U_i - W_i\|^2
wherein Δ_d is the discrete Laplace operator, U is the position vector, n surface points are selected as principal marker points, p_i and a_i are respectively the corresponding imaging-surface marker points, W_i = (p_i - a_i) is the displacement vector, U_i is the position vector of the i-th marker point, β is a weight coefficient, and the vector U_p obtained by minimizing E(U) gives the position after surface deformation.
2. The system according to claim 1, wherein the multispectral light source module comprises:
a visible light source for emitting visible light toward the detected object;
a near-infrared laser for emitting near-infrared light toward the detected object; and
a light source coupler;
wherein the light source coupler couples the visible light and the near-infrared light, and the coupled light is delivered through a single optical fiber to the head-mounted support module.
3. The system according to claim 2, wherein the head-mounted support module comprises:
a head-mounted support frame for carrying the multispectral light source module and the signal acquisition module; and
a speech control module for controlling the operation of the multispectral light source module so as to form a detection region of the desired extent.
4. The system according to claim 1, wherein the image processing module performs feature extraction on the acquired near-infrared fluorescence image, comprising:
for pixels whose ratio of image gray value G to background noise gray value Gn is higher than 1.5, multiplying the gray value of the pixel by 2; and for pixels with G/Gn lower than 1.5, dividing the gray value of the pixel by 2.
5. The system according to claim 1, wherein the image processing module fuses the acquired near-infrared fluorescence image with the visible-light image, comprising using the image-overlap measure shown below as the criterion for evaluating registration quality:
\frac{2|A \cap B|}{|A| + |B|}
wherein A is the normalized gray-value matrix of the visible-light image and B is the normalized gray-value matrix of the fluorescence image.
6. The system according to claim 4, wherein the image processing module further fuses the near-infrared fluorescence image with the visible-light image by point cloud registration.
7. An image processing method applied to the head-mounted molecular image navigation system of claim 1, comprising:
performing spatial motion detection on the visible-light image sequence and the near-infrared fluorescence image sequence so as to filter out unmatched, slightly displaced frames (301);
down-sampling the visible-light image sequence that has undergone spatial motion detection to obtain an image pyramid (303);
applying gradient edge detection to the obtained image pyramid and to the near-infrared fluorescence image sequence respectively, obtaining image edges (305);
applying saliency-based sparse sampling to the obtained image edges respectively, thereby obtaining sampled outputs (307); and
performing registration on the obtained sampled outputs to carry out image fusion (308).
8. The method according to claim 7, wherein the registration comprises using the image-overlap measure shown below as the criterion for evaluating registration quality:
\frac{2|A \cap B|}{|A| + |B|}
wherein A is the normalized gray-value matrix of the visible-light image and B is the normalized gray-value matrix of the near-infrared fluorescence image.
9. The method according to claim 7, further comprising using point cloud registration to further register the near-infrared fluorescence image with the visible-light image.
10. The method according to claim 9, further comprising verifying the algorithmic convergence of the point cloud registration result.
CN201410433156.9A 2014-08-28 2014-08-28 Head-mounted molecular image navigation system Active CN104305957B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410433156.9A 2014-08-28 2014-08-28 Head-mounted molecular image navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410433156.9A 2014-08-28 2014-08-28 Head-mounted molecular image navigation system

Publications (2)

Publication Number Publication Date
CN104305957A CN104305957A (en) 2015-01-28
CN104305957B 2016-09-28

Family

ID=52361385

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410433156.9A Active CN104305957B (en) 2014-08-28 2014-08-28 Head-mounted molecular image navigation system

Country Status (1)

Country Link
CN (1) CN104305957B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105342561B (en) * 2015-10-09 2017-12-29 中国科学院自动化研究所 The wearable molecular image navigation system of Wireless sound control
US10026202B2 (en) 2015-10-09 2018-07-17 Institute Of Automation, Chinese Academy Of Sciences Wearable molecular imaging navigation system
CN105640481B (en) * 2015-12-31 2019-05-14 东莞广州中医药大学中医药数理工程研究院 A kind of hole key observation device and its acoustic-controlled method with acoustic control light source
CN106037674B (en) * 2016-08-18 2018-10-30 皖江新兴产业技术发展中心 A kind of vein imaging system based on high light spectrum image-forming
CN107374730A (en) * 2017-09-06 2017-11-24 东北大学 Optical operation navigation system
CN109662695A (en) * 2019-01-16 2019-04-23 北京数字精准医疗科技有限公司 Fluorescent molecules imaging system, device, method and storage medium
CN109938700A (en) * 2019-04-04 2019-06-28 济南显微智能科技有限公司 A kind of wear-type IR fluorescence detection device
CN110226974A (en) * 2019-07-08 2019-09-13 中国科学技术大学 A kind of near-infrared fluorescence imaging system based on augmented reality

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE60024059T2 (en) * 1999-01-26 2006-07-20 Newton Laboratories, Inc., Woburn DEVICE FOR AUTOFLUORESCENT IMAGING FOR AN ENDOSCOPE
JP4971816B2 (en) * 2007-02-05 2012-07-11 三洋電機株式会社 Imaging device
CN101339653B (en) * 2008-01-30 2010-06-02 西安电子科技大学 Infrared and colorful visual light image fusion method based on color transfer and entropy information
CN102722556B (en) * 2012-05-29 2014-10-22 清华大学 Model comparison method based on similarity measurement
CN103489005B (en) * 2013-09-30 2017-04-05 河海大学 A kind of Classification of High Resolution Satellite Images method based on multiple Classifiers Combination
CN103530038A (en) * 2013-10-23 2014-01-22 叶晨光 Program control method and device for head-mounted intelligent terminal
CN203709999U (en) * 2014-02-07 2014-07-16 王学庆 Headwear venipuncture guide dual-light source system device
CN204072055U (en) * 2014-08-28 2015-01-07 中国科学院自动化研究所 Wear-type molecular image navigation system

Also Published As

Publication number Publication date
CN104305957A (en) 2015-01-28

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant