CN104305957A - Head-mounted molecular image navigation system
- Publication number
- CN104305957A (application number CN201410433156.9A)
- Authority
- CN
- China
- Prior art keywords
- image
- registration
- light source
- module
- infrared fluorescent
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0035—Features or image-related aspects of imaging apparatus adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
- A61B5/0037—Performing a preliminary scan, e.g. a prescan for identifying a region of interest
- A61B5/0075—Measuring for diagnostic purposes using light, by spectroscopy, e.g. Raman spectroscopy, infrared absorption spectroscopy
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
- A61B5/742—Details of notification to user or communication with user or patient using visual displays
- A61B5/7405—Details of notification to user or communication with user or patient using sound
Abstract
The invention relates to a head-mounted molecular image navigation system comprising a multispectral light source module, a signal acquisition module, a head-mounted support module, and an image processing module. The multispectral light source module irradiates a search region with visible and near-infrared light; the signal acquisition module acquires a near-infrared fluorescence image and a visible-light image of the imaged object; the head-mounted support module carries the light source and acquisition modules and adjusts their illumination of the search region; the image processing module fuses the acquired near-infrared fluorescence image with the visible-light image and outputs the fused image. The embodiments allow the imaging devices to be used flexibly and broaden the range of applications of optical molecular image navigation.
Description
Technical field
The present invention relates to an imaging system, and in particular to a head-mounted molecular image navigation system.
Background technology
As a new method and means of noninvasive visualization, molecular imaging reflects changes at the level of molecular regulation and their effects on the physiology and overall function of the organism. Studying the vital activity of genes, biomacromolecules, and cells in vivo at the molecular scale is therefore an important technology; basic research into in-vivo bioluminescence imaging, tomography, optical imaging, and simulation methods based on molecular techniques has become one of the focuses, and one of the difficulties, of molecular imaging research.
Molecular imaging equipment combines traditional medical imaging with modern molecular biology, allowing physiological or pathological changes to be observed at the cellular and molecular level, with the advantages of non-invasive, real-time, in-vivo imaging of high specificity, high sensitivity, and high resolution. On the one hand, molecular imaging technology can greatly accelerate drug development and shorten preclinical screening time; it can provide more accurate diagnoses, match the therapeutic scheme to the patient's genetic profile, and help pharmaceutical companies develop personalized medicine. On the other hand, it can be applied in the biomedical field to in-vivo quantitative analysis, image navigation, and molecular typing. However, systems built this way are relatively complex, and their ease of handling and comfort need further improvement.
The present invention therefore proposes a head-mounted molecular image navigation system that detects molecular images of in-vivo targets excited by a multispectral method, broadening the range of application.
Summary of the invention
The invention provides a head-mounted molecular image navigation system, comprising:
a multispectral light source module for irradiating a search region with visible and near-infrared light;
a signal acquisition module for acquiring a near-infrared fluorescence image and a visible-light image of the imaged object;
a head-mounted support module for carrying the multispectral light source module and the signal acquisition module, to adjust the illumination of the search region by the multispectral light source module;
an image processing module for fusing the acquired near-infrared fluorescence image with the visible-light image and outputting the fused image.
Embodiments of the invention have the following technical effects:
1. Molecular image navigation and molecular imaging are realized in a head-worn form, improving convenience while implementing the function.
2. Projection imaging guides the operator in judging the imaging range in advance, adding human-machine interaction.
3. Speech recognition frees the operator's hands during use, allowing more precise control of the head-mounted molecular image navigation system.
4. The threshold-decomposition feature-extraction method significantly improves the signal-to-background ratio, helping the operator guide real-time, precise operation from the image.
Brief description of the drawings
Fig. 1 is a structural diagram of the head-mounted support module according to an embodiment of the present invention;
Fig. 2 is a block diagram of the head-mounted molecular image navigation system according to an embodiment of the present invention;
Fig. 3 is a flow chart of the image processing method of the head-mounted molecular image navigation system according to an embodiment of the present invention.
Detailed description of the invention
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is described in more detail below in conjunction with specific embodiments and with reference to the accompanying drawings.
Based on fluorescence-excitation imaging in molecular imaging, embodiments of the present invention provide a head-mounted molecular image navigation system.
Fig. 1 is a structural diagram of the head-mounted support module according to an embodiment of the present invention. Fig. 2 is a block diagram of the head-mounted molecular image navigation system according to an embodiment of the present invention. As shown in Fig. 2, the system can comprise a multispectral light source module 110 that provides light in several different spectral bands to irradiate the detected object; an optical signal acquisition module 120 that acquires the fluorescence-excitation image and the visible-light image of the detected object in real time; a head-mounted support module 130 that provides wearing comfort for the operator and ensures that imaging proceeds safely and effectively; and an image processing module 140 that performs image segmentation, feature extraction, image registration, and related processing, fuses the visible-light and fluorescence images, and outputs the fused image.
The operation of the multispectral light source module 110, the optical signal acquisition module 120, the head-mounted support module 130, and the image processing module 140 is described in detail below.
The multispectral light source module 110 can comprise a cold light source 111, a near-infrared laser 112, and a light source coupler 113. The cold light source 111 emits visible light toward the detected object; a first band-pass filter can be placed in front of it to pass visible light of 400-650 nm wavelength. The near-infrared laser 112 is configured to emit near-infrared light with a center wavelength of, for example, 785 nm, and the excitation light can be led out through an optical fiber. As is known to those skilled in the art, embodiments of the present invention are not limited to this implementation; other means well known in the art can also be used to emit visible and near-infrared light. When exciting the search region, a spectral-separation method allows the cold light source 111 and the near-infrared laser 112 to exit through a single optical fiber simultaneously. Specifically, the visible and near-infrared light are coupled at the light-exit window, where the light source coupler 113 is placed. The light source coupler 113 can be a diverging lens that converts the collimated point source into a cone beam, enlarging the irradiated area so that the excitation source illuminates the search region uniformly. For example, an optical lens can be placed at the exit window of the near-infrared laser 112 and reverse-coupled to the laser output to obtain a larger cone angle. One end of the optical fiber can be mechanically fixed to the optical lens, and the other end connected to the head-mounted support module 130.
The optical signal acquisition module 120 can comprise a camera 121, a lens 122, and a coordinate projector 123. The camera 121 is configured to acquire the near-infrared fluorescence signal and the visible-light signal; during acquisition the cold light source illuminates the background. For example, the reference parameters for near-infrared signal acquisition can be set as follows: quantum efficiency above 30% at 800 nm, frame rate above 30 fps, and pixel size (that is, the smallest photosensitive element of camera 121) above 5 microns. Preferably, a second band-pass filter passing near-infrared light of 810-870 nm is placed between the camera 121 and the lens 122. When the camera 121 operates, the coordinate projector 123 can project a circular outline onto the search region (not shown) marking the maximum extent of the field of view, so that the operator knows the system's search region and, at the same time, the excitation range of the multispectral light source module 110.
As shown in Fig. 1, the head-mounted support module 130 can comprise a head-mounted frame 131 for carrying the light source module 110 and the signal acquisition module 120. Preferably, the head-mounted support module 130 can also comprise a speech recognition and control module 132, which can include a microphone, a speech recognition unit, and a control unit (not shown), so that the operator's voice controls the operation of the multispectral light source module 110, the coordinate projector 123, and other modules. Speech recognition technology well known in the art can be used to implement the module 132.
The visible-light image and the near-infrared fluorescence image of the detected object from the optical signal acquisition module 120 are input to the image processing module 140. The image processing module 140 is implemented on a back-end computer, which can also manually control acquisition and the light sources. The image processing module 140 first preprocesses the input near-infrared fluorescence image to obtain the characteristic distribution of the fluorescence image according to its fluorescence specificity. Preprocessing can include noise removal, feature extraction, and dead-pixel compensation; the visible-light image can of course also be preprocessed by methods well known in the art. Feature extraction on the input near-infrared fluorescence image can use threshold segmentation: for pixels whose ratio of image gray value G to background-noise gray value G_n is above 1.5, the gray value of the pixel is multiplied by 2; for pixels with G/G_n below 1.5, the gray value of the pixel is divided by 2. This threshold segmentation method enhances the feature points. Regions of interest whose gray value exceeds a predetermined threshold can be converted to pseudo-color by a gray-to-pseudo-color mapping algorithm well known in the art, further marking the positions of feature points and feature regions so that the operator can guide the operation from the image. The output of the image processing module 140 is the fused image, with display and projection interfaces on a general-purpose computer so that the operator can conveniently view it; the visual signal can also be fed back into the head-mounted system, where a mirror-projected screen makes the fused image visible.
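The G/G_n threshold rule above can be sketched in a few lines of Python. The function name, the use of a single scalar background-noise estimate, and the final clipping to an 8-bit range are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def enhance_features(fluor, background_noise, ratio_thresh=1.5, factor=2.0):
    """Threshold-based feature enhancement: pixels whose gray value G
    exceeds ratio_thresh * G_n (background noise) are amplified by
    `factor`; the rest are suppressed by the same factor."""
    fluor = fluor.astype(np.float64)
    ratio = fluor / max(background_noise, 1e-9)   # per-pixel G / G_n
    out = np.where(ratio > ratio_thresh, fluor * factor, fluor / factor)
    return np.clip(out, 0, 255)                   # keep an 8-bit display range
```

On a frame with background noise level 40, a pixel of value 100 (ratio 2.5) is doubled while a pixel of value 10 (ratio 0.25) is halved, which is how the signal-to-background ratio is raised.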
The optical-characteristic distribution obtained from the fluorescence image is then used to fuse the fluorescence image with the input visible-light image, producing the fused result image for output. Specifically, the fusion of the fluorescence and visible-light images comprises registering them using the fluorescence optical-characteristic distribution; this registration operation is described in detail below.
The fluorescence optical-characteristic distribution has fluorescence specificity, while the visible-light image is a high-resolution structural image; the image registration of this embodiment exploits both properties. During registration, morphological theory can be adopted to revise the energy-minimization functional of the fluorescence optical-characteristic distribution so that its shape approaches that of the imaged tissue. Registration can be carried out with formula (1).
In formula (1), d is the discrete Laplace operator and U is the position vector. n surface points are selected as principal marker points, with p_i and a_i the corresponding imaging-surface marker points and W_i = (p_i - a_i) the motion vector. The vector U_p is obtained by minimizing E(U) and gives the position after surface deformation.
To obtain a more accurate, higher-resolution fused image, the image-coincidence degree of formula (2) is adopted during registration as the criterion for evaluating the registration effect.
In formula (2), A is the normalized gray-value matrix of the visible-light image and B is the normalized gray-value matrix of the fluorescence image. The closer the result is to 1, the better the registration.
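Formula (2) itself is not reproduced in the text. A common metric consistent with the description (two normalized gray-value matrices, result near 1 meaning good registration) is normalized cross-correlation; the sketch below uses that as one plausible reading, not as the patent's actual formula:

```python
import numpy as np

def coincidence_degree(A, B):
    """Registration-quality metric between normalized gray-value matrices
    A (visible-light) and B (fluorescence): zero-mean normalized
    cross-correlation, which is 1.0 for perfectly aligned identical images."""
    A = A.astype(np.float64) - A.mean()
    B = B.astype(np.float64) - B.mean()
    denom = np.sqrt((A * A).sum() * (B * B).sum())
    return float((A * B).sum() / denom) if denom > 0 else 0.0
```

An identical pair scores 1.0 and an inverted pair scores -1.0, so the metric can drive an optimizer toward the best overlap.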
Fig. 3 shows the flow chart of the image processing method according to an embodiment of the present invention. As shown in Fig. 3, in step 301 spatial motion is detected in the preprocessed visible-light image sequence and fluorescence image sequence so that unmatched micro-displacement frames are filtered out, yielding the visible-light image sequence M1 and the fluorescence image sequence M2.
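The micro-displacement filtering of step 301 can be sketched as dropping frames that differ too much from the last retained frame. The mean-absolute-difference metric and the threshold value are illustrative assumptions; the patent does not specify the motion-detection criterion:

```python
import numpy as np

def filter_motion_frames(frames, max_mean_diff=8.0):
    """Keep the first frame, then drop any frame whose mean absolute
    difference from the previously kept frame exceeds max_mean_diff,
    i.e. remove unmatched micro-displacement frames."""
    kept = [frames[0]]
    for f in frames[1:]:
        diff = np.mean(np.abs(f.astype(np.float64) - kept[-1].astype(np.float64)))
        if diff <= max_mean_diff:
            kept.append(f)
    return kept
```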
Optionally, in step 303 an image pyramid P1 is formed from the high-resolution visible-light image sequence M1 obtained in step 301, reducing the data volume and thereby improving the real-time performance of image processing. Specifically, a Gaussian pyramid is used to downsample layer i into layer i+1: layer i is first convolved with a Gaussian kernel, and then all even rows and columns are deleted, so the newly obtained image is one quarter the size of the layer above it. For the inverse operation, the image is first doubled in each dimension, the newly added (even) rows are filled with zeros, and a convolution with the given filter (in effect the filter expanded by a factor of two in each dimension) then estimates approximations of the "missing" pixels. Repeatedly applying these operations to the input image produces the whole pyramid.
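The reduction step of the Gaussian pyramid described above can be sketched with a separable 5-tap binomial blur followed by discarding the even rows and columns. The kernel choice and edge handling are conventional assumptions, not values given in the patent:

```python
import numpy as np

_K1D = np.array([1, 4, 6, 4, 1], dtype=np.float64) / 16.0  # binomial low-pass

def _blur(img):
    """Separable 5-tap blur with edge replication (horizontal then vertical)."""
    pad = np.pad(img, 2, mode="edge")
    h = sum(_K1D[k] * pad[:, k:k + img.shape[1]] for k in range(5))
    v = sum(_K1D[k] * h[k:k + img.shape[0], :] for k in range(5))
    return v

def pyr_down(img):
    """One Gaussian-pyramid step: blur, then delete even rows and columns,
    leaving a quarter of the pixels."""
    return _blur(img.astype(np.float64))[1::2, 1::2]

def build_pyramid(img, levels=3):
    """Repeat pyr_down to produce the whole pyramid, coarsest level last."""
    pyr = [img.astype(np.float64)]
    for _ in range(levels - 1):
        pyr.append(pyr_down(pyr[-1]))
    return pyr
```

An 8x8 input yields 4x4 and 2x2 levels, which is the quarter-size reduction the text describes.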
In step 305, a gradient edge-detection method, for example using the Roberts operator, is applied to the obtained image pyramid P1 and the fluorescence image sequence M2, yielding image edges E1 and E2 respectively. When sufficient processing capability is available, step 303 can of course be skipped and edge detection applied directly to the visible-light image sequence M1 and the fluorescence image sequence M2.
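The Roberts operator named in step 305 is a pair of 2x2 diagonal-difference kernels; a minimal sketch (threshold value and function name are illustrative):

```python
import numpy as np

def roberts_edges(img, thresh=None):
    """Roberts cross edge detection: diagonal differences approximate the
    gradient; the magnitude above `thresh` marks an edge pixel."""
    f = img.astype(np.float64)
    gx = f[:-1, :-1] - f[1:, 1:]     # kernel [[1, 0], [0, -1]]
    gy = f[:-1, 1:] - f[1:, :-1]     # kernel [[0, 1], [-1, 0]]
    mag = np.hypot(gx, gy)
    if thresh is None:
        return mag                   # raw gradient magnitude
    return (mag >= thresh).astype(np.uint8)
```

On a vertical step edge the response is concentrated on the column just before the step, as expected for a 2x2 operator.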
In step 307, significance-based sparse sampling is carried out on the obtained image edges E1 and E2. The same method can be applied to both; here, compressed-sensing sparse sampling is applied to E1 and E2, yielding the sampled outputs S1 and S2 respectively.
In step 308, registration is performed on the sampled outputs S1 and S2 obtained in step 307. Besides registration with formulas (1) and (2) above, point cloud registration can be used to further optimize the registration result; for point cloud registration, see Xue Yaohong et al., Point Cloud Data Registration and Surface Subdivision Technology, National Defense Industry Press, 2011, not repeated in detail here.
Preferably, the image processing method according to the present invention can also comprise step 309, in which the convergence of the point cloud registration result is verified to ensure that the calculation process is reliable and stable.
Preferably, steps 301, 303, 305, and 309 can be performed by a compact image GPU or FPGA while the registration step 308 is performed by a CPU with stronger computing capability, further optimizing system performance while reducing the required hardware size.
The specific embodiments described above further explain the objects, technical solutions, and beneficial effects of the present invention. It should be understood that the foregoing are merely specific embodiments and are not intended to limit the invention; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.
Claims (12)
1. A head-mounted molecular image navigation system, comprising:
a multispectral light source module for irradiating a search region with visible and near-infrared light;
a signal acquisition module for acquiring a near-infrared fluorescence image and a visible-light image of the imaged object;
a head-mounted support module for carrying the multispectral light source module and the signal acquisition module, to adjust the irradiation range of the multispectral light source module over the search region;
an image processing module for fusing the acquired near-infrared fluorescence image with the visible-light image and outputting the fused image.
2. The system according to claim 1, wherein the multispectral light source module comprises:
a visible light source for emitting visible light toward the detected object;
a near-infrared laser for emitting near-infrared light toward the detected object; and
a light source coupler;
wherein the light source coupler couples the visible and near-infrared light, and the coupled light is connected to the head-mounted support module through a single optical fiber.
3. The system according to claim 2, wherein the head-mounted support module comprises:
a head-mounted frame for carrying the multispectral light source module and the signal acquisition module; and
a speech control module for controlling the operation of the multispectral light source module, to form a search region of the expected range.
4. The system according to claim 1, wherein the image processing module performs feature extraction on the acquired near-infrared fluorescence image, comprising:
for pixels whose ratio of image gray value G to background-noise gray value G_n is above 1.5, multiplying the gray value of the pixel by 2; for pixels with G/G_n below 1.5, dividing the gray value of the pixel by 2.
5. The system according to claim 4, wherein the image processing module fuses the acquired near-infrared fluorescence image with the visible-light image, comprising obtaining the optical-characteristic distribution of the near-infrared fluorescence image by the following energy-minimization functional:
in formula (1), d is the discrete Laplace operator and U is the position vector; n surface points are selected as principal marker points, with p_i and a_i the corresponding imaging-surface marker points and W_i = (p_i - a_i) the motion vector; the vector U_p is obtained by minimizing E(U) and gives the position after surface deformation.
6. The system according to claim 1, wherein the image processing module fuses the acquired near-infrared fluorescence image with the visible-light image, comprising adopting the image-coincidence degree shown in the following formula as the criterion for evaluating the registration effect:
wherein A is the normalized gray-value matrix of the visible-light image and B is the normalized gray-value matrix of the fluorescence image.
7. The system according to claim 5, wherein the image processing module further fuses the near-infrared fluorescence image with the visible-light image by point cloud registration.
8. An image processing method applied to the head-mounted molecular image navigation system of claim 1, comprising:
performing spatial motion detection on the visible-light image sequence and the near-infrared fluorescence image sequence so as to filter out unmatched micro-displacement frames (301);
downsampling the motion-detected visible-light image sequence to obtain an image pyramid (303);
applying a gradient edge-detection method to the obtained image pyramid and the near-infrared fluorescence image sequence respectively to obtain image edges (305);
performing significance-based sparse sampling on the obtained image edges respectively to obtain sampled outputs (307); and
performing registration on the obtained sampled outputs to carry out image fusion (308).
9. The method according to claim 8, wherein the registration comprises obtaining the optical-characteristic distribution of the near-infrared fluorescence image by the following energy-minimization functional:
in formula (1), d is the discrete Laplace operator and U is the position vector; n surface points are selected as principal marker points, with p_i and a_i the corresponding imaging-surface marker points and W_i = (p_i - a_i) the motion vector; the vector U_p is obtained by minimizing E(U) and gives the position after surface deformation.
10. The method according to claim 8, wherein the registration comprises adopting the image-coincidence degree shown in the following formula as the criterion for evaluating the registration effect:
wherein A is the normalized gray-value matrix of the visible-light image and B is the normalized gray-value matrix of the near-infrared fluorescence image.
11. The method according to claim 9, further comprising using point cloud registration to further register the near-infrared fluorescence image and the visible-light image.
12. The method according to claim 11, further comprising verifying the convergence of the point cloud registration result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410433156.9A CN104305957B (en) | 2014-08-28 | 2014-08-28 | Wear-type molecular image navigation system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410433156.9A CN104305957B (en) | 2014-08-28 | 2014-08-28 | Wear-type molecular image navigation system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104305957A true CN104305957A (en) | 2015-01-28 |
CN104305957B CN104305957B (en) | 2016-09-28 |
Family
ID=52361385
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410433156.9A Active CN104305957B (en) | 2014-08-28 | 2014-08-28 | Wear-type molecular image navigation system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104305957B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000042910A1 (en) * | 1999-01-26 | 2000-07-27 | Newton Laboratories, Inc. | Autofluorescence imaging system for endoscopy |
US20080251694A1 (en) * | 2007-02-05 | 2008-10-16 | Sanyo Electric Co., Ltd. | Image pickup apparatus |
CN101339653A (en) * | 2008-01-30 | 2009-01-07 | 西安电子科技大学 | Infrared and color visible-light image fusion method based on color transfer and entropy information |
CN102722556A (en) * | 2012-05-29 | 2012-10-10 | 清华大学 | Model comparison method based on similarity measurement |
CN103489005A (en) * | 2013-09-30 | 2014-01-01 | 河海大学 | High-resolution remote sensing image classifying method based on fusion of multiple classifiers |
CN103530038A (en) * | 2013-10-23 | 2014-01-22 | 叶晨光 | Program control method and device for head-mounted intelligent terminal |
CN203709999U (en) * | 2014-02-07 | 2014-07-16 | 王学庆 | Headwear venipuncture guide dual-light source system device |
CN204072055U (en) * | 2014-08-28 | 2015-01-07 | 中国科学院自动化研究所 | Wear-type molecular image navigation system |
- 2014-08-28: application CN201410433156.9A filed; granted as patent CN104305957B (status: Active)
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105342561A (en) * | 2015-10-09 | 2016-02-24 | 中国科学院自动化研究所 | Wireless voice-operated wearable molecular imaging navigation system |
CN105342561B (en) * | 2015-10-09 | 2017-12-29 | 中国科学院自动化研究所 | Wireless voice-controlled wearable molecular imaging navigation system |
US10026202B2 (en) | 2015-10-09 | 2018-07-17 | Institute Of Automation, Chinese Academy Of Sciences | Wearable molecular imaging navigation system |
CN105640481A (en) * | 2015-12-31 | 2016-06-08 | 东莞广州中医药大学中医药数理工程研究院 | Orifice observation device with voice-control light source and voice-control method of orifice observation device |
CN105640481B (en) * | 2015-12-31 | 2019-05-14 | 东莞广州中医药大学中医药数理工程研究院 | Orifice observation device with voice-controlled light source and voice-control method thereof |
CN106037674A (en) * | 2016-08-18 | 2016-10-26 | 皖江新兴产业技术发展中心 | Vein imaging system based on hyperspectral imaging |
CN106037674B (en) * | 2016-08-18 | 2018-10-30 | 皖江新兴产业技术发展中心 | Vein imaging system based on hyperspectral imaging |
CN107374730A (en) * | 2017-09-06 | 2017-11-24 | 东北大学 | Optical operation navigation system |
CN109662695A (en) * | 2019-01-16 | 2019-04-23 | 北京数字精准医疗科技有限公司 | Fluorescent molecules imaging system, device, method and storage medium |
CN109938700A (en) * | 2019-04-04 | 2019-06-28 | 济南显微智能科技有限公司 | Head-mounted infrared fluorescence detection device |
CN110226974A (en) * | 2019-07-08 | 2019-09-13 | 中国科学技术大学 | Near-infrared fluorescence imaging system based on augmented reality |
Also Published As
Publication number | Publication date |
---|---|
CN104305957B (en) | 2016-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104305957A (en) | Head-wearing molecule image navigation system | |
US11751971B2 (en) | Imaging and display system for guiding medical interventions | |
CN107209362B (en) | Fourier ptychographic tomography | |
CN102488493B (en) | Small animal living body multi-mode molecule imaging system and imaging method | |
Ford et al. | Fast optically sectioned fluorescence HiLo endomicroscopy | |
WO2016011611A1 (en) | Endoscopic optical molecular image navigation system and multi-spectral imaging method | |
WO2015061793A1 (en) | Multipurpose imaging and display system | |
Zhang et al. | UHR-DeepFMT: ultra-high spatial resolution reconstruction of fluorescence molecular tomography based on 3-D fusion dual-sampling deep neural network | |
WO2013134949A1 (en) | Device and method for endoscopic x ray luminescence tomography imaging | |
CN105431091A (en) | Device and method for acquiring fusion image | |
CN107146261B (en) | Bioluminescence tomography quantitative reconstruction method based on magnetic resonance image prior region of interest | |
CN106447703A (en) | Near infrared fluorescence and Cherenkov fluorescence fused imaging method and apparatus | |
Guggenheim et al. | Multi-modal molecular diffuse optical tomography system for small animal imaging | |
EP3942521A1 (en) | Augmented reality headset for medical imaging | |
CN204072055U (en) | Wear-type molecular image navigation system | |
CN114209278B (en) | Deep learning skin disease diagnosis system based on optical coherence tomography | |
Lian et al. | Deblurring sequential ocular images from multi-spectral imaging (MSI) via mutual information | |
CN105662354B (en) | A kind of wide viewing angle optical molecular tomographic navigation system and method | |
Liu et al. | Detection of heterogeneity in multi-spectral transmission image based on spatial pyramid matching model and deep learning | |
Xiao et al. | Spatial resolution improved fluorescence lifetime imaging via deep learning | |
CN102579066B (en) | X-ray coaxial phase-contrast imaging method | |
CN116503258B (en) | Super-resolution computing imaging method, device, electronic equipment and storage medium | |
CN111436903A (en) | Mental image technology chain | |
Costantini et al. | A multimodal imaging and analysis pipeline for creating a cellular census of the human cerebral cortex | |
CN205795645U (en) | A kind of wide viewing angle optical molecular tomographic navigation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |