CN205094589U - Wireless voice-controlled wearable molecular imaging navigation system - Google Patents

Wireless voice-controlled wearable molecular imaging navigation system Download PDF

Info

Publication number
CN205094589U
CN205094589U (application CN201520778829.4U)
Authority
CN
China
Prior art keywords
image
light
processing module
module
near infrared
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201520778829.4U
Other languages
Chinese (zh)
Inventor
田捷
何坤山
迟崇巍
杨鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Automation of Chinese Academy of Science
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science filed Critical Institute of Automation of Chinese Academy of Science
Priority to CN201520778829.4U priority Critical patent/CN205094589U/en
Application granted granted Critical
Publication of CN205094589U publication Critical patent/CN205094589U/en
Priority to US15/289,535 priority patent/US10026202B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The utility model discloses a wireless voice-controlled wearable molecular imaging navigation system, comprising: a multispectral light source transmitting and receiving module, for emitting visible and near-infrared light to the detection region and collecting the visible-light image and near-infrared fluorescence image of the imaged object; an image processing module, for fusing the collected visible-light and near-infrared images and performing three-dimensional reconstruction of multi-source information; a wireless signal processing module, for autonomously establishing or connecting to a wireless network; and a wearable imaging module, for video acquisition and display and for receiving and transmitting sound, which can intelligently sense head-pose changes to control the system. According to the embodiments of the utility model, intelligent use of the equipment in a molecular imaging system is effectively realized, and the application space of optical molecular imaging navigation is expanded.

Description

Wireless voice-controlled wearable molecular imaging navigation system
Technical field
The utility model relates to an imaging system, and in particular to a wireless voice-controlled wearable molecular imaging navigation system.
Background technology
With the completion of full sequencing of the human genome and the arrival of the post-genomic era, early and precise diagnosis of disease has become a major national strategic demand. Molecular imaging technology breaks through the limitation of traditional imaging, which can only show the anatomical structure changes caused by changes in cellular composition, overcomes the limitation of traditional ex vivo methods in continuously observing drug mechanisms and therapeutic effects in vivo, and builds a connecting bridge between molecular biology and clinical medicine. At present, optical molecular imaging, which is based on optical imaging technology, information processing technology, molecular biology, chemistry and computational mathematics, has become one of the research hotspots in the field of molecular imaging.
Molecular imaging equipment can achieve real-time, dynamic, in vivo imaging of the physiology and pathology of an organism at the cellular and molecular level; it has advantages such as being non-invasive, highly sensitive and fast to measure, and represents a new direction in the development of medical imaging technology. With molecular imaging technology, on the one hand, buried tumors can be accurately located, thorough tumor resection can be achieved, and unnecessary damage to normal tissues and organs can be reduced; on the other hand, the time needed for drug development, screening and pre-clinical research can be greatly shortened. However, existing molecular imaging systems are relatively complex and cumbersome to operate, and need further improvement in practical usability and design.
Utility model content
In view of this, the utility model proposes a wireless voice-controlled wearable molecular imaging navigation system which, through a new structural design and new technologies, improves the accuracy of localization and makes operation more user-friendly.
The utility model provides a wireless voice-controlled wearable molecular imaging navigation system, comprising:
a multispectral light source transmitting and receiving module, for emitting multispectral light signals to a detection region, collecting the reflected light signal and transmitted light signal of the detection region, and outputting them to an image processing module;
the image processing module, for performing three-dimensional reconstruction and fusion of the detection region according to the received reflected and transmitted light signals;
a wireless signal processing module, for providing wireless connections;
a wearable imaging module, for video acquisition and display, sound reception and transmission, and sensing of head-pose changes, to realize intelligent control of the system.
Wherein, the multispectral light source transmitting and receiving module comprises:
a near-infrared laser, for emitting near-infrared light to the detected object in the detection region;
a visible light source, for emitting visible light to the detected object in the detection region;
a light source coupler, for coupling the visible light and the near-infrared light and emitting the coupled light to the detection region;
a dichroic beam splitter, for splitting the coupled light reflected by the detected object into reflected light and transmitted light;
a near-infrared CCD camera, for collecting the reflected light and sending it to the image processing module;
a color CCD camera, for collecting the transmitted light and sending it to the image processing module.
Wherein, the wireless signal processing module has a built-in rechargeable battery and wireless network chip; it can independently establish a small wireless network for other equipment to connect to without relying on a wireless router, and it also supports Bluetooth connections and Wi-Fi access and can connect to multiple smart devices simultaneously.
Wherein, the wearable imaging module comprises: a micro-display, a camera, an intelligent processor, a sound processing module and a pose sensing module. The wearable imaging module adopts a half-rim glasses structure with an adjustable-tightness clamping slot; the camera is suspended at its front; the intelligent processor is arranged at the right temple and connected to the micro-display, and the micro-display projects the received video and pictures at a certain distance in front of the right eye. The wearable imaging module further has a built-in miniature rechargeable battery, motion sensor and headset; the motion sensor controls the system according to head-pose changes, and the headset is used to realize communication between wearers and voice control of the system.
The above solution proposed by the utility model has the following technical effects:
1. Through the wearable structural design, precise navigation and real-time molecular imaging are achieved, and the requirements of a user-friendly design are met.
2. Through the design of wireless voice control and head-pose sensing, the operator's hands are further freed, operation becomes more convenient, and human-computer interaction is enhanced.
3. Through multi-view switching and three-dimensional reconstruction, the operator obtains better viewing angles and more information, interaction between personnel is enhanced, and valuable first-hand data of the scene is provided for other researchers.
4. Through improvements to existing algorithms for noise reduction, registration and fusion, the ratio of the detection-region signal to the background signal is increased, making imaging better and navigation more accurate.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the wearable imaging module according to an embodiment of the utility model;
Fig. 2 is a schematic diagram of the integrated structure of the multispectral light source transmitting and receiving module, the image processing module and the wireless signal processing module according to an embodiment of the utility model;
Fig. 3 is a block diagram of the wireless voice-controlled wearable molecular imaging navigation system according to an embodiment of the utility model;
Fig. 4 is a system schematic diagram of the wireless voice-controlled wearable molecular imaging navigation system according to an embodiment of the utility model.
Detailed description of the invention
To make the purpose, technical solution and advantages of the utility model clearer, the utility model is further described below with reference to specific embodiments and the accompanying drawings.
The embodiments of the utility model provide a wireless voice-controlled wearable molecular imaging navigation system based on fluorescence-excitation imaging in molecular imaging.
Fig. 1 is a structural schematic diagram of the wearable imaging device according to an embodiment of the utility model. Fig. 2 is a schematic diagram of the integrated structure of the multispectral light source transmitting and receiving device, the image processing device and the wireless signal processing device in an embodiment of the utility model. Fig. 3 is a block diagram of the wireless voice-controlled wearable molecular imaging navigation system according to an embodiment of the utility model. As shown in Fig. 3, the wireless voice-controlled wearable molecular imaging navigation system comprises:
a multispectral light source transmitting and receiving module, for providing multispectral light signals and collecting the near-infrared fluorescence signal and visible-light signal of the detection region;
an image processing module, for image fusion, three-dimensional reconstruction of source signals and switching management of source signals;
a wireless signal processing module, for autonomously establishing or connecting to a wireless network;
a wearable imaging module, for video acquisition and display, sound reception and transmission, and intelligent sensing of head-pose changes, to realize intelligent control of the system.
As shown in Fig. 2, the wireless voice-controlled wearable molecular imaging navigation system is mounted in a mechanically integrated structure. The integrated structure comprises:
a base, a first mechanical arm movably mounted on the base, a second mechanical arm movably mounted on the other end of the first mechanical arm, and a lens barrel mounted at the front end of the second mechanical arm.
Next, the operation of the multispectral light source transmitting and receiving module, the image processing module, the wireless signal processing module and the wearable imaging module is described in detail.
The multispectral light source transmitting and receiving module mainly comprises:
a visible light source, which may be an LED cold light source, located on the base, for emitting visible light to the detected object in the detection region; optionally, a band-pass filter is placed in front of the visible light source to pass visible light of a predetermined wavelength, preferably 380-700 nm;
a near-infrared laser, located on the visible light source and also integrated in the base, for emitting near-infrared light to the detected object in the detection region; optionally, it emits a light signal whose center wavelength lies in the near infrared (for example 800 nm);
a light source coupler, arranged in the first mechanical arm, for coupling the visible light signal and the near-infrared light signal; the coupled light exits onto the emergent-light lens in the lens barrel and is then projected onto the detected object;
a dichroic beam splitter, arranged on the incident-light lens of the lens barrel, for splitting the collected returning coupled light (the part of the light reflected by the detected object onto the incident-light lens of the lens barrel) into reflected light and transmitted light, which are passed to the near-infrared CCD camera and the color CCD camera respectively;
a near-infrared CCD camera and a color CCD camera, located on the two sides of the front end of the second mechanical arm (the end close to the lens barrel); the near-infrared CCD camera receives the near-infrared light split off by the dichroic beam splitter, and the color CCD camera receives the split-off visible light.
The laser light of the near-infrared laser can be led out of the excitation source by optical fiber. As shown in Fig. 4, the light emitted by the visible light source and the laser emitted by the near-infrared laser are each conducted by optical fiber into the light source coupler; the coupled light is then carried by a single optical fiber onto the emergent-light lens in the lens barrel and projected onto the detection region.
It is known to those skilled in the art that other methods well known in the art may also be used in embodiments of the utility model to emit visible and near-infrared light.
When exciting the detection region, simultaneous emission of the visible light signal and the near-infrared light signal can be achieved with a single optical fiber. Specifically, the light source coupler can be used to couple the visible light signal and the near-infrared light signal at the light exit.
The dichroic beam splitter may be a 750 nm splitter, for splitting the collected returning coupled light into transmitted light and reflected light and passing them to the near-infrared CCD camera and the color CCD camera respectively. This module also comprises further optical fibers, connected to the near-infrared CCD camera and the color CCD camera respectively, for delivering the light signals to the image processing module. After the light splitting by the dichroic beam splitter, the collected returning coupled light is divided into transmitted light and reflected light; as shown in Fig. 4, these pass directly onto the two CCD cameras, and the signals obtained by the two cameras are then passed through optical fiber into the image processing module.
As shown in Fig. 3, the image processing module is located in the middle of the second mechanical arm. As shown in Fig. 4, the light signals collected and returned by the near-infrared CCD camera and the color CCD camera are passed to the image processing module through optical fiber. The image processing module carries a CPU and contains image processing software, and can process the collected returning light signals. The image processing module can be connected to the wireless signal processing module, and the wireless module can be connected to the wearable imaging module, so that the whole system is linked together. Because the wireless signal processing module and the image processing module are small circuit boards, both are located in the middle of the second mechanical arm.
The image processing module mainly comprises a processor, for obtaining a reflected-light image from the collected returning visible light signal and a transmitted-light image from the near-infrared light signal, and then performing three-dimensional reconstruction and image fusion on them.
The image processing module first pre-processes the transmitted-light image and the reflected-light image to obtain more accurate image distribution features. The pre-processing specifically comprises:
For the pixel matrix of the obtained reflected-light image, for each pixel, a sub-matrix window of a predetermined size centered on the current pixel is taken, the pixel values in the window are bubble-sorted, and the smaller of the two middle values is taken as the new value of the current pixel; the predetermined size is preferably 3 × 4.
For the collected returning transmitted-light image, filtering is performed with a Chebyshev band-pass filter whose passband is a first predetermined value in hertz; the first predetermined value is the near-infrared band and may be 3.33 × 10^14 to 3.8 × 10^14 Hz.
The purpose of this pre-processing is to reduce noise and enhance the final image quality. In this utility model, different pre-processing is applied to the reflected-light image and the transmitted-light image. Different pre-processing modes are adopted because the transmitted light in the 3.33 × 10^14 to 3.8 × 10^14 Hz band is fainter but is the light the utility model needs most, so it is only band-selected and not weakened; the reflected light, by contrast, is background light, and although this processing weakens it considerably, the noise reduction is good. A minimal sketch of the reflected-light window filter is given below.
Of course, other pre-processing methods well known in the art may also be applied to the reflected-light and transmitted-light images.
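As an illustration of the reflected-light pre-processing step above, the following is a minimal Python/NumPy sketch of the window filter: for each pixel it sorts the values in a window of the preferred 3 × 4 size and keeps the smaller of the two middle values. The edge-padding of image borders and the use of a library sort in place of the bubble sort named in the text are illustrative assumptions.

```python
import numpy as np


def lower_median_filter(image, win_rows=3, win_cols=4):
    """Replace each pixel by the smaller of the two middle values of the
    sorted window centered on it (window size preferably 3 x 4 in the text).

    Sketch only: borders are edge-padded, which the text does not specify,
    and an ordinary sort stands in for the bubble sort it mentions.
    """
    pad_r, pad_c = win_rows // 2, win_cols // 2
    padded = np.pad(image.astype(np.float64),
                    ((pad_r, pad_r), (pad_c, pad_c)), mode="edge")
    out = np.empty(image.shape, dtype=np.float64)
    lower_mid = (win_rows * win_cols) // 2 - 1   # index of the smaller middle value
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            window = padded[i:i + win_rows, j:j + win_cols].ravel()
            out[i, j] = np.sort(window)[lower_mid]
    return out
```

For example, `filtered = lower_median_filter(reflected_image)` would produce the pre-processed reflected-light image used in the fusion step that follows.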
Afterwards, the image processing module performs a wavelet multi-resolution transform on the pre-processed reflected-light image and transmitted-light image respectively, fuses the low-frequency information and the high-frequency information of the images separately to obtain the fused multi-resolution decomposition information, and then obtains the fused image through the synthesis (reconstruction) algorithm.
Preferably, when performing the wavelet-transform image fusion, the wavelet coefficients of the fused image are selected by the criterion shown in the following formula:

$$
\omega_{i,j} =
\begin{cases}
\omega_{i,j}^{1}, & \left|\omega_{i,j}^{1}\right| \ge \left|\omega_{i,j}^{2}\right| \\
\alpha\,\omega_{i,j}^{2}, & \text{otherwise}
\end{cases}
$$

where ω_{i,j} is the horizontal, vertical or diagonal wavelet coefficient of the fused image at each scale, ω^1_{i,j} and ω^2_{i,j} are the corresponding wavelet coefficients of the reflected-light and transmitted-light images respectively, and α is a proportionality coefficient with 1 ≤ α ≤ 1.05.
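The coefficient-selection rule above can be sketched with a standard wavelet library. The following Python example, assuming the PyWavelets (pywt) package, applies the rule to the detail sub-bands of both pre-processed images; the wavelet family ('db2'), the decomposition level and the simple averaging of the low-frequency band are assumptions, since the text only states that the low- and high-frequency information are fused separately.

```python
import numpy as np
import pywt


def fuse_wavelet(reflected_img, transmitted_img, wavelet="db2", level=3, alpha=1.02):
    """Wavelet-domain fusion of the reflected-light and transmitted-light
    images using the coefficient-selection rule of the text:
        w = w1          if |w1| >= |w2|
        w = alpha * w2  otherwise       (1 <= alpha <= 1.05)
    Sketch only: wavelet family, level and the averaged low-frequency band
    are assumed choices; both images must have the same shape.
    """
    c1 = pywt.wavedec2(reflected_img.astype(np.float64), wavelet, level=level)
    c2 = pywt.wavedec2(transmitted_img.astype(np.float64), wavelet, level=level)

    fused = [(c1[0] + c2[0]) / 2.0]                     # low-frequency band: simple average (assumption)
    for (h1, v1, d1), (h2, v2, d2) in zip(c1[1:], c2[1:]):
        bands = []
        for w1, w2 in ((h1, h2), (v1, v2), (d1, d2)):   # horizontal, vertical, diagonal details
            bands.append(np.where(np.abs(w1) >= np.abs(w2), w1, alpha * w2))
        fused.append(tuple(bands))

    return pywt.waverec2(fused, wavelet)
```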
Then, the image processing module can perform three-dimensional point cloud stitching and fusion on the signals collected from multiple wearers of the wearable imaging module, thereby obtaining the three-dimensional geometric information of the detected object, and can return the stereoscopic information and color information of the detected object's surface to the system. With three-dimensional point cloud stitching, the signals collected by multiple wearers of the wearable imaging module can be stitched before fusion. The imaging module may alternatively fuse the collected returning image information directly to generate a two-dimensional fused image, or use three-dimensional point cloud stitching to perform three-dimensional stitching and fusion of the collected returning image information. With three-dimensional fusion, both a fused three-dimensional image and the stereoscopic (i.e. three-dimensional) and color information can be obtained. During feature matching, RANSAC is performed m times (m ≥ 200) on subsets consisting of n (3 ≤ n ≤ 9) data points; a minimal sketch of such a RANSAC alignment step is given below.
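In the sketch below, random subsets of n matched 3-D points are used to estimate a rigid transform via SVD (the Kabsch method), and after m iterations the model with the most inliers is kept. The rigid model, the inlier tolerance and the default values chosen for n and m within the stated ranges are illustrative assumptions, not details fixed by the text.

```python
import numpy as np


def estimate_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst; both are (k, 3)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - src_c).T @ (dst - dst_c))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # avoid an improper rotation (reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c


def ransac_align(src_pts, dst_pts, n=4, m=300, inlier_tol=2.0, rng=None):
    """RANSAC alignment of two matched 3-D point sets, drawing subsets of
    n points (3 <= n <= 9) for m iterations (m >= 200), as stated in the text.
    Sketch only: the inlier tolerance is an assumed value.
    """
    rng = np.random.default_rng() if rng is None else rng
    best_inliers, best_model = -1, None
    for _ in range(m):
        idx = rng.choice(len(src_pts), size=n, replace=False)
        R, t = estimate_rigid(src_pts[idx], dst_pts[idx])
        err = np.linalg.norm(src_pts @ R.T + t - dst_pts, axis=1)
        inliers = int((err < inlier_tol).sum())
        if inliers > best_inliers:
            best_inliers, best_model = inliers, (R, t)
    return best_model, best_inliers
```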
The CPU is mainly responsible for the cooperative control of the above operations, so that the above functions are realized better. In addition, the CPU processes, according to instructions, the image information obtained by any one wearer of the wearable imaging module or the three-dimensional image information obtained by reconstruction, and sends it to the micro-display.
As shown in Fig. 3, the wireless signal processing module mainly comprises a Wi-Fi module, a self-built wireless signal module and a Bluetooth module. The self-built wireless signal module can independently establish a small wireless network for other equipment to connect to, without relying on a wireless router, which improves the robustness and confidentiality of the system. Other equipment can also be connected through the Bluetooth module, or the network can be accessed through the Wi-Fi module. Multiple smart devices can be connected simultaneously.
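The patent describes the self-built network only at the hardware level. Purely as an illustration of how fused frames might be pushed to several connected smart devices over such a network, the sketch below implements a simple length-prefixed TCP broadcaster in Python; the protocol, port and threading model are assumptions, not part of the utility model.

```python
import socket
import struct
import threading


class FrameBroadcaster:
    """Push encoded frames to every connected client over the module's own
    network. Illustrative only: the wire format (4-byte length prefix plus
    payload) and the port number are assumptions.
    """

    def __init__(self, host="0.0.0.0", port=9000):
        self._server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self._server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self._server.bind((host, port))
        self._server.listen()
        self._clients = []
        self._lock = threading.Lock()
        threading.Thread(target=self._accept_loop, daemon=True).start()

    def _accept_loop(self):
        while True:
            conn, _ = self._server.accept()      # several smart devices may connect at once
            with self._lock:
                self._clients.append(conn)

    def broadcast(self, frame_bytes):
        """Send one encoded frame (e.g. a JPEG buffer) to all connected clients."""
        packet = struct.pack("!I", len(frame_bytes)) + frame_bytes
        with self._lock:
            for conn in list(self._clients):
                try:
                    conn.sendall(packet)
                except OSError:                  # drop clients that have disconnected
                    self._clients.remove(conn)
```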
As shown in Fig. 3, the wearable imaging module mainly comprises a micro-display, an intelligent processor, a camera, a sound processing module and a head-pose sensing module. The module adopts a half-rim glasses structure with an adjustable-tightness clamping slot, so that the lenses can be changed easily. An auto-focusing camera is suspended at its front; the intelligent processor is arranged at the right temple and connected to the micro-display arranged in front of the lens, and the micro-display can project the received video and pictures at a certain distance in front of the right eye. The sound processing module uses bone-conduction technology and enables communication between wearers as well as control of the whole wireless voice-controlled wearable molecular imaging navigation system. The head-pose sensing module processes the signals sensed by the motion sensors to realize control of the system. The intelligent controller can control the display content and display mode of the micro-display according to different instructions. In a half-rim structure, the frame does not fully surround the lens: a fully enclosing frame is a full-rim structure, a frame enclosing only part of the lens, as here, is a half-rim structure, and there are also rimless glasses. A half-rim structure is used here, with the lower portion being entirely lens, firstly to make lens replacement convenient, and secondly to enlarge the field of view and better protect the eyes from long-term injury by reflected laser light. Motion sensors and a gyroscope are arranged at different positions on the upper frame to better sense head-pose changes. All of the above modules are connected to the intelligent processor to realize the various functions.
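The text states only that motion sensors and a gyroscope sense head-pose changes to control the system. The tiny sketch below shows one way such readings might be mapped to commands; the gesture vocabulary, the threshold and the use of angular rates are purely illustrative assumptions.

```python
def classify_head_gesture(pitch_rate, yaw_rate, threshold=60.0):
    """Map gyroscope angular rates (deg/s) to simple system commands.

    Illustrative only: the patent does not define a gesture vocabulary,
    so a nod and a head shake are used here as example gestures.
    """
    if abs(pitch_rate) > threshold:
        return "confirm"        # e.g. a nod confirms the current function
    if abs(yaw_rate) > threshold:
        return "switch_view"    # e.g. a head shake switches the displayed view
    return None                 # no gesture recognised
```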
Fig. 4 shows a system schematic diagram according to an embodiment of the utility model. As shown in Fig. 4, the reflected-light image and transmitted-light image received by the image processing module are pre-processed as described above, after which multi-scale geometric analysis can be used to capture the geometric characteristics of the images. Then, by computing a variance homogeneity measure (VHW), a locally adaptive window is determined, from which the optimal threshold-shrinkage factor of the Contourlet coefficients is estimated; the Contourlet coefficients are shrunk accordingly to achieve further noise reduction. This further noise-reduction step can be applied selectively according to the required image quality: if high quality is required, this further noise-reduction method is used; if the requirement is lower, only the pre-processing noise reduction is applied to the transmitted-light and reflected-light images. For the denoised images, an improved SIFT algorithm can be used for feature extraction, so as to obtain good image feature points and further improve the quality of image fusion (a hedged sketch using standard SIFT follows). Preferably, graphite heat-dissipation fins can be attached to the two CCD cameras, which can greatly improve image quality.
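Since the modifications of the improved SIFT algorithm are not detailed here, the sketch below uses standard OpenCV SIFT with Lowe's ratio test as a stand-in for the feature-extraction and matching step (it assumes 8-bit grayscale inputs with detectable features and an opencv-python build, 4.4 or later, that ships SIFT).

```python
import cv2
import numpy as np


def match_features(img1, img2, ratio=0.75):
    """Extract and match feature points between two denoised images.

    Stand-in sketch: plain SIFT plus Lowe's ratio test; the improved SIFT
    of the text is not specified, and the ratio value is an assumption.
    """
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)

    good = []
    for pair in knn:
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    return pts1, pts2
```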
The specific embodiments described above further explain the purpose, technical solution and beneficial effects of the utility model. It should be understood that the above are only specific embodiments of the utility model and are not intended to limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the utility model shall be included within its scope of protection.

Claims (4)

1. A wireless voice-controlled wearable molecular imaging navigation system, comprising:
a multispectral light source transmitting and receiving module, for emitting multispectral light signals to a detection region, collecting the reflected light signal and transmitted light signal of the detection region, and outputting them to an image processing module;
the image processing module, for performing three-dimensional reconstruction and fusion of the detection region according to the received reflected and transmitted light signals;
a wireless signal processing module, for providing wireless connections; and
a wearable imaging module, for video acquisition and display, sound reception and transmission, and sensing of head-pose changes, to realize intelligent control of the system.
2. The system according to claim 1, wherein the multispectral light source transmitting and receiving module comprises:
a near-infrared laser, for emitting near-infrared light to a detected object in the detection region;
a visible light source, for emitting visible light to the detected object in the detection region;
a light source coupler, for coupling the visible light and the near-infrared light and emitting the coupled light to the detection region;
a dichroic beam splitter, for splitting the coupled light reflected by the detected object into reflected light and transmitted light;
a near-infrared CCD camera, for collecting the reflected light and sending it to the image processing module; and
a color CCD camera, for collecting the transmitted light and sending it to the image processing module.
3. The system according to claim 1, wherein the wireless signal processing module has a built-in rechargeable battery and wireless network chip; it can independently establish a small wireless network for other equipment to connect to without relying on a wireless router, also supports Bluetooth connections and Wi-Fi access, and can connect to multiple smart devices simultaneously.
4. The system according to claim 1, wherein the wearable imaging module comprises a micro-display, a camera, an intelligent processor, a sound processing module and a pose sensing module; the wearable imaging module adopts a half-rim glasses structure with an adjustable-tightness clamping slot; the camera is suspended at its front; the intelligent processor is arranged at the right temple and connected to the micro-display, and the micro-display projects the received video and pictures at a certain distance in front of the right eye; the wearable imaging module further has a built-in miniature rechargeable battery, motion sensor and headset, the motion sensor controlling the system according to head-pose changes, and the headset being used to realize communication between wearers and voice control of the system.
CN201520778829.4U 2015-10-09 2015-10-09 Wireless voice-controlled wearable molecular imaging navigation system Active CN205094589U (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201520778829.4U CN205094589U (en) 2015-10-09 2015-10-09 Wireless voice-controlled wearable molecular imaging navigation system
US15/289,535 US10026202B2 (en) 2015-10-09 2016-10-10 Wearable molecular imaging navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201520778829.4U CN205094589U (en) 2015-10-09 2015-10-09 Wireless voice-controlled wearable molecular imaging navigation system

Publications (1)

Publication Number Publication Date
CN205094589U true CN205094589U (en) 2016-03-23

Family

ID=55510407

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201520778829.4U Active CN205094589U (en) 2015-10-09 2015-10-09 Wireless voice-controlled wearable molecular imaging navigation system

Country Status (1)

Country Link
CN (1) CN205094589U (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105342561A (en) * 2015-10-09 2016-02-24 中国科学院自动化研究所 Wireless voice-operated wearable molecular imaging navigation system
CN105342561B (en) * 2015-10-09 2017-12-29 中国科学院自动化研究所 The wearable molecular image navigation system of Wireless sound control
US10026202B2 (en) 2015-10-09 2018-07-17 Institute Of Automation, Chinese Academy Of Sciences Wearable molecular imaging navigation system

Similar Documents

Publication Publication Date Title
CN105342561B (en) The wearable molecular image navigation system of Wireless sound control
CN106236006B (en) 3D optical molecular image laparoscope imaging systems
CN107510430A (en) Endoscopic optical imaging method and system a kind of while that obtain otherwise visible light color image and blood-stream image
US10026202B2 (en) Wearable molecular imaging navigation system
CN104367380B (en) The visual field switchable double light path molecular image navigation system and formation method
CN107005653A (en) Virtual focusing feeds back
CN104483753A (en) Auto-registration transmission type head-wearing display equipment
CN104116496A (en) Medical three-dimensional venous vessel augmented reality device and method
CN103654699B (en) A kind of formation method of fluorescence excitation binocular endoscope system
CN104434001A (en) Monocular endoscope system based on omnibearing three-dimensional stereovision
CN204318916U (en) The visual field switchable double light path molecular image navigation system
CN205094589U (en) Wireless voice-controlled wearable molecular imaging navigation system
CN206921118U (en) Double-wavelength images acquisition system
CN215937645U (en) Novel mixed reality technique spinal surgery segment location device
JP2017191546A (en) Medical use head-mounted display, program of medical use head-mounted display, and control method of medical use head-mounted display
WO2019100449A1 (en) Imaging fiber based surgical navigation system
CN107976795A (en) Binary channels tissue sample scanner and binary channels tissue sample digital imagery recurrence system
CN110680264A (en) 3D optical endoscope system based on dual-optical-path design
CN211325679U (en) Near-infrared fluorescence imaging system based on augmented reality
CN103767667A (en) Hard multichannel three-dimensional gallbladder endoscope system
CN103767657A (en) Hard multichannel three-dimensional hysteroscope system
CN202891883U (en) Stereoplasm multichannel three dimensional (3D) cystoscope system
CN201814553U (en) Portable fundus camera
CN110226974A (en) A kind of near-infrared fluorescence imaging system based on augmented reality
CN218792184U (en) Monocular 3D stereoscopic endoscope system

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant