CN104814712A - Three-dimensional endoscope and three-dimensional imaging method - Google Patents


Info

Publication number
CN104814712A
CN104814712A (application CN201410626297.2A)
Authority
CN
China
Prior art keywords
dimensional
image
target object
imaging
imaging sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410626297.2A
Other languages
Chinese (zh)
Inventor
耿征
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Three-Dimensional Shi Jia Development In Science And Technology Co Ltd
Original Assignee
Nanjing Three-Dimensional Shi Jia Development In Science And Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Three-Dimensional Shi Jia Development In Science And Technology Co Ltd filed Critical Nanjing Three-Dimensional Shi Jia Development In Science And Technology Co Ltd
Publication of CN104814712A
Legal status: Pending

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/05Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Signal Processing (AREA)
  • Endoscopes (AREA)

Abstract

The invention relates to a three-dimensional endoscope and a three-dimensional imaging method. The three-dimensional endoscope comprises an imaging unit and a control unit. The imaging unit has a housing, an imaging sensor array positioned in the housing, and an illumination device. The imaging sensor array comprises a plurality of imaging sensors for acquiring two-dimensional images of a target object under illumination provided by the illumination device. The control unit synthesizes three-dimensional images of the target object from the two-dimensional images acquired by all the imaging sensors. The three-dimensional endoscope and three-dimensional imaging method of the invention eliminate the tunnel vision and rotated-viewing-angle problems widely present in current laparoscopes, yielding a full-field-of-view (FOV), unoccluded surgical scene with an appropriate viewing angle.

Description

Three-dimensional endoscope and three-dimensional imaging method
This application claims priority to U.S. application No. 61/901,279, filed November 7, 2013, entitled "Intra-Abdominal Lightfield 3D Camera and Method of Making the Same".
Technical field
The present invention relates to three-dimensional imaging technology, and in particular to a three-dimensional endoscope and a three-dimensional imaging method.
Background technology
Compared with traditional open surgery, minimally invasive surgery (MIS) inserts medical devices into the human body through natural orifices or small skin incisions to diagnose, treat, and repair pathological changes inside the body. Over the last few decades, MIS has progressively replaced much of general open surgery: it reduces surgical complications, accelerates postoperative recovery, improves patient satisfaction, and reduces postoperative pain.
To push past the technical bottlenecks of MIS and further reduce morbidity, laparo-endoscopic single-site surgery (LESS) was developed: minimally invasive surgery that reduces the number of abdominal wounds rather than merely reducing wound size. It is currently used for gallbladder removal, appendectomy, adrenalectomy, right hemicolectomy, adjustable gastric band placement, partial nephrectomy, and radical prostatectomy. Compared with traditional laparoscopic surgery, LESS creates only a single abdominal wound, which is cosmetically favorable; at the same time it causes less postoperative pain, speeds recovery, reduces adhesions, and greatly shortens rehabilitation.
Natural orifice transluminal endoscopic surgery (NOTES) is another recent development in the MIS field. An endoscope is inserted through a natural orifice of the body (mouth, urethra, anus, etc.) and reaches the diseased site through an internal incision (in the stomach, vagina, bladder, or rectum), completely eliminating abdominal incisions and external scarring. NOTES has been used for diagnostic laparoscopy, appendectomy, cholecystectomy, and sleeve gastrectomy in humans.
Robotic systems, such as the da Vinci system, have also been applied to single-site laparoscopic surgery, in what is called robotic LESS (R-LESS), to enhance surgical clarity, provide motion scaling, and reduce tremor.
Although these three mainstream MIS techniques have advanced rapidly in the past few years, the lack of higher-performance imaging devices has hindered their development and prevented more patients from benefiting. These techniques require all instruments to enter the peritoneal cavity through a single wound, which raises a series of challenges: risk of instrument collision, difficulty obtaining sufficient grasping force, and weakened instrument triangulation.
The visualization capability of existing LNR devices in particular has proven problematic and deficient: the surgeon no longer looks directly at the patient's anatomy, but watches a two-dimensional video monitor, and cannot see the operative site directly through the surgical wound. The major shortcomings of these existing imaging devices include:
(1) Tunnel vision: the field of view of the abdominal image provided in LNR (LNR is short for LESS, NOTES, and R-LESS) is likely to be blocked and covered by medical instruments entering through the same wound. The image may also overlap with other instruments, making it difficult for the surgeon to infer internal three-dimensional depth from the existing two-dimensional image.
(2) Persistent occupation of the wound: in practice, the laparoscope usually occupies a critical position in the wound at all times, hindering other instruments from operating through the same wound simultaneously.
(3) Instrument collision: because the laparoscope persistently occupies the wound, surgical instruments collide inside and outside the wound.
(4) Limited viewing angle: the endoscope in LNR passes through only a single channel, producing unfamiliar viewing angles, especially in NOTES, and the scope tube cannot be turned to arbitrary angles at the surgeon's will.
(5) Difficulty maintaining correct and stable spatial orientation: the in-body image is sometimes positioned at the periphery, making it difficult for the surgeon to establish a stable spatial reference frame during delicate operations and to obtain a sense of three dimensions. This considerably increases the surgeon's workload and reduces the effectiveness and accuracy of LNR.
(6) Lack of three-dimensional display capability and depth perception: most importantly, the endoscopic systems currently used in LNR can only provide two-dimensional images and lack three-dimensional depth information.
Summary of the invention
A brief summary of the invention is provided below to give a basic understanding of some aspects of the invention. It should be appreciated that this summary is not an exhaustive overview of the invention. It is not intended to identify key or critical elements of the invention, nor to limit its scope. Its sole purpose is to present some concepts in simplified form as a prelude to the more detailed description discussed later.
A main object of the present invention is to provide a three-dimensional endoscope and a three-dimensional imaging method that eliminate the tunnel vision and rotated-viewing-angle problems ubiquitous in existing laparoscopes, obtaining a full-field-of-view (FOV), unoccluded surgical scene with an appropriate viewing angle.
According to a first aspect of the invention, a three-dimensional endoscope comprises a housing, an imaging unit, and a control unit;
the imaging unit is located in the housing;
the imaging unit comprises an imaging sensor array and an illumination device;
the imaging sensor array comprises a plurality of imaging sensors for acquiring two-dimensional images of a target object under illumination provided by the illumination device;
the control unit synthesizes a three-dimensional image of the target object from the two-dimensional images of the target object acquired by each of the imaging sensors.
According to a second aspect of the invention, a three-dimensional imaging method based on the three-dimensional endoscope described above comprises:
a plurality of imaging sensors in the imaging sensor array acquiring two-dimensional images of a target object under illumination provided by the illumination device;
the control unit synthesizing a three-dimensional image of the target object from the two-dimensional images of the target object acquired by each imaging sensor.
With the three-dimensional endoscope and three-dimensional imaging method of the present invention, a full-field-of-view (FOV), unoccluded surgical scene with an appropriate viewing angle can be obtained.
Brief description of the drawings
Embodiments of the invention are described below with reference to the accompanying drawings, from which the above and other objects, features, and advantages of the present invention can be more readily understood. The components in the drawings merely illustrate the principles of the invention. In the drawings, the same or similar technical features or components are denoted by the same or similar reference numerals.
Fig. 1 is a structural diagram of the first embodiment of the three-dimensional endoscope of the present invention;
Fig. 2 is a schematic diagram of the imaging principle of the three-dimensional endoscope of the present invention;
Fig. 3 is a structural diagram of the second embodiment of the three-dimensional endoscope of the present invention;
Fig. 4 is a schematic diagram of the calculation of the three-dimensional distance of surface point P of the target object for the three-dimensional endoscope of Fig. 3;
Fig. 5 is a structural diagram of one embodiment of the structured light projection unit in Fig. 3;
Fig. 6 is a structural diagram of the third embodiment of the three-dimensional endoscope of the present invention;
Fig. 7 is a schematic diagram of the calculation of the three-dimensional distance of surface point P of the target object for the third embodiment of the three-dimensional endoscope of the present invention;
Fig. 8 is a structural diagram of the fifth embodiment of the three-dimensional endoscope of the present invention;
Fig. 9 is a structural diagram of the sixth embodiment of the three-dimensional endoscope of the present invention;
Fig. 10 is a flow chart of one embodiment of the three-dimensional imaging method of the present invention.
Detailed description of the invention
Embodiments of the invention are described with reference to the accompanying drawings. Elements and features described in one drawing or embodiment of the invention may be combined with elements and features shown in one or more other drawings or embodiments. It should be noted that, for clarity, the drawings and description omit components and processes that are unrelated to the invention or well known to those of ordinary skill in the art.
First embodiment
Referring to Fig. 1, a structural diagram of the first embodiment of the three-dimensional endoscope of the present invention is shown.
In this embodiment, the three-dimensional endoscope comprises an imaging unit 100 and a control unit 105.
The imaging unit 100 comprises a housing 103, and an imaging sensor array and an illumination device 101 located in the housing 103. The imaging sensor array comprises a plurality of imaging sensors 102 for acquiring two-dimensional images of a target object 108 under illumination provided by the illumination device.
The control unit 105 synthesizes a three-dimensional image of the target object from the two-dimensional images of the target object 108 acquired by each imaging sensor 102.
In use, the imaging unit 100 can be placed inside the patient's body (for example in the peritoneal cavity), while the control unit 105 can be placed outside the patient's body.
Referring to Fig. 2, a schematic diagram of the imaging principle of the three-dimensional endoscope of the present invention is shown.
The complete three-dimensional information of the target object 108 (that is, all visible light information) can be described by a light field. In computational light field theory, a light field can usually be expressed by a series of two-dimensional images from different viewing angles. The images captured by the imaging sensors 102 contain a rich set of rays; these rays constitute the partial light field produced by the target object 108. In Fig. 2, the light field is represented as a stack of multiple two-dimensional images obtained by the light field three-dimensional endoscope. The light field provides full-resolution two- and three-dimensional images and facilitates three-dimensional surface reconstruction, three-dimensional measurement, and free-viewpoint three-dimensional display. By processing the captured rays, the three-dimensional surface can be reconstructed and rendered, and three-dimensional images generated.
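As an illustrative sketch (not part of the claimed invention), the stack-of-views representation of the partial light field described above can be written as follows. The sensor count and image dimensions are assumptions chosen for illustration; the 672x492 resolution merely echoes the example CMOS chip mentioned below.

```python
import numpy as np

# Hypothetical sensor array: N views, each an H x W grayscale image.
N_SENSORS, H, W = 8, 492, 672

def stack_light_field(views):
    """Stack per-sensor 2D images into a (view, row, col) light field array."""
    return np.stack(views, axis=0)

# Simulated captures from each imaging sensor (random placeholder data).
views = [np.random.rand(H, W) for _ in range(N_SENSORS)]
lf = stack_light_field(views)
assert lf.shape == (N_SENSORS, H, W)

# Any single two-dimensional view is one slice of the stack:
center_view = lf[N_SENSORS // 2]
```

Downstream processing (surface reconstruction, free-viewpoint rendering) would then operate on slices or ray bundles of this array.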
As one embodiment, the imaging sensors 102 can comprise charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensors. Both analog and digital CCD/CMOS sensor modules can be used. For example, a CMOS chip from OmniVision can be selected, with an image resolution of 672x492 pixels, an image area of 4.032 mm x 2.952 mm, and a pixel size of 6x6 μm. The imaging sensors 102 can use high-quality miniature optical lenses to obtain a suitable field of view (FOV) (for example a 120-degree field of view).
In this embodiment, the geometric positions of the imaging sensors 102 can be arbitrary, but should be known or obtainable by calibration. For example, the imaging sensors 102 in the sensor array can be arranged linearly.
The imaging sensors 102 in the sensor array can be identical, or can have different optical, mechanical, and/or electronic characteristics. For example, the sensors can have different focal lengths, fields of view, spectral ranges, pixel resolutions, or any other performance characteristics. Image or non-image signals can be obtained from these sensors.
As one embodiment, the illumination device 101 can be an LED; other means capable of providing suitable illumination (such as optical fiber) can also be used. For example, a mini LED produced by Nichia can be adopted. The brightness of the LED is controllable.
Traditional two-dimensional laparoscopes and/or endoscopes provide only two-dimensional images, without three-dimensional depth cues. Conventional stereo endoscopes, such as those used in the da Vinci robot, can only provide two images of the target scene from slightly different viewing angles. The shortcomings of traditional stereo endoscopes include:
(1) the stereo image can only be seen by wearing special glasses, or at a specially designed viewing console completely separated from the surgeon and the operating room environment;
(2) occlusion affects accurate three-dimensional reconstruction and measurement of the scene;
(3) the observer cannot freely move the sensors to change the viewing angle on the target, which is difficult to accomplish during LNR operation;
(4) because a sufficient number of views cannot be obtained, traditional stereo endoscopes are unsuitable for large-screen, look-around, glasses-free (autostereoscopic), and interactive three-dimensional display.
With multiple high-resolution imaging sensors, the three-dimensional endoscope of the present invention overcomes the above shortcomings of traditional stereo endoscopes.
In one embodiment, the three-dimensional endoscope can also comprise a flexible cable 104.
The flexible cable 104 is connected between the control unit 105 and the imaging unit 100 to supply power to the imaging unit 100 and to transfer the multiple two-dimensional images acquired by the imaging sensor array to the control unit 105.
Because the flexible cable 104 is used, the rigid shaft of conventional laparoscopes and endoscopes is eliminated, saving valuable space for other surgical instruments entering through the wound and avoiding instrument collision. In addition, the imaging unit 100 can be placed anywhere in the peritoneal cavity without being constrained by any shaft. Usually, the light field three-dimensional endoscope 100 can be placed near the operative site to obtain an image with a wider field of view; even away from the wound, tunnel vision and oblique viewing angles can still be avoided.
Second embodiment
Referring to Fig. 3, the three-dimensional endoscope of the second embodiment of the invention is shown.
In this embodiment, the illumination device 101 comprises a structured light projection unit 110.
The structured light projection unit 110 generates a structured texture on the surface of the target object. Each imaging sensor 102 in the imaging sensor array acquires a two-dimensional image of the structured texture and transfers it to the control unit 105. The control unit 105 performs three-dimensional reconstruction of the target object based on the multiple two-dimensional images of the structured texture.
In Fig. 3, the structured light projection unit 110 produces a spatially varying structured texture 111 on the surface of the target 108. Structured light is a well-known three-dimensional surface imaging technique. In the present invention, structured illumination is applied within the three-dimensional endoscope.
With the structured texture 111 projected by the structured light projection unit 110, the surface features of the target object are easy to distinguish. Based on multi-view three-dimensional reconstruction, reliable three-dimensional surface reconstruction can be performed. This calculation does not require calibrating the geometric position/orientation of the structured light projection unit 110. The projected surface texture enhances the surface features of the target, thereby improving the quality and reliability of the three-dimensional reconstruction result.
Three-dimensional surface reconstruction can also use structured light projection from a calibrated projector. In this case, the geometric information (position/orientation) of the structured light projection can be derived from the calibration information. Without loss of generality, Fig. 4 shows such a system with only one image sensor; the principle can be extended to multiple imaging sensors and/or multiple structured light projection systems. The geometric relationship between an imaging sensor, the structured light projection unit, and a point on the object surface can be expressed by the principle of triangulation:
R = B·sin(β) / sin(α + β)
The key to triangulation-based three-dimensional imaging is a technique for identifying a single projected spot in the image captured under the two-dimensional projection pattern. A structured light stripe illumination mode provides a simple mechanism for establishing this correspondence. With known baseline B and the two angles α and β, the three-dimensional distance R of a surface point P of the target object can be computed exactly. Here, the baseline B is the distance between the optical centers of the structured light projection unit 110 and the imaging sensor 102; α is the angle between the baseline and the line from surface point P to the optical center of the imaging sensor 102; β is the angle between the baseline and the line from surface point P to the optical center of the structured light projection unit 110.
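The triangulation relation above can be checked numerically. The following minimal sketch (function name and units are illustrative assumptions, not part of the patent) computes R from the baseline and the two angles:

```python
import math

def triangulated_range(baseline, alpha_deg, beta_deg):
    """R = B * sin(beta) / sin(alpha + beta): distance from the sensor's
    optical center to surface point P, given baseline B and angles in degrees."""
    alpha = math.radians(alpha_deg)
    beta = math.radians(beta_deg)
    return baseline * math.sin(beta) / math.sin(alpha + beta)

# Symmetric case: alpha = beta = 60 degrees forms an equilateral triangle,
# so R equals the baseline itself (approximately 10.0 here).
r = triangulated_range(10.0, 60.0, 60.0)
```

The same relation applies whether the second vertex of the triangle is a calibrated projector (this embodiment) or a second imaging sensor (the fourth embodiment below).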
The structured light projection unit 110 can be designed in various forms. Fig. 5 shows one typical embodiment.
Referring to Fig. 5, in this embodiment the structured light projection unit comprises a light source 201, a pattern screen 202, and an objective lens 203, with the light source 201 and the objective lens 203 on opposite sides of the pattern screen 202. The light source 201 provides illumination for the pattern screen 202. The pattern screen 202 carries a predetermined pattern. The objective lens 203 projects the light emitted by the light source 201 and passing through the pattern screen 202 onto the surface of the target object, generating the structured texture on the surface of the target object.
The light source 201 can be an incoherent light source such as an LED or a fiber-optic illuminator. The pattern on the pattern screen 202 is designed based on structured light principles (single-shot). The objective lens is a multi-element optical system capable of generating high-quality image projection.
The light source 201 can also be coherent, such as a laser. The pattern screen 202 can then be a diffractive optical element (DOE) designed with a certain diffraction pattern, which serves as the structured illumination pattern. A structured light projection unit can be formed from a miniature DOE that receives light from the light source, a GRIN collimating lens, and a single-mode fiber. The projected pattern provides unique markers on the target surface; the three-dimensional surface profile can then be obtained by applying triangulation (as shown in Fig. 4).
3rd embodiment
Referring to Fig. 6, in this embodiment the imaging sensor array comprises a plurality of imaging sensors (302-305) with different spectral and polarization characteristics.
Among the multiple imaging sensors of the light field three-dimensional endoscope, some sensors can be configured to acquire images in different wavebands and different polarization directions. For example, adding a narrow-band filter makes one or more imaging sensors capture light only within a certain spectral range, thereby enhancing imaging contrast (signal-to-noise ratio). Polarization imaging can suppress the effect of surface reflections on image quality.
Spectral imaging and polarization imaging are completely independent imaging modes. The two can be used simultaneously or separately according to the specific application requirements.
The imaging sensor array 301 shown in Fig. 6 has eight optical channels, each with its own unique spectral and polarization characteristics. They are used to obtain multispectral composite images of the target object's surface and subsurface, from which the three-dimensional surface profile of the target object is reconstructed.
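As an illustration of how such channels might be combined, the sketch below merges several per-channel images into one composite by weighted averaging. The channel count merely echoes the eight-channel array of Fig. 6; the weighting scheme and image size are assumptions for illustration, not the patent's reconstruction method.

```python
import numpy as np

def multispectral_composite(channels, weights=None):
    """Weighted average of per-channel 2D images (one per optical channel)."""
    stack = np.stack(channels, axis=0).astype(float)
    if weights is None:
        weights = np.ones(stack.shape[0])
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize so output stays in input range
    # Contract the channel axis: (C,) x (C, H, W) -> (H, W)
    return np.tensordot(weights, stack, axes=1)

# Eight hypothetical channel images; channel i is a constant image of value i.
channels = [np.full((4, 4), float(i)) for i in range(8)]
composite = multispectral_composite(channels)
# Equal weights over constant values 0..7 give a constant image of 3.5.
```

In practice the weights could favor channels with higher signal-to-noise ratio, e.g. the narrow-band or cross-polarized channels least affected by surface reflections.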
4th embodiment
Referring to Fig. 7, in this embodiment the imaging sensor array comprises two imaging sensors.
The two imaging sensors can be located at the two ends of the housing and acquire two-dimensional images of the target object from a left viewing angle and a right viewing angle, respectively.
In this embodiment, a pair of imaging sensors is used to imitate human-like binocular vision in acquiring images of the target object, thereby obtaining three-dimensional information of the target object's surface. A matching algorithm achieves exact matching of the same surface point P in the two images. The geometric relationship between the two imaging sensors and the surface point P can be expressed by the principle of triangulation:
R = B·sin(β) / sin(α + β)
Here the baseline B is the line connecting the optical centers of the two imaging sensors; R is the line connecting the optical center of one of the image sensors and the surface point P of the target object; α is the angle between R and B; and β is the angle between B and the line connecting the other sensor's optical center and P. The coordinates (x, y, z) of point P can be computed exactly from R, β, and α.
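Restricting to the plane containing both optical centers and P, the coordinates of P follow directly from R and α. The sketch below is an illustrative reconstruction under the assumptions that the left optical center sits at the origin and the baseline lies along the x-axis (recovering the full (x, y, z) additionally requires the out-of-plane image coordinate, omitted here):

```python
import math

def stereo_point(baseline, alpha_deg, beta_deg):
    """Locate surface point P in the baseline plane from the two triangulation
    angles. Left optical center at (0, 0), right at (baseline, 0)."""
    alpha = math.radians(alpha_deg)
    beta = math.radians(beta_deg)
    r = baseline * math.sin(beta) / math.sin(alpha + beta)  # R from the formula
    # P lies at distance R from the left center, at angle alpha to the baseline.
    return (r * math.cos(alpha), r * math.sin(alpha))

# Symmetric case (alpha = beta = 60 degrees): P sits halfway along the baseline,
# so x is half the baseline and y is the triangle's height.
x, y = stereo_point(10.0, 60.0, 60.0)
```

By symmetry, the same point is obtained when computed from the right sensor's side, which is a useful consistency check on the matching step.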
5th embodiment
Referring to Fig. 8, a structural diagram of the fifth embodiment of the three-dimensional endoscope of the present invention is shown.
In this embodiment, the three-dimensional endoscope comprises a first wireless communication link module 307 and a second wireless communication link module (not shown).
The first wireless communication link module 307 is placed in the imaging unit 300, and the second wireless communication link module is placed in the control unit 305.
The first wireless communication link module 307 transfers the multiple two-dimensional images acquired by the imaging sensor array to the second wireless communication link module.
The imaging unit 300 also comprises a battery pack 304 for powering the imaging unit 300. The battery pack 304 can be any type of miniature battery, such as a lithium battery, as long as its capacity is sufficient to sustain normal operation of the three-dimensional endoscope. The first and second wireless communication link modules can carry out high-speed multi-channel image data transmission.
This embodiment requires no communication or power lines, so the connecting cord can be removed, further facilitating the use of various instruments during clinical procedures.
6th embodiment
Referring to Fig. 9, a structural diagram of the sixth embodiment of the three-dimensional endoscope of the present invention is shown.
In this embodiment, the three-dimensional endoscope also comprises a magnetic guide device 401 and a magnetic guidance controller 400.
The magnetic guide device 401 is mounted on the imaging unit 100 and, under the control of the magnetic guidance controller, drives the imaging unit to translate and/or rotate.
In clinical operation, the magnetic guide device 401 and the imaging unit 100 can be placed in the patient's peritoneal cavity, the magnetic guidance controller 400 can be placed outside the abdominal cavity, and the magnetic guide device 401 and imaging unit 100 are fixed against the inner peritoneal wall by magnetic force.
As one embodiment, the magnetic guidance controller 400 comprises a pair of magnets 402 for producing magnetic attraction on the magnetic guide device 401. The magnets 402 can produce sufficient magnetic force to drag the imaging unit 100 and the magnetic guide device 401 to a specified position and orientation.
Preferably, the magnetic guidance controller 400 can also comprise an axial rotation device 403, mounted on the magnets 402. The operator can manually (or electronically) control the axial rotation of the magnets 402. Rotating the magnets 402 changes the direction of the magnetic field, which drives the rotation of the magnetic guide device 401 and in turn produces rotational motion of the imaging unit 100.
Preferably, the magnetic guidance controller 400 can also comprise a handle 404, for safer and more convenient operation of the magnetic guidance controller 400.
In addition, the three-dimensional endoscope in any of the above first to sixth embodiments can also comprise a display device.
The display device is connected to the control unit and displays the three-dimensional image of the target object generated by the control unit.
Three-dimensional imaging method
Referring to Fig. 10, a flow chart of one embodiment of the three-dimensional imaging method based on the three-dimensional endoscope described above is shown.
In this embodiment, the three-dimensional imaging method comprises:
S10: a plurality of imaging sensors in the imaging sensor array acquire two-dimensional images of the target object under illumination provided by the illumination device;
S20: the control unit synthesizes a three-dimensional image of the target object from the two-dimensional images of the target object acquired by each imaging sensor.
Optionally, the three-dimensional imaging method can also comprise:
S30: a flexible cable connected between the control unit and the imaging unit supplies power to the imaging unit and transfers the multiple two-dimensional images acquired by the imaging sensor array to the control unit.
In one embodiment, S10 can specifically comprise:
S11: the structured light projection unit in the illumination device generates a structured texture on the surface of the target object;
S12: each imaging sensor in the imaging sensor array acquires a two-dimensional image of the structured texture and transfers it to the control unit.
S20 can specifically comprise:
S21: the control unit performs three-dimensional reconstruction of the target object based on the multiple two-dimensional images of the structured texture.
In one embodiment, S11 can specifically comprise:
S111: the light source provides illumination for the pattern screen;
S112: the pattern screen carries a predetermined pattern;
S113: the objective lens projects the light emitted by the light source and passing through the pattern screen onto the surface of the target object, generating the structured texture on the surface of the target object.
In another embodiment, S10 can also specifically comprise:
S12: the multiple imaging sensors in the imaging sensor array acquire two-dimensional images of the target object with different spectral and polarization characteristics.
In another embodiment, S10 can also specifically comprise:
S13: the imaging sensor array comprises two imaging sensors; the two imaging sensors acquire two-dimensional images of the target object from a left viewing angle and a right viewing angle, respectively.
Optionally, the three-dimensional imaging method may further comprise:
S40: a first wireless communication link module located in the imaging unit transfers the multiple two-dimensional images acquired by the imaging sensor array to a second wireless communication link module located in the control unit.
Optionally, the three-dimensional imaging method may further comprise:
S50: a magnetic guide apparatus mounted on the housing of the imaging unit, under the control of a magnetic guidance controller, drives the housing, together with the imaging sensor array and the illuminator inside it, to translate and/or rotate.
In one embodiment, the three-dimensional imaging method may further comprise:
S60: a display device connected with the control unit displays the three-dimensional image of the target object generated by the control unit.
The three-dimensional endoscope and three-dimensional imaging method of the present invention offer the following advantages:
(1) They solve the tunnel-vision and rotated-viewing-angle problems common to existing laparoscopes, thereby obtaining an unobstructed, full-field surgical scene with an appropriate viewing angle;
(2) They avoid continuous occupation of the surgical wound;
(3) The three-dimensional endoscope of the present invention can be placed near the surgical site and correctly oriented in space; with its three-dimensional imaging and processing capability, it can provide the surgeon with real-time images having correct orientation and an appropriate viewing angle;
(4) They provide three-dimensional depth cues: the three-dimensional endoscope and three-dimensional imaging method can provide real-time three-dimensional depth maps with high-resolution texture information, and can therefore give the surgeon enhanced 3D visual feedback for operation, localization, and surgery;
(5) Dimensional measurement of surgical targets: by virtue of the unique three-dimensional imaging capability, quantitative three-dimensional measurement of objects in the surgical scene can be provided;
(6) Image-guided intervention (IGI): the generated three-dimensional image can be easily and accurately registered between preoperative CT/MRI data and in-body three-dimensional surface data, enabling image-guided intervention;
(7) The generated three-dimensional image can be viewed by the surgeon without wearing any special glasses.
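Advantage (5), quantitative three-dimensional measurement, reduces to simple geometry once reconstruction is done: a target's size is the Euclidean distance between reconstructed surface points. The coordinates below are hypothetical values chosen for illustration.

```python
import numpy as np

# Two hypothetical reconstructed boundary points of a surgical target, in mm.
p1 = np.array([10.0, 5.0, 200.0])
p2 = np.array([13.0, 9.0, 200.0])

# Quantitative size measurement in the reconstructed scene.
size_mm = float(np.linalg.norm(p2 - p1))
print(size_mm)  # 5.0
```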
Some embodiments of the present invention have been described in detail above. As those of ordinary skill in the art will understand, all or any steps or components of the method and apparatus of the present invention may be implemented in hardware, firmware, software, or a combination thereof, in any computing device (including a processor, a storage medium, etc.) or in a network of computing devices; this can be accomplished by those of ordinary skill in the art using their basic programming skills once they understand the content of the present invention, and therefore need not be described in detail here.
In addition, it is evident that when possible peripheral operations are involved in the above description, any display device and any input device connected to any computing device, together with the corresponding interfaces and control programs, will undoubtedly be used. In general, the related hardware and software in a computer, computer system, or computer network, together with the hardware, firmware, software, or combinations thereof that implement the various operations of the foregoing methods of the present invention, constitute the apparatus of the present invention and its component parts.
Therefore, based on the above understanding, the object of the present invention can also be achieved by running a program or a set of programs on any information processing device, which may be a well-known general-purpose device. Accordingly, the object of the present invention can also be achieved merely by providing a program product containing program code that implements the described method or apparatus. That is, such a program product also constitutes the present invention, and a medium storing or transmitting such a program product also constitutes the present invention. Obviously, the storage or transmission medium may be of any type known to those skilled in the art or developed in the future, so there is no need to enumerate the various storage or transmission media here.
In the apparatus and method of the present invention, the components or steps may obviously be decomposed, combined, and/or recombined after decomposition. Such decompositions and/or recombinations should be regarded as equivalents of the present invention. It should also be pointed out that the steps of the above series of processes may naturally be performed in the chronological order described, but need not necessarily be performed in that order; some steps may be performed in parallel or independently of one another. Meanwhile, in the above description of specific embodiments of the present invention, features described and/or illustrated for one embodiment may be used in one or more other embodiments in the same or a similar manner, combined with features of other embodiments, or substituted for features of other embodiments.
It should be emphasized that the term "comprises/comprising", when used herein, refers to the presence of a feature, element, step, or component, but does not exclude the presence or addition of one or more other features, elements, steps, or components.
Although the present invention and its advantages have been described in detail, it should be understood that various changes, substitutions, and alterations may be made without departing from the spirit and scope of the present invention as defined by the appended claims. Moreover, the scope of the present application is not limited to the specific embodiments of the processes, devices, means, methods, and steps described in the specification. From the disclosure of the present invention, one of ordinary skill in the art will readily appreciate that processes, devices, means, methods, or steps, presently existing or later to be developed, that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present invention. Accordingly, the appended claims are intended to include within their scope such processes, devices, means, methods, or steps.

Claims (18)

1. A three-dimensional endoscope, comprising an imaging unit and a control unit; characterized in that:
the imaging unit comprises a housing, and an imaging sensor array and an illuminator located inside the housing;
the imaging sensor array comprises multiple imaging sensors for acquiring two-dimensional images of a target object under the illumination provided by the illuminator;
the control unit is configured to synthesize a three-dimensional image of the target object based on the two-dimensional images of the target object acquired by each imaging sensor.
2. The three-dimensional endoscope according to claim 1, characterized in that it further comprises a flexible cable;
the flexible cable is connected between the control unit and the imaging unit, and is configured to supply electric power to the imaging unit and to transfer the multiple two-dimensional images acquired by the imaging sensor array to the control unit.
3. The three-dimensional endoscope according to claim 1, characterized in that:
the illuminator comprises a structured light projection unit;
the structured light projection unit is configured to generate a structured texture on the surface of the target object;
each imaging sensor in the imaging sensor array is configured to acquire a two-dimensional image of the structured texture and transfer it to the control unit;
the control unit is configured to perform three-dimensional reconstruction of the target object based on the multiple two-dimensional images of the structured texture.
4. The three-dimensional endoscope according to claim 3, characterized in that:
the structured light projection unit comprises a light source, a pattern mask, and an objective lens, the light source and the objective lens being located on opposite sides of the pattern mask;
the light source is configured to provide illumination for the pattern mask;
the pattern mask bears a predetermined pattern;
the objective lens is configured to project the light emitted by the light source and passing through the pattern mask onto the target object, so that the structured texture is generated on the surface of the target object.
5. The three-dimensional endoscope according to claim 1, characterized in that:
the imaging sensor array comprises multiple imaging sensors with different spectral characteristics and polarization characteristics.
6. The three-dimensional endoscope according to claim 1, characterized in that:
the imaging sensor array comprises two imaging sensors;
the two imaging sensors are located at the two ends of the housing, and are configured to acquire two-dimensional images of the target object from a left-side viewing angle and a right-side viewing angle, respectively.
7. The three-dimensional endoscope according to claim 1, characterized in that it comprises a first wireless communication link module and a second wireless communication link module;
the first wireless communication link module is located in the imaging unit, and the second wireless communication link module is located in the control unit;
the first wireless communication link module is configured to transfer the multiple two-dimensional images acquired by the imaging sensor array to the second wireless communication link module;
the imaging unit further comprises a battery pack for powering the imaging unit.
8. The three-dimensional endoscope according to claim 1, characterized in that it further comprises a magnetic guide apparatus and a magnetic guidance controller;
the magnetic guide apparatus is mounted on the imaging unit and, under the control of the magnetic guidance controller, drives the imaging unit to translate and/or rotate arbitrarily.
9. The three-dimensional endoscope according to any one of claims 1-8, characterized in that it further comprises a display device;
the display device is connected with the control unit and is configured to display the three-dimensional image of the target object generated by the control unit.
10. A three-dimensional imaging method, characterized by comprising:
acquiring, by multiple imaging sensors in an imaging sensor array, two-dimensional images of a target object under the illumination provided by an illuminator;
synthesizing, by a control unit, a three-dimensional image of the target object based on the two-dimensional images of the target object acquired by each imaging sensor.
11. The three-dimensional imaging method according to claim 10, characterized by further comprising:
supplying, by a flexible cable connected between the control unit and the imaging unit, electric power to the imaging unit, and transferring the multiple two-dimensional images acquired by the imaging sensor array to the control unit.
12. The three-dimensional imaging method according to claim 10, characterized in that
said "acquiring, by multiple imaging sensors in an imaging sensor array, two-dimensional images of a target object under the illumination provided by an illuminator" specifically comprises:
generating, by a structured light projection unit in the illuminator, a structured texture on the surface of the target object;
acquiring, by each imaging sensor in the imaging sensor array, a two-dimensional image of the structured texture and transferring it to the control unit;
and said "synthesizing, by a control unit, a three-dimensional image of the target object based on the two-dimensional images of the target object acquired by each imaging sensor" specifically comprises:
performing, by the control unit, three-dimensional reconstruction of the target object based on the multiple two-dimensional images of the structured texture.
13. The three-dimensional imaging method according to claim 12, characterized in that said "generating, by a structured light projection unit in the illuminator, a structured texture on the surface of the target object" specifically comprises:
providing, by a light source, illumination for a pattern mask;
the pattern mask bearing a predetermined pattern;
projecting, by an objective lens, the light emitted by the light source and passing through the pattern mask onto the surface of the target object, so that the structured texture is generated on the surface of the target object.
14. The three-dimensional imaging method according to claim 10, characterized in that said "acquiring, by multiple imaging sensors in an imaging sensor array, two-dimensional images of a target object under the illumination provided by an illuminator" specifically comprises:
acquiring, by the multiple imaging sensors in the imaging sensor array, two-dimensional images of the target object with different spectral characteristics and polarization characteristics.
15. The three-dimensional imaging method according to claim 10, characterized in that said "acquiring, by multiple imaging sensors in an imaging sensor array, two-dimensional images of a target object under the illumination provided by an illuminator" specifically comprises:
the imaging sensor array comprising two imaging sensors;
acquiring, by the two imaging sensors in the imaging sensor array, two-dimensional images of the target object from a left-side viewing angle and a right-side viewing angle, respectively.
16. The three-dimensional imaging method according to claim 10, characterized by further comprising:
transferring, by a first wireless communication link module located in the imaging unit, the multiple two-dimensional images acquired by the imaging sensor array to a second wireless communication link module located in the control unit.
17. The three-dimensional imaging method according to claim 10, characterized by further comprising:
driving, by a magnetic guide apparatus mounted on the imaging unit under the control of a magnetic guidance controller, the imaging unit to translate and/or rotate.
18. The three-dimensional imaging method according to any one of claims 10-17, characterized by further comprising:
displaying, by a display device connected with the control unit, the three-dimensional image of the target object generated by the control unit.
CN201410626297.2A 2013-11-07 2014-11-07 Three-dimensional endoscope and three-dimensional imaging method Pending CN104814712A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361901279P 2013-11-07 2013-11-07
US61/901,279 2013-11-07

Publications (1)

Publication Number Publication Date
CN104814712A true CN104814712A (en) 2015-08-05

Family

ID=53725390

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410626297.2A Pending CN104814712A (en) 2013-11-07 2014-11-07 Three-dimensional endoscope and three-dimensional imaging method

Country Status (1)

Country Link
CN (1) CN104814712A (en)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6063023A (en) * 1997-03-12 2000-05-16 Olympus Optical Co., Ltd. Measuring endoscope system
CN1627324A (en) * 2003-12-08 2005-06-15 西门子公司 Method of fusing image display
CN201189161Y (en) * 2008-05-06 2009-02-04 中国人民解放军第三军医大学第二附属医院 Electric endoscope navigation system
US20100104156A1 (en) * 2007-07-26 2010-04-29 Olympus Medical Systems Corp. Medical image processing apparatus and medical image processing method
JP2011055935A (en) * 2009-09-08 2011-03-24 Hoya Corp Endoscope apparatus
CN102283626A (en) * 2010-05-21 2011-12-21 哈尔滨工业大学 Medical endoscope containing structured light three-dimensional imaging system
CN202648631U (en) * 2012-06-28 2013-01-02 耿征 Structured-light generating device and minitype three-dimensional imaging device
WO2013008097A1 (en) * 2011-07-08 2013-01-17 Duret Francois Three-dimensional measuring device used in the dental field
CN203219432U (en) * 2013-03-12 2013-09-25 耿征 True three-dimensional (3D) image acquisition device and display system
JP2013192773A (en) * 2012-03-21 2013-09-30 Olympus Corp Video system for surgery and video display method


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108700932A (en) * 2015-09-24 2018-10-23 托比股份公司 It can carry out the wearable device of eye tracks
CN108700932B (en) * 2015-09-24 2021-05-28 托比股份公司 Wearable device capable of eye tracking
CN107317954A (en) * 2016-04-26 2017-11-03 深圳英伦科技股份有限公司 Capsule microscopy is peeped in 3D and surveys method and system
CN109219834A (en) * 2016-06-02 2019-01-15 威里利生命科学有限责任公司 The system and method for 3D scene rebuilding for being illuminated with antithesis complementary patterns
CN108197560A (en) * 2017-12-28 2018-06-22 努比亚技术有限公司 Facial image recognition method, mobile terminal and computer readable storage medium
CN110891471A (en) * 2018-03-21 2020-03-17 卡普索影像公司 Endoscope providing physiological characteristic dimension measurement using structured light
CN113693543A (en) * 2019-06-17 2021-11-26 深圳硅基智控科技有限公司 Capsule endoscope system with positioning function
CN113693543B (en) * 2019-06-17 2023-06-02 深圳硅基智控科技有限公司 Capsule endoscope system with positioning function
CN111222620A (en) * 2020-01-07 2020-06-02 深圳毅能达金融信息股份有限公司 Intelligent card made of wood and production process thereof
CN113854968A (en) * 2021-10-28 2021-12-31 浙江智柔科技有限公司 Cervical opening size monitoring device, method and application thereof
CN117204796A (en) * 2023-11-09 2023-12-12 哈尔滨海鸿基业科技发展有限公司 Multispectral imaging method and device of abdominal cavity endoscope
CN117204796B (en) * 2023-11-09 2024-02-13 哈尔滨海鸿基业科技发展有限公司 Multispectral imaging method and device of abdominal cavity endoscope

Similar Documents

Publication Publication Date Title
CN104814712A (en) Three-dimensional endoscope and three-dimensional imaging method
US11190752B2 (en) Optical imaging system and methods thereof
US20160128553A1 (en) Intra- Abdominal Lightfield 3D Endoscope and Method of Making the Same
JP6985262B2 (en) Devices and methods for tracking the position of an endoscope in a patient's body
US9220399B2 (en) Imaging system for three-dimensional observation of an operative site
US20090259102A1 (en) Endoscopic vision system
US20140336461A1 (en) Surgical structured light system
US11944265B2 (en) Medical imaging systems and methods
CN104434001B (en) Monocular endoscope system based on omnibearing three-dimensional stereovision
WO2013163391A1 (en) Surgical structured light system
JP2017518148A (en) Quantitative 3D imaging of surgical scenes from a multiport perspective
US20220079424A1 (en) Wireless swivel camera laparoscopic instrument with a virtual mapping and guidance system
US12008721B2 (en) Mixed reality systems and methods for indicating an extent of a field of view of an imaging device
KR101772187B1 (en) Method and device for stereoscopic depiction of image data
US20210096351A1 (en) Endoscope processor, display setting method, computer-readable recording medium, and endoscope system
KR101595962B1 (en) Colnoscopy surgery simulation system
CN106231986A (en) Image processing apparatus
CN102078176A (en) Three-dimensional rigid electronic colposcope system and application method thereof
CN115919239A (en) Imaging method for 3D endoscopic imaging system and 3D endoscopic imaging system
CN101889853A (en) Three-dimensional endoscope system capable of rotating freely for angles
CN113366414A (en) System and method for facilitating optimization of an imaging device viewpoint during an operating session of a computer-assisted operating system
Karargyris et al. 3D representation of the digestive tract surface in Wireless Capsule Endoscopy videos
EP3871193B1 (en) Mixed reality systems and methods for indicating an extent of a field of view of an imaging device
CN102090879B (en) Three-dimensional hard electronic cholecystoscope system
CN102090880B (en) Three-dimensional hard electronic arthroscope system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150805