EP1994501A1 - Verfahren und vorrichtung zum erzeugen einer strukturfreien fiberskopischen aufnahme - Google Patents
Verfahren und Vorrichtung zum Erzeugen einer strukturfreien fiberskopischen Aufnahme (Method and device for generating a structure-free fiberscopic image)
- Publication number
- EP1994501A1 (application EP07723228A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- sensor
- intensity
- imaging parameters
- imaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000000034 method Methods 0.000 title claims description 73
- 239000000835 fiber Substances 0.000 claims abstract description 86
- 238000003384 imaging method Methods 0.000 claims abstract description 72
- 239000013307 optical fiber Substances 0.000 claims abstract description 59
- 230000003287 optical effect Effects 0.000 claims abstract description 17
- 238000009826 distribution Methods 0.000 claims description 49
- 238000012545 processing Methods 0.000 claims description 11
- 238000004590 computer program Methods 0.000 claims description 5
- 238000003860 storage Methods 0.000 claims description 2
- 238000010521 absorption reaction Methods 0.000 claims 1
- 230000001419 dependent effect Effects 0.000 claims 1
- 230000002123 temporal effect Effects 0.000 claims 1
- 230000033001 locomotion Effects 0.000 description 44
- 230000006870 function Effects 0.000 description 13
- 238000001514 detection method Methods 0.000 description 10
- 239000013598 vector Substances 0.000 description 9
- 230000008901 benefit Effects 0.000 description 6
- 238000005286 illumination Methods 0.000 description 5
- 238000013459 approach Methods 0.000 description 4
- 238000004422 calculation algorithm Methods 0.000 description 4
- 230000004807 localization Effects 0.000 description 4
- 238000005457 optimization Methods 0.000 description 4
- 230000006978 adaptation Effects 0.000 description 3
- 238000001444 catalytic combustion detection Methods 0.000 description 3
- 230000002452 interceptive effect Effects 0.000 description 3
- 238000013507 mapping Methods 0.000 description 3
- 230000000873 masking effect Effects 0.000 description 3
- 230000000704 physical effect Effects 0.000 description 3
- 238000012805 post-processing Methods 0.000 description 3
- 238000003491 array Methods 0.000 description 2
- 238000004364 calculation method Methods 0.000 description 2
- 238000005253 cladding Methods 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000001839 endoscopy Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 238000011156 evaluation Methods 0.000 description 2
- 239000003365 glass fiber Substances 0.000 description 2
- 238000009499 grossing Methods 0.000 description 2
- 238000011835 investigation Methods 0.000 description 2
- 230000001788 irregular Effects 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 230000008447 perception Effects 0.000 description 2
- 230000008569 process Effects 0.000 description 2
- 238000005070 sampling Methods 0.000 description 2
- 230000009466 transformation Effects 0.000 description 2
- 206010016654 Fibrosis Diseases 0.000 description 1
- 206010034960 Photophobia Diseases 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 230000006399 behavior Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000015572 biosynthetic process Effects 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 239000004020 conductor Substances 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000003745 diagnosis Methods 0.000 description 1
- 230000009977 dual effect Effects 0.000 description 1
- 230000004761 fibrosis Effects 0.000 description 1
- 239000013305 flexible fiber Substances 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 238000003780 insertion Methods 0.000 description 1
- 230000037431 insertion Effects 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 208000013469 light sensitivity Diseases 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000007781 pre-processing Methods 0.000 description 1
- 238000002360 preparation method Methods 0.000 description 1
- 238000000275 quality assurance Methods 0.000 description 1
- 230000005855 radiation Effects 0.000 description 1
- 238000001454 recorded image Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 238000000926 separation method Methods 0.000 description 1
- 230000003595 spectral effect Effects 0.000 description 1
- 238000003786 synthesis reaction Methods 0.000 description 1
- 230000001131 transforming effect Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
- 238000011179 visual inspection Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4053—Scaling of whole images or parts thereof, e.g. expanding or contracting based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/24—Coupling light guides
- G02B6/42—Coupling light guides with opto-electronic elements
- G02B6/4298—Coupling light guides with opto-electronic elements coupling with non-coherent light sources and/or radiation detectors, e.g. lamps, incandescent bulbs, scintillation chambers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/26—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes using light guides
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/04—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings formed by bundles of fibres
- G02B6/06—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings formed by bundles of fibres the relative position of the fibres being the same at both ends, e.g. for transporting images
Definitions
- The present invention relates to a device and a method for producing interference-free images with fiber-optic systems, and in particular to an image-forming system in which an image is projected onto a sensor through a bundle of several ordered light guides and can be suitably processed to obtain an image free of disturbing structures.
- Optical systems in which an image is transferred onto an imaging sensor are widely used. Many applications would be unthinkable today without endoscopic imaging, among them diagnostics, inspection, quality assurance and research.
- On the one hand, lens-based optical systems are used, i.e. systems with a rigid structure in which the image is transmitted onto the sensor through a lens arrangement similar to a camera lens.
- On the other hand, fiber-optic systems are used, which consist of a large number of light-conducting fibers assembled into an ordered bundle; the light is guided through the many fibers onto a sensor.
- The image guide of high-quality fiberscopes consists of a regularly ordered bundle of about 5,000 to 8,000 individual fibers. Compared with the resolution of a conventional moving-picture camera (for example VGA: 640 × 480 ≈ 300,000 pixels), this value is well below the limit for useful applications.
- The image signal carried by the individual fibers is observed with such a conventional motion-picture camera.
- The individual optical fibers usually have a cladding, so that the cladding gives rise to disturbing structures in the observed image; these structures can be smoothed, for example, by low-pass filtering or adaptively reduced by spectral masking.
- In order to remove the structures introduced by the honeycomb-shaped cladding, which severely disturb the evaluation of an image, solutions already exist that interpolate a honeycomb-free image on the basis of the brightness information of the fibers. Likewise, smoothing of the honeycomb-shaped cladding structures is achieved, for example, by masking them in Fourier space. Masking has the disadvantage that, although it improves the visual impression of the recorded image, it does not increase the accuracy with which image locations can be determined.
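The masking in Fourier space mentioned above can be illustrated with a short sketch. The function name and the simple circular low-pass mask are illustrative assumptions, not the exact masking scheme of the prior art:

```python
import numpy as np

def mask_honeycomb(image, cutoff=0.15):
    """Suppress a periodic honeycomb structure by masking in Fourier
    space (illustrative sketch, not the prior art's exact method).

    cutoff: radius of the retained low-frequency disc, as a fraction
    of the smaller image dimension.
    """
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    yy, xx = np.mgrid[:h, :w]
    # Distance of every frequency bin from the spectrum centre.
    r = np.hypot(yy - h / 2, xx - w / 2)
    mask = r <= cutoff * min(h, w)
    # Zero out the high-frequency peaks caused by the fibre cladding.
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask))
    return np.real(filtered)
```

As the passage notes, such masking only smooths the visual impression; it adds no new image information.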
- A problem to be solved in general is addressed in German patent DE 4318140 A1, which describes how the positions of the points of light projected by the individual glass fibers onto a higher-resolution sensor can be determined.
- The patent shows how, on the basis of the determined fiber coordinates, the position of the fibers on the input side of the fiber bundle can be associated with the position of the points of light produced by the fibers on the sensor. Although this method prevents a distorted representation caused by non-parallel fibers in the fiber bundle, no increase in resolution can be achieved in this way.
- Prior-art methods for processing an image recorded by means of a fiber-optic system have the disadvantage that, although they improve the presentation quality or the subjectively perceived quality of the images, they do not initially bring about an actual increase in resolution, since improving the resolution requires the introduction of additional (image) information.
- Such prior knowledge can be introduced per single image in order to actually increase the resolution (for example by using edge-preserving filters). If it is known, for example, that a straight-line jump in intensity occurs within the recording, the course of the edge can be determined, by suitably applying a filter, with a precision higher than the resolution of a single pixel.
- However, the object or the shape of the object to be observed is usually not known a priori, so that such methods are not generally applicable.
- Alternatively, the information difference or redundancy of several successive recordings made from varying viewing positions or directions can be combined to reconstruct an image that has a higher resolution than any single image.
- For conventional video sequences, i.e. successions of individual images consisting of a rectangular grid of pixels, such methods are applied under the collective term "superresolution".
- There are also first approaches to extending superresolution methods to pixels arranged in arbitrary grid structures, i.e. at non-rectangular coordinates.
- The image information used there is the intensity that results from summing the pixels located within a defined radius around a central pixel.
- The method described there and other prior-art methods have the great disadvantage that the fiber centers on the sensor can be localized only with pixel accuracy, i.e. with the accuracy determined by the size of a single photosensitive pixel. This degrades the suitability of images generated in this way for subsequent motion compensation, which is based precisely on generating additional image information through motion detection, where the motion can be detected with an accuracy below the pixel size.
- A localization of the fiber centers underlying the motion compensation with only pixel accuracy therefore inevitably leads to a resolution that remains well below the theoretically possible value.
- Moreover, the summation of light intensities in a circular area around a central pixel is only of limited suitability for describing the total intensity transmitted by an optical fiber. For example, a deviation from the circular geometry of the light spot, as can occur when a single fiber or the fiber bundle is tilted relative to the sensor, results in the total intensity not being determined correctly.
- The object of the present invention is therefore to provide an apparatus and a method that make it possible to generate interference-free images for fiberscopic systems with higher spatial resolution and more precisely determined intensity values.
- The present invention is based on the finding that an image freed from interfering structures can be generated by means of a fiber bundle of a plurality of light guides and a sensor if imaging parameters are provided for the system of fiber bundle and sensor that describe the properties of the intensity profile generated by each individual light guide on the sensor.
- During image reconstruction, an amplitude or brightness value is generated for each individual light guide by fitting a function of the amplitude value and the imaging parameters of the relevant light guide to an intensity image of the sensor, so that an optimal amplitude or brightness value, taking into account the geometric imaging properties, can be generated for each individual fiber.
- the imaging parameters describe the geometric shape of the intensity distribution as produced by each individual light guide on the sensor.
- the imaging parameters can be determined, for example, by a calibration to be performed once, in which the fiber bundle is preferably illuminated homogeneously and in which a parameterization is adapted to the individual intensity profiles of the individual optical fibers. This has the great advantage that by adjusting a parameterization, the coordinates of the fiber center or the center of the image of a fiber on the sensor can be determined with an accuracy that depends essentially on the statistics (ie the illumination intensity and the exposure time) and which can far exceed the accuracy or the spatial resolution of the individual pixels of a sensor.
- Furthermore, the spatial extent of the intensity distribution as a function of the distance to its center can be described by suitable parameters. This can be done, for example, by the standard deviations of a Gaussian function fitted to the intensity distributions, which may be symmetrical or asymmetrical, i.e. have different extents in the X and Y directions on the sensor.
- Suitable parameters can also capture, for example, a tilt of the light guide or optical fiber relative to the sensor, which results in an elliptical shape of the light distribution on the sensor.
- In this way, the intensity of the radiation transmitted by a single optical fiber can be detected accurately, which is not possible with prior-art methods that merely sum the intensities of a number of pixels within a fixed radius around a central pixel.
- In one embodiment, a calibration is first carried out in which the characteristic parameters of a two-dimensional Gaussian distribution are determined, i.e. the center (X and Y coordinates) and the widths of the distribution in the two dimensions (σx and σy).
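As an illustration of this calibration step, the following sketch fits a two-dimensional Gaussian to a small sensor patch around one fibre spot and recovers the centre with subpixel accuracy. The function names and the use of `scipy.optimize.curve_fit` are assumptions chosen for illustration, not the patent's prescribed implementation:

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sx, sy):
    """Two-dimensional Gaussian; amp is the fibre's free amplitude
    parameter, (x0, y0) the spot centre, (sx, sy) its widths."""
    x, y = coords
    return amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                          + (y - y0) ** 2 / (2 * sy ** 2)))

def fit_fiber_spot(patch):
    """Fit a 2D Gaussian to a small sensor patch around one fibre spot
    and return (amp, x0, y0, sx, sy) with subpixel centre coordinates."""
    h, w = patch.shape
    y, x = np.mgrid[:h, :w]
    p0 = (patch.max(), w / 2, h / 2, w / 4, h / 4)  # rough initial guess
    popt, _ = curve_fit(gauss2d, (x.ravel(), y.ravel()), patch.ravel(), p0=p0)
    return popt
```

The fitted `amp` then serves as the fibre's brightness value, and `(x0, y0)` as its imaging location, with an accuracy limited by photon statistics rather than by the pixel size.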
- This information is used when processing images taken by the fiberscopic system in order to produce an optimized amplitude value for each individual fiber.
- For each individual optical fiber, whose intensity distribution's center and shape are already known from the calibration, the parameterization or mapping function used during calibration is fitted again in order to adapt it optimally to the actually measured intensity distribution.
- The amplitude parameter, which can be varied freely during this fit, is thus a measure of the total light intensity transmitted by the fiber and is used in the further course of image processing as the brightness information of the fiber concerned.
- the calibration ensures that the position of the fiber centers is known with an accuracy that far exceeds the size of the individual pixels.
- the individual images thus offer excellent spatial resolution as well as amplitude and brightness values that are determined with the greatest possible precision.
- The image is projected onto an imaging sensor by a fiber-optic system consisting of many individual optical fibers such that the intensity profile caused by a single fiber on the sensor surface has a spatial extent greater than a single photosensitive element or pixel of the sensor.
- a low-resolution image can be calculated from an intensity image of the sensor which stores the image information transmitted by each individual optical waveguide in the form of an intensity value.
- In the case of color images, it is of course also possible to store three or more intensity values per light guide in order to retain the color information.
- A single low-resolution image thus contains a number of pixels that are associated with individual fibers and need not be arranged in a rectangular grid. Since only a single pixel is generated per light guide, the honeycomb structure of the light-guide arrangement, which is visible in the higher-resolution sensor recordings, is successfully suppressed.
- This makes motion detection in successive recordings possible, so that according to the invention a freely selectable number of successive recordings can be used to detect movement at individual locations of the image, or a translational movement of the entire image, and to derive the associated motion vectors.
- the motion estimation can be performed with an accuracy that exceeds the original image resolution.
- The motion estimation can be carried out either directly on the basis of the irregular grid of the individual points of light of the image guide.
- Alternatively, a regular rectangular grid of pixels, for example but not necessarily at the resolution of the sensor used, may be generated by interpolation from the irregular grid.
- One possibility for interpolation is to obtain a value for each pixel of the regular grid by barycentric weighting of the three intensity values closest to the sampled pixel. Barycentric weighting consists of weighting and superimposing the intensity values of the three nearest points, i.e. the points forming a triangle within which the sampled pixel is located, according to their distance to the point of interest, in order to determine the intensity value for the sampled pixel.
- A further example of how a continuous image can be interpolated on the basis of the point-like brightness or amplitude values is the use of Delaunay triangulation or of the dual Voronoi cells, as explained in more detail below.
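A minimal sketch of such an interpolation is shown below using SciPy's `LinearNDInterpolator`, which triangulates the scattered fibre centres (Delaunay) and weights the values of the enclosing triangle barycentrically, matching the scheme described above; the helper name and its parameters are illustrative assumptions:

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

def resample_fiber_image(centers, amplitudes, out_shape):
    """Interpolate irregular per-fibre amplitude values onto a regular
    pixel grid via Delaunay triangulation and barycentric weighting.

    centers:    (N, 2) array of (x, y) fibre-centre coordinates
    amplitudes: (N,) per-fibre brightness values
    out_shape:  (height, width) of the regular output grid
    """
    interp = LinearNDInterpolator(centers, amplitudes, fill_value=0.0)
    h, w = out_shape
    yy, xx = np.mgrid[:h, :w]
    return interp(np.column_stack([xx.ravel(), yy.ravel()])).reshape(h, w)
```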
- the detected movements of the individual images are transformed back and superimposed with respect to a freely selectable reference image or reference time.
- An increase in resolution results now from the fact that the motion vectors can be determined with an accuracy that is higher than the intrinsic resolution or the distance between two neighboring pixels.
- In the superimposed image there are therefore several pixels for differently moved objects, which may be shifted against each other by less than the intrinsic resolution of a single image, so that combining the images yields an increased resolution in the reconstructed image.
- In this way, either a still image can be generated, or an image sequence or film in which a few consecutive individual images are used to produce each image of the sequence.
- FIG. 2 shows the imaging of optical fibers onto square photosensitive pixels
- FIG. 3 shows a flow chart for a method of generating high-resolution images
- FIG. 10 shows a flow chart for a method according to the invention for generating imaging parameters of a light guide.
- FIG. 1 shows an example of a system having a device for generating a high-resolution image, or evaluation device, 100. Also shown are a fiber bundle 102 and a sensor 104, which is connected to the evaluation device according to the invention so that the latter can, on the one hand, control the exposure times of the sensor 104 and, on the other hand, read out the image data of the sensor 104.
- An image or object is imaged on the sensor 104 by the fiber bundle 102 consisting of a plurality of individual light guides 106.
- the imaging situation of the individual optical fibers on the sensor is shown in detail with reference to FIG. 2, in which approximately circular intensity distributions (for example intensity distribution 110) are shown overlapping the square pixel matrix 112 of, for example, a CCD sensor.
- A single optical fiber exposes a plurality of pixels of the pixel array 112; individual pixels are completely illuminated by the fiber, while pixels at the edges of the circular intensity distribution 110 are only partially illuminated. Even with completely homogeneous illumination of the fiber and idealized light transport through the fiber, the pixels at the edge of the fiber image are thus only partially illuminated and detect an intensity lower than that of the pixel in the center.
- the intensity may additionally decrease at the edge of the circular cross-section of a fiber optic fiber.
- the array of fiber bundle 102 and sensor 104 has a set of imaging parameters that specify, among other things, the imaging location of the optical fibers on the sensor.
- The imaging location is to be understood as the center of the area irradiated by a single light guide on the sensor.
- points 114a, 114b and 114c are shown in FIG.
- the points 114a to 114c which indicate the center of the light distributions of individual light guides or fibers on the sensor 104, are thereby determined during a calibration, as will be explained in more detail below.
- The sensor 104 is initially controlled by the device for generating a high-resolution image 100 in such a way that it records a series of successive intensity images.
- A low-resolution image is first derived by the device for generating a high-resolution image 100 from each intensity image.
- a pixel with associated intensity values is first determined for each individual fiber, so that the resulting low-resolution image initially has a number of pixels which corresponds to the number of optical fibers 106 of the fiber bundle 102.
- a smaller number of pixels can be generated.
- An image generated according to the invention therefore consists of individual pixels corresponding to the fiber centers, which exhibit no such honeycomb structure and are not necessarily arranged in a uniform rectangular grid.
- the spacing of adjacent pixels is greater than the spacing of adjacent photosensitive pixels of the sensor 104.
- The image whose pixels correspond to individual fiber centers can now either be used directly as the low-resolution image or, since the honeycomb structure has already been removed from it, a low-resolution image with a uniform arrangement of pixels can be obtained from it by interpolation.
- For this purpose, the pixels 114a to 114c can be sampled, for example, with a grid 112 as shown in FIG. 2, the intensity value for each pixel of the grid 112 being obtained, for example, by interpolation of the three nearest imaging locations or their associated intensity values. In the example of FIG. 2, the three imaging locations 114a to 114c form a triangle, so that for the square pixels within this triangle the intensity values can be obtained from a weighted superposition of the three intensity values of the imaging locations 114a to 114c.
- This weighting may be barycentric, i.e. the intensity values at the imaging locations are weighted and superimposed according to the distance of the respective imaging location to the pixel to be interpolated.
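The barycentric weighting just described can be written out directly; the function names are chosen for illustration only:

```python
def barycentric_weights(p, a, b, c):
    """Weights (wa, wb, wc) of point p with respect to triangle (a, b, c);
    they sum to 1 and reproduce p as wa*a + wb*b + wc*c."""
    (x, y), (xa, ya), (xb, yb), (xc, yc) = p, a, b, c
    det = (yb - yc) * (xa - xc) + (xc - xb) * (ya - yc)
    wa = ((yb - yc) * (x - xc) + (xc - xb) * (y - yc)) / det
    wb = ((yc - ya) * (x - xc) + (xa - xc) * (y - yc)) / det
    return wa, wb, 1.0 - wa - wb

def interpolate_pixel(p, corners, intensities):
    """Intensity at pixel p from the three nearest imaging locations,
    weighted and superimposed barycentrically."""
    wa, wb, wc = barycentric_weights(p, *corners)
    return wa * intensities[0] + wb * intensities[1] + wc * intensities[2]
```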
- a suitable parameterization can be adapted to the intensity distribution in order to obtain a brightness or amplitude value.
- a low-resolution image may be generated which has the advantage of regularly arranged pixels, thus allowing mathematically simpler post-processing of the image.
- The pixel size with which the image of the imaging locations is sampled is not fixed, but freely selectable and adaptable to the respective needs or conditions.
- The device for generating a high-resolution image 100 then performs a motion estimation between the individual images of the series of low-resolution images, it being irrelevant whether the low-resolution images consist of pixels sampled on a rectangular grid or on the grid of the imaging locations.
- The successive low-resolution images are compared to determine whether the entire image, or portions of it, can be detected at other geometric locations in successive images, i.e. whether portions of the image or the whole image have moved on the sensor 104 relative to the previous recording. It should be noted that the imaging locations 114a to 114c themselves do not, of course, move on the sensor 104, because the fiber bundle 102 is rigidly arranged with respect to the sensor 104.
- A movement is thus detected via the light intensities changing at the imaging locations 114a to 114c. If, for example, mutually corresponding image areas have been identified in two successive images and have shifted relative to one another, a motion vector can be defined that indicates the movement of the image area from one image to the next.
- the relative movement of mutually corresponding image areas can be determined with an accuracy that exceeds that of the original image resolution, ie the distance between adjacent image locations.
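One common way to obtain such sub-pixel motion vectors, sketched here purely as an illustration and not as the patent's specific estimator, is to locate the cross-correlation peak of two frames and refine its position by fitting a parabola through the peak and its two neighbours:

```python
import numpy as np

def subpixel_shift(ref, moved):
    """Estimate the (dy, dx) translation of `moved` relative to `ref`
    with sub-pixel accuracy: find the FFT cross-correlation peak and
    refine it with a parabola through the peak and its neighbours.
    Assumes the true shift is small compared to the image size."""
    corr = np.real(np.fft.ifft2(np.fft.fft2(moved) * np.conj(np.fft.fft2(ref))))
    peak = np.unravel_index(np.argmax(corr), corr.shape)

    def refine(axis):
        n = corr.shape[axis]
        i = peak[axis]
        # 1D slice through the peak along the requested axis.
        line = corr.take(peak[1 - axis], axis=1 - axis)
        ym, y0, yp = line[(i - 1) % n], line[i], line[(i + 1) % n]
        denom = ym - 2.0 * y0 + yp
        # Vertex of the parabola through the three samples.
        offset = 0.5 * (ym - yp) / denom if denom != 0 else 0.0
        shift = i + offset
        return shift - n if shift > n / 2 else shift  # signed wrap-around

    return refine(0), refine(1)
```

The parabolic refinement is what pushes the accuracy below the pixel (or imaging-location) spacing, as the passage requires.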
- The apparatus for generating a high-resolution image 100 then performs a back-transformation of the moved objects to a reference time, for example to a previous image, and the intensity or amplitude values of the individual back-transformed low-resolution images are superimposed.
- For image areas that have changed or moved in the course of the sequence of low-resolution images, new pixels are thereby added which, owing to the higher accuracy of the motion vectors, need not lie on the original image grid.
- This situation is illustrated by the dashed pixels 116a to 116c in FIG.
- The pixels 116a to 116c newly added by back-transformation and subsequent superposition have the effect that the object is observed in the original image with a higher resolution than the sensor, or the arrangement of sensor and light guides, intrinsically allows.
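The back-transformation and superposition can be sketched as a simple "shift-and-add" scheme on a refined grid. Nearest-neighbour placement and the function name are simplifying assumptions; the patent does not prescribe this particular accumulation:

```python
import numpy as np

def shift_and_add(frames, shifts, factor=2):
    """Combine low-resolution frames into one higher-resolution image.

    Each frame k is assumed to show the scene displaced by the
    sub-pixel motion vector shifts[k] = (dy, dx) relative to the
    reference.  Every low-resolution sample is transformed back to the
    reference frame, placed on a grid refined by `factor`, and
    accumulated; averaging the accumulator yields the reconstruction.
    """
    h, w = frames[0].shape
    acc = np.zeros((h * factor, w * factor))
    cnt = np.zeros_like(acc)
    yy, xx = np.mgrid[:h, :w]
    for frame, (dy, dx) in zip(frames, shifts):
        # Back-transform sample positions to the reference image...
        fy = np.rint((yy - dy) * factor).astype(int) % (h * factor)
        fx = np.rint((xx - dx) * factor).astype(int) % (w * factor)
        # ...and superimpose the intensities on the fine grid.
        np.add.at(acc, (fy, fx), frame)
        np.add.at(cnt, (fy, fx), 1)
    return acc / np.maximum(cnt, 1)
```

With four frames shifted by half a pixel in each direction and `factor=2`, every cell of the fine grid receives exactly one sample, illustrating how sub-pixel motion supplies the missing image information.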
- The apparatus for generating a high-resolution image 100 thus makes it possible, by processing a sequence of recorded intensity images while taking account of the imaging parameters, to produce a high-resolution image that has a higher resolution than any single intensity image.
- the device can be used in such a way that either a single stationary image is generated, or that a continuous sequence of images, ie an image sequence or a film with increased resolution is generated.
- the number of individual intensity images, which are used to generate a high-resolution image can be freely adapted to the respective circumstances.
- a criterion can be, for example, the desired increase in resolution, or the delay that inevitably occurs due to the acquisition of several images until a high-resolution image is generated.
- The delay may be relevant, for example, if real-time observation by means of the method according to the invention is intended, in which case a series of intensity images naturally has to be recorded and processed before the first high-resolution image of the sequence can be made visible on a screen.
- the high-resolution image can be scanned or generated in a freely selectable resolution, wherein the scanning grating can correspond, for example, to the physical resolution of a screen for displaying the result.
- a sequence of intensity images is first to be recorded in an image acquisition step 150.
- In a generating step 152, low-resolution images are generated from the intensity images using imaging parameters that indicate, for example, the imaging locations of the optical fibers 106 on the sensor 104.
- In a motion estimation step, the successive low-resolution images are analyzed and relative motion vectors are determined for moved parts of the image, or for the case that the entire image has moved.
- In a synthesis step 156, the high-resolution image is combined from the low-resolution images by transforming back the detected motions and superimposing the back-transformed images on one another.
- Finally, the generated high-resolution image can be sampled in a, for example, rectangular coordinate grid for output to an external display device.
- Fig. 4 shows experimental results as can be obtained by applying a resolution enhancement method. Shown is the result of a processing of images from a flexible endoscope with fiber optic image guide.
- Fig. a shows the original image as recorded by the sensor at the proximal end of the endoscope.
- Fig. b shows how barycentric interpolation can generate an image in which the honeycomb pattern is reduced or suppressed.
- Figures c to e show from left to right the application of the method with increasing complexity.
- In Fig. c the result using five frames is shown,
- in Fig. d the result using ten frames,
- and in Fig. e the result using fifteen frames to increase the resolution.
- As can be seen, the resolution improves continuously.
- FIG. 5 shows an example of how the use of the method can improve the readability of a writing recorded by means of a fiber optic endoscope.
- FIG. A an enlarged section of a receptacle after the reduction of the honeycomb structure is shown in FIG. A, the FIGURE b showing the same section according to the method of motion compensation according to the application.
- As can be seen, the motion-detection method increases the resolution in real terms, since parts of the writing are legible only in partial figure b.
- the advantage of this method is that information about the structure of the observed object need not be known in order to increase the resolution.
- a straight line 200 as can be seen, for example, in FIG.
- The motion compensation was examined and compared with different image material, i.e. with legible and illegible, structured and unstructured, as well as transversally and longitudinally shifted content. The method proved effective apart from extremely low-structure scenes, which do not contribute to motion detection and thus offer no basis for increasing the spatial resolution. In particular, the number of images used around a base image and the differing structure in the image were examined as factors influencing subjective perception. Implementation of the procedure for the post-processing of fiberscopic imagery confirms the significant increase in detail (readable writing, edges, structures) when several single exposures are combined.
- the method is suitable for both transverse and longitudinal movement of the endoscope tip and is robust against non-homogeneous movement within the images.
- the method described above can be classified or designated as resolution enhancement in the spatial domain with interpolation from non-uniformly distributed grids.
- a modular separation between the localization of image information, the interpolating motion estimation, the generation of a high-resolution grid and its Cartesian scanning with barycentric interpolation of supporting points enables the separate use and implementation of the respective procedures in software and hardware.
- the adaptation and optimization of the method to any grid structure for obtaining higher-resolution image information offers three major advantages:
- the method can be realized by parallelizing process steps in signal processor technology and thus be used as a module to increase resolution directly between the digitizing sensors after the fiber optics and a display / playback or further processing.
- this step may be supplemented by a preprocessing step, e.g. interpolation of the image material.
- the actual physical properties of a fiberscopic image recording system are exploited in order to efficiently remove the image artifacts or the interfering structures, while at the same time retaining the entire image information.
- the method according to the invention for producing a structure-free fiberscopic recording is based on firstly carrying out a calibration of the fiberscopic system and, on the basis of the findings obtained during the calibration, performing image reconstruction steps in real time.
- the calibration only has to be done once, but the reconstruction is required for every single frame of an image stream.
- one of the aims of the calibration is to determine the centers of all optical fibers with an accuracy finer than the size of the pixels of the sensor (i.e. with subpixel accuracy).
- an image used for the calibration can be obtained, for example, by recording a homogeneous white surface (for example a sheet of paper).
- an artifact-free output image is reconstructed, which is constructed on the basis of the intensity values obtained at the locations of the centers of the light guides with subpixel accuracy.
- the position of the centers of the optical fibers is determined by means of a calibration image in three stages. First, potential candidate points for the position of the center of a light guide are determined with pixel accuracy (that is, with a location resolution corresponding to the physical size of the pixels of the sensor). A significance value that indicates how well a two-dimensional Gaussian parameterization with specified parameters describes the intensity distribution around the candidate point is assigned to each candidate point.
- an asymmetric two-dimensional Gaussian parameterization is fitted to the intensity acquired by the sensor (for example, using nonlinear optimization).
- the fitting results in a position of the light guide center with a subpixel accuracy, with a further significance value being obtained for each adjustment made.
- the actual centers of the optical fibers are then determined from the group of candidate points based on the significance values, and in addition a distance criterion describing the minimum reasonable distance between two adjacent centers of optical fibers is used.
- the group of adjacent light guides which is suitable for the application of the distance criterion can be determined, for example, by a Delaunay triangulation of the centers of the light guides, which can also serve as the basis for a subsequent image reconstruction.
- FIG. 6 shows three images, the left image 6 a representing a two-dimensional intensity image, as it is recorded by a fiber bundle during the calibration by means of a pixel sensor.
- the structures assigned to the individual optical fibers are clearly recognizable, but the exact position of the center of the intensity distribution cannot be determined directly, since several pixels are exposed with high intensity by each fiber.
- the candidate points are selected based on their brightness, which is set in relation to the brightness of the local neighborhood of the candidate point.
- a symmetrical rectangular neighborhood N is used.
- the size of the neighborhood preferably depends, for example, on the diameter of a single optical fiber, which is approximately constant for all optical fibers of a fiber bundle.
- the coordinate system 300 is defined in FIG. 6 a, so that a coordinate in the coordinate system 300 can be assigned to individual pixels.
- for each position (X, Y) in the calibration image, the minimum and maximum intensities Imin and Imax within the neighborhood N(X, Y) are first determined and their coordinates (Xmin, Ymin) and (Xmax, Ymax) localized.
- the point (Xmax, Ymax) is identified as a candidate point for a center of a light guide if and only if Imax − Imin > Dmin.
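The candidate search just described can be sketched as follows; this is a minimal, unoptimized illustration, where `find_candidates`, `radius` and `d_min` are assumed names (the patent does not prescribe an implementation):

```python
import numpy as np

def find_candidates(img, radius, d_min):
    """Scan a calibration image for candidate fiber-center points.

    For every pixel, the minimum and maximum intensity within a square
    neighborhood of the given radius are determined; the position of the
    maximum is kept as a candidate if and only if I_max - I_min > d_min.
    """
    h, w = img.shape
    candidates = set()
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            patch = img[y - radius:y + radius + 1, x - radius:x + radius + 1]
            i_min, i_max = patch.min(), patch.max()
            if i_max - i_min > d_min:
                # coordinates of the neighborhood maximum, returned as (x, y)
                dy, dx = np.unravel_index(patch.argmax(), patch.shape)
                candidates.add((x - radius + dx, y - radius + dy))
    return candidates
```

Because every window containing the same local maximum reports the same coordinates, the set automatically deduplicates candidates.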
- a significance value indicating how well a two-dimensional symmetric Gaussian distribution describes the data in the vicinity of a candidate point is obtained by comparing the surroundings of the candidate point with a template of the Gaussian distribution (template matching).
- a template window corresponding to a neighborhood N is filled with the discrete values of a symmetric two-dimensional Gaussian distribution centered in the template window. For each template window, each of which corresponds to a candidate point pi, a significance value s(pi) corresponding to the common RMS metric is calculated: s(pi) = √( (1/|N(pi)|) · Σ(x,y) (T(x, y) − I(x, y))² )
- T(x, y) and I(x, y) correspond to the intensities of the individual pixels at the positions (x, y) of the template window and of the neighborhood N(pi) of the candidate point pi, respectively.
- points with a lower significance value, or rather their surroundings, agree better with the template environment. That is, the probability that the candidate point pi in question lies near the actual center of the optical fiber is higher for points with low significance values than for points with high ones.
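A sketch of this template matching, assuming a unit-amplitude symmetric Gaussian template and intensities normalized to the same scale; `gaussian_template` and `significance` are illustrative names:

```python
import numpy as np

def gaussian_template(radius, sigma):
    # discrete symmetric 2-D Gaussian filling a (2*radius+1)^2 template window
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    return np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))

def significance(img, x, y, template):
    # RMS distance between the template and the candidate's neighborhood;
    # lower values mean better agreement with the Gaussian model
    r = template.shape[0] // 2
    patch = img[y - r:y + r + 1, x - r:x + r + 1]
    return float(np.sqrt(np.mean((patch - template) ** 2)))
```

A neighborhood that exactly matches the template yields significance 0; the further the intensity distribution departs from the Gaussian model, the larger s(pi) becomes.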
- FIG. 6b shows the result of the above operation, i.e. the candidate points and the associated significance values in grayscale representation.
- the candidate points are now reevaluated by matching (fitting) two-dimensional, asymmetrical Gaussian distributions to the surroundings of the candidate points.
- a two-dimensional, non-symmetric Gaussian distribution f(x, y; v) is defined by a parameter vector v = (μx, μy, σx, σy, a), which comprises the center of the Gaussian distribution (μx, μy), the standard deviations σx and σy in the x and y directions, and an amplitude value a.
- the full Gaussian function is thus written as: f(x, y; v) = a · exp(−(x − μx)² / (2σx²) − (y − μy)² / (2σy²)) (equation 1)
- the above function can be fitted to the neighborhood N(pi) by nonlinear optimization of the following quadratic distance problem, varying the vector v: r(pi) = min over v of Σ(x,y)∈N(pi) (f(x, y; v) − I(x, y))² (equation 2)
- r(pi) is also referred to as the residual.
- various minimization algorithms can be used, such as the Levenberg-Marquardt method.
- with convergent optimization for a candidate point pi, one obtains an improved subpixel-accurate position (μx, μy) of the fiber center as the center of the Gaussian distribution, the residual r(pi) serving as a refined significance value for the candidate point pi.
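The fit can be sketched with SciPy's Levenberg-Marquardt solver. The parameterization follows the vector v = (μx, μy, σx, σy, a) above; the window radius and initial values are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

def gauss2d(x, y, v):
    # asymmetric 2-D Gaussian of equation 1, v = (mu_x, mu_y, sigma_x, sigma_y, a)
    mx, my, sx, sy, a = v
    return a * np.exp(-((x - mx)**2 / (2*sx**2) + (y - my)**2 / (2*sy**2)))

def refine_center(img, x0, y0, radius):
    """Fit an asymmetric 2-D Gaussian to the neighborhood of a candidate
    point; returns the subpixel center (mu_x, mu_y) and the residual r."""
    ys, xs = np.mgrid[y0 - radius:y0 + radius + 1, x0 - radius:x0 + radius + 1]
    patch = img[y0 - radius:y0 + radius + 1, x0 - radius:x0 + radius + 1]

    def residuals(v):
        return (gauss2d(xs, ys, v) - patch).ravel()

    v0 = np.array([x0, y0, 1.0, 1.0, patch.max()])
    sol = least_squares(residuals, v0, method="lm")  # Levenberg-Marquardt
    return (sol.x[0], sol.x[1]), float(np.sum(sol.fun**2))
```

On noise-free synthetic data the residual approaches zero and the recovered center matches the true subpixel position.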
- the Delaunay triangulation is preferably used, which directly supplies the natural neighbors of each candidate point.
- a triangulation D(P) of P is performed such that no point of P lies within the circumcircle of any triangle of the triangulation.
- a Delaunay triangulation maximizes the minimum interior angle over all triangles of the triangulation.
- closely related is the Voronoi diagram V(P), which is the dual graph of the triangulation.
- Fig. 7a shows a number of points as an example, and the Delaunay triangulation adapted to the point distribution, which is represented by bold lines delimiting the triangles of the triangulation.
- all the points in FIG. 7a lie within regions, the so-called Voronoi cells surrounding the individual points. On the basis of the triangulation, these cells are created by constructing, on every single connecting line between two points created by the triangulation, the perpendicular bisector; these bisectors form the boundary lines delimiting the Voronoi cells, as can be seen in FIG. 7a.
- Delaunay triangulation is relevant in that it directly provides the natural neighbors of each point.
- the natural neighbors of a point are defined by the fact that each natural neighbor qi of a point pi is connected to the point pi by an edge of a triangle of the triangulation.
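Using SciPy, the natural neighbors of an inserted point can be read directly off the Delaunay triangulation of the augmented point set; a small sketch with assumed names:

```python
import numpy as np
from scipy.spatial import Delaunay

def natural_neighbors(points, p):
    """Return the indices (into `points`) of the natural neighbors of p,
    i.e. the points connected to p by an edge of the Delaunay
    triangulation of the point set extended by p."""
    pts = np.vstack([points, p])
    tri = Delaunay(pts)
    idx = len(pts) - 1  # index of the inserted point p
    indptr, indices = tri.vertex_neighbor_vertices
    return set(indices[indptr[idx]:indptr[idx + 1]])
```

For a point inserted at the center of a square, all four corners are natural neighbors, matching the intuition of FIG. 7b.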
- FIG. 7b shows the Voronoi structure of the situation of FIG. 7a, with an additionally inserted point p in the center of the point distribution whose natural neighbors are the points q1 ... q5.
- a candidate point, i.e. a coordinate pair (μx, μy), is accepted in the present embodiment of the present invention as the center of a light guide if it has the best significance value r(pi) among the available candidates for the relevant light guide and additionally a distance condition is met.
- the distance condition is based essentially on the fact that the diameter dfiber of the individual optical fibers is substantially constant for all light guides of an endoscope or fiber bundle. Since, for physical reasons, the light guides cannot spatially overlap, a minimum distance dmin between two found centers of adjacent light guides can be required.
- the order in which the individual points are inserted into the triangulation is determined by the significance value, candidates of high significance (low calculated squared distance), i.e. a high probability of being the actual center of the light guide, being inserted first. If such a point of high significance is inserted and if it fulfills the distance criterion, this point is regarded as the center of the respective light guide. The distance criterion ensures that only a single point per light guide (i.e. a single center) is inserted into the Delaunay triangulation. Mathematically, the distance criterion can be described as follows:
- let d(qi) be the distance of the candidate point to be inserted into the triangulation from its natural neighbor qi.
- the candidate point is inserted into the triangulation if and only if, for all i: d(qi) > dmin.
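A minimal sketch of this significance-ordered insertion, assuming candidates are given as (x, y, residual) triples. For brevity the distance check runs over all accepted centers rather than only the natural neighbors supplied by a Delaunay triangulation; the acceptance logic is otherwise the same:

```python
def accept_centers(candidates, d_min):
    """Insert candidates in order of significance (lowest residual first);
    a candidate becomes a fiber center only if it keeps the minimum
    distance d_min to every center accepted so far."""
    accepted = []
    for x, y, r in sorted(candidates, key=lambda c: c[2]):
        # distance criterion: reject candidates closer than d_min
        # to an already accepted center (squared form avoids sqrt)
        if all((x - ax)**2 + (y - ay)**2 > d_min**2 for ax, ay in accepted):
            accepted.append((x, y))
    return accepted
```

Two candidates competing for the same fiber thus resolve in favor of the one with the better (lower) residual, and only one center per fiber survives.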
- a tuple of four values is available as imaging parameters for each individual fiber: the coordinates of the center of the light guide, μx and μy, and the widths of the two-dimensional asymmetric Gaussian distribution, σx and σy.
- Fig. 6c shows as a result of the calibration the fiber centers found for the acquisition of Fig. 6a.
- since these parameters describe geometric properties of the system of sensor and optical fiber, namely the exact imaging location as well as the geometric extent of the intensity distribution caused by a single fiber, they do not change over time.
- the transmitted intensity or the transmitted amplitude value of each individual light guide during the image acquisition and analysis can be determined exactly.
- the intensity information transmitted by a single optical fiber, corresponding to the amplitude a of the two-dimensional Gaussian distribution describing the intensity distribution of a single optical fiber on the sensor, must be determined. Once individual, precisely determined intensity or amplitude values are available for each optical fiber, artifact-free images can be generated by interpolating the intensity information at the centers of the optical fibers into a higher-resolution grid, so that a continuous image representation in the higher-resolution grid is achieved, as will be explained in more detail below. For this purpose, the exact determination of the amplitude value a from equation 1 is first required.
- during calibration, the characteristic properties of each light guide, i.e. the parameter vector v, were obtained. Both the center of the light guide (μx, μy) and the standard deviations σx and σy are constant over time, regardless of which image content is recorded with the fiberscope. However, the amplitude value a changes with changing image content and has to be redetermined for every shot. This can preferably be done, for example, by solving a simple quadratic distance minimization problem (see equation 3):
- a good initial value for the parameter a to be determined is, for example, the intensity I(μx, μy), which can be obtained by bilinear interpolation at the subpixel-accurate location of the center of the optical waveguide (μx, μy).
- the minimization of equation 3 can be carried out in any suitable manner, for example using the Newton-Raphson method.
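As an illustration (not the patent's exact formulation): for fixed geometric parameters the model is linear in a, so the quadratic problem even admits a closed-form minimizer, to which iterative schemes such as Newton-Raphson converge. Window radius and helper names are assumptions:

```python
import numpy as np

def fiber_amplitude(img, mu_x, mu_y, sigma_x, sigma_y, radius):
    """Per-frame amplitude of one fiber: a = argmin sum (I - a*g)^2,
    where g is the unit-amplitude Gaussian fixed by the calibration
    parameters. Because the model is linear in a, the minimizer has
    the closed form a = sum(I*g) / sum(g*g)."""
    x0, y0 = int(round(mu_x)), int(round(mu_y))
    ys, xs = np.mgrid[y0 - radius:y0 + radius + 1, x0 - radius:x0 + radius + 1]
    # unit-amplitude Gaussian from the time-invariant calibration parameters
    g = np.exp(-((xs - mu_x)**2 / (2*sigma_x**2) + (ys - mu_y)**2 / (2*sigma_y**2)))
    patch = img[y0 - radius:y0 + radius + 1, x0 - radius:x0 + radius + 1]
    return float(np.sum(patch * g) / np.sum(g * g))
```

Since the geometric parameters stay fixed, g (and even the denominator) can be precomputed once per fiber, leaving only a weighted sum per frame.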
- the method according to the invention for determining the amplitude value has the great advantage that the real image information, which is present in the form of the imaging parameters, is used to determine the amplitude value.
- elliptical intensity distributions on the sensor surface can thus be adequately taken into account; these occur, for example, when an optical fiber meets the surface of the sensor at an angle other than 90°.
- a continuous, two-dimensional image can be interpolated from the discrete pixels and the amplitude values of the optical fibers associated therewith.
- Voronoi cells are used to determine the interpolated intensities , as described below.
- the concept of natural neighbors is used to determine neighboring points of points of a non-uniformly distributed set of points.
- the natural neighbors of a point p ∉ P are defined as the neighbors of the point p in the Delaunay triangulation of the point set P ∪ {p}. This is equivalent to the natural neighbors being those points of P whose Voronoi cells are truncated when the additional point p is inserted into the Voronoi diagram, as illustrated, for example, with reference to Figure 7c.
- let V(qi) be the Voronoi cell of the natural neighbor qi of p, and let V(p) be the Voronoi cell of the point p additionally inserted into the triangulation, as shown in Fig. 7c; then the following considerations can be made.
- the natural region N(p, qi) is defined as that part of V(qi) which is removed from the Voronoi cell V(qi) of qi by the Voronoi cell V(p) upon insertion of the point p.
- N(p, qi) is thus V(p) ∩ V(qi).
- A(qi) denotes the area of N(p, qi).
- the natural coordinate λqi(p) of qi is then defined as the area ratio: λqi(p) = A(qi) / Σj A(qj)
- the area around the point P shown hatched in FIG. 7C is, for example, the natural area N (p, q 3 ).
- the natural coordinates have the following important properties.
- the third property just described, which relates to the local-coordinate property, can now be used to interpolate the intensity of an additionally inserted picture element (pixel) p from the intensities determined at the centers of the light guides qi. It should be noted that when a point p is added to the triangulation, the natural neighbors qi of the triangulation, whose positions were determined during triangulation, are used to determine the intensity or amplitude value I(p).
- I(p) is the convex sum of the intensities I(qi): I(p) = Σi λqi(p) · I(qi)
- the resulting image is a continuous, once differentiable function of the intensities I(qi) at the nodes, the centers of the optical fibers qi.
- since both the natural neighbors and the natural coordinates of a pixel p remain constant over time as long as the size, i.e. the resolution, of the reconstructed interpolated image is not changed, both the natural neighbors and the natural coordinates need to be calculated only once per configuration and can be stored in the form of a lookup table. Once this complex calculation has been carried out, individual images can be reconstructed without problems in real time.
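The precompute-once / apply-per-frame pattern can be sketched as follows. SciPy ships no natural-neighbor (Sibson) interpolator, so this sketch substitutes barycentric weights over the Delaunay triangulation of the fiber centers; the lookup-table idea is the same, and all names are illustrative:

```python
import numpy as np
from scipy.spatial import Delaunay

def build_lookup(centers, out_w, out_h):
    """Precompute, once per configuration, which fiber centers contribute
    to each output pixel and with which weights."""
    tri = Delaunay(centers)
    gy, gx = np.mgrid[0:out_h, 0:out_w]
    pix = np.column_stack([gx.ravel(), gy.ravel()]).astype(float)
    simplex = tri.find_simplex(pix)
    # the affine transform of each simplex yields the barycentric coordinates
    T = tri.transform[simplex]
    b = np.einsum('nij,nj->ni', T[:, :2], pix - T[:, 2])
    weights = np.column_stack([b, 1.0 - b.sum(axis=1)])
    return simplex, tri.simplices[simplex], weights

def reconstruct(amplitudes, lookup, out_w, out_h):
    # per-frame step: just a weighted sum using the precomputed lookup table
    simplex, verts, weights = lookup
    vals = np.where(simplex >= 0, (amplitudes[verts] * weights).sum(axis=1), 0.0)
    return vals.reshape(out_h, out_w)
```

The expensive triangulation and weight computation happen once in `build_lookup`; `reconstruct` then reduces each frame to one multiply-add per pixel, which is what enables real-time operation.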
- FIG. 8 shows an example of how, using this concept according to the invention, an artifact-free image can be obtained from an artifact-laden one: from the untreated fiberscopic image clearly showing the honeycomb structure (Fig. 8a), an artifact-free, continuous two-dimensional image (Fig. 8b) is reconstructed, which is excellently suited, for example, for performing a subsequent motion compensation based on the reconstructed artifact-free image.
- FIG. 9 shows a flowchart in which the essential steps for generating an image according to the invention are shown.
- imaging parameters of a light guide are initially provided which describe geometric properties of an intensity profile generated by the light guide on the sensor.
- an intensity image is recorded by the sensor.
- an amplitude value ai is determined for each optical fiber by fitting a function of the amplitude value and the imaging parameters to the intensity recording.
- the amplitude values are used as image information of the individual optical fibers to generate the image.
- FIG. 10 shows a flow chart for a method according to the invention for generating imaging parameters of a light guide.
- an intensity distribution generated by the light guide on the sensor is recorded.
- imaging parameters are determined by fitting a function of the imaging parameters and an amplitude value to the intensity distribution within a preliminary fitting neighborhood, the extent of which depends on the nature of the light guide.
- in step 514 it is checked whether the location information in the imaging parameters indicates an optical waveguide position that has a predetermined minimum distance to the positions of the other optical waveguides.
- if not, the method follows an iteration path and determines imaging parameters for a next candidate point of the light guide to be found.
- otherwise, the method follows a memory path 518 and the imaging parameters for the candidate point are stored in a memory step 520.
- the present invention thus describes a concept for removing artifacts or unwanted structures within fiberscopic recordings by means of a processing algorithm that exploits the actual physical properties of the fiberscopic imaging, without altering the actual image information.
- the removal of the fiberscopic honeycomb structure in real time is made possible, in particular, by the method according to the invention.
- any other photosensitive sensor arrays are also suitable for implementing the method according to the invention, such as arrays of photodiodes and other photosensitive elements such as photocathodes and photomultiplier tubes.
- although the most homogeneous possible illumination of the fiber bundle is advantageous for determining the fiber center positions, i.e. the imaging locations of the light guides on the sensor, the fiber centers can also be determined with inhomogeneous illumination.
- a different light sensitivity of individual pixels of the sensor can also be taken into account, which, for example, can additionally be stored in the imaging parameters if they were obtained with homogeneous illumination of the fiber bundle.
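A per-pixel sensitivity correction derived from a homogeneously illuminated calibration frame might look like this; `sensitivity_map` and `correct` are hypothetical helpers, since the patent only notes that such sensitivities can additionally be stored with the imaging parameters:

```python
import numpy as np

def sensitivity_map(white_frame, expected_level):
    # per-pixel gain estimated from a homogeneously illuminated recording;
    # could be stored alongside the imaging parameters during calibration
    return white_frame / float(expected_level)

def correct(frame, gain, eps=1e-6):
    # divide out the per-pixel sensitivity before amplitude estimation;
    # eps guards against division by (near-)zero gains of dead pixels
    return frame / np.maximum(gain, eps)
```

Applying `correct` before the per-fiber amplitude fit prevents sensor nonuniformity from being misread as fiber intensity.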
- the method according to the invention for generating a high-resolution image can be implemented in hardware or in software.
- the implementation can be carried out on a digital storage medium, in particular a floppy disk or CD with electronically readable control signals, which can interact with a programmable computer system in such a way that the method according to the invention for producing a high-resolution image is executed.
- the invention thus also consists in a computer program product with a program code stored on a machine-readable carrier for carrying out the method according to the invention, when the computer program product runs on a computer.
- the invention can thus be realized as a computer program with a program code for carrying out the method when the computer program runs on a computer.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Image Input (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
- Image Processing (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102006011707A DE102006011707B4 (de) | 2006-03-14 | 2006-03-14 | Verfahren und Vorrichtung zum Erzeugen einer strukturfreien fiberskopischen Aufnahme |
PCT/EP2007/002218 WO2007104542A1 (de) | 2006-03-14 | 2007-03-13 | Verfahren und vorrichtung zum erzeugen einer strukturfreien fiberskopischen aufnahme |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1994501A1 true EP1994501A1 (de) | 2008-11-26 |
Family
ID=38042838
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP07723228A Withdrawn EP1994501A1 (de) | 2006-03-14 | 2007-03-13 | Verfahren und vorrichtung zum erzeugen einer strukturfreien fiberskopischen aufnahme |
Country Status (4)
Country | Link |
---|---|
US (1) | US7801405B2 (de) |
EP (1) | EP1994501A1 (de) |
DE (1) | DE102006011707B4 (de) |
WO (1) | WO2007104542A1 (de) |
Families Citing this family (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8866920B2 (en) | 2008-05-20 | 2014-10-21 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US11792538B2 (en) | 2008-05-20 | 2023-10-17 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
JP2010253156A (ja) * | 2009-04-28 | 2010-11-11 | Fujifilm Corp | 内視鏡システム、内視鏡、並びに内視鏡駆動方法 |
US8514491B2 (en) | 2009-11-20 | 2013-08-20 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US8878950B2 (en) | 2010-12-14 | 2014-11-04 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using super-resolution processes |
AU2012257496B2 (en) * | 2011-05-16 | 2017-05-25 | Mauna Kea Technologies | Continuous and real-time calibration of fiber-based microscopic images |
WO2013049699A1 (en) | 2011-09-28 | 2013-04-04 | Pelican Imaging Corporation | Systems and methods for encoding and decoding light field image files |
WO2013175686A1 (ja) * | 2012-05-22 | 2013-11-28 | パナソニック株式会社 | 撮像処理装置および内視鏡 |
WO2014005123A1 (en) | 2012-06-28 | 2014-01-03 | Pelican Imaging Corporation | Systems and methods for detecting defective camera arrays, optic arrays, and sensors |
US20140002674A1 (en) | 2012-06-30 | 2014-01-02 | Pelican Imaging Corporation | Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors |
EP3869797B1 (de) | 2012-08-21 | 2023-07-19 | Adeia Imaging LLC | Verfahren zur tiefenerkennung in mit array-kameras aufgenommenen bildern |
US8866912B2 (en) | 2013-03-10 | 2014-10-21 | Pelican Imaging Corporation | System and methods for calibration of an array camera using a single captured image |
WO2014164550A2 (en) * | 2013-03-13 | 2014-10-09 | Pelican Imaging Corporation | System and methods for calibration of an array camera |
US9578259B2 (en) | 2013-03-14 | 2017-02-21 | Fotonation Cayman Limited | Systems and methods for reducing motion blur in images or video in ultra low light with array cameras |
EP2973476A4 (de) | 2013-03-15 | 2017-01-18 | Pelican Imaging Corporation | Systeme und verfahren zur stereobildgebung mit kameraarrays |
US9497429B2 (en) | 2013-03-15 | 2016-11-15 | Pelican Imaging Corporation | Extended color processing on pelican array cameras |
US10122993B2 (en) | 2013-03-15 | 2018-11-06 | Fotonation Limited | Autofocus system for a conventional camera that uses depth information from an array camera |
US10119808B2 (en) | 2013-11-18 | 2018-11-06 | Fotonation Limited | Systems and methods for estimating depth from projected texture using camera arrays |
US9426361B2 (en) | 2013-11-26 | 2016-08-23 | Pelican Imaging Corporation | Array camera configurations incorporating multiple constituent array cameras |
WO2015134996A1 (en) | 2014-03-07 | 2015-09-11 | Pelican Imaging Corporation | System and methods for depth regularization and semiautomatic interactive matting using rgb-d images |
US8995798B1 (en) * | 2014-05-27 | 2015-03-31 | Qualitrol, Llc | Reflective element for fiber optic sensor |
CN107077743B (zh) | 2014-09-29 | 2021-03-23 | 快图有限公司 | 用于阵列相机的动态校准的系统和方法 |
DE102015002301B4 (de) | 2015-02-24 | 2022-11-10 | Johann Biener | Geräte zur optischen Beobachtung von astronomischen Vorgängen und Bildern, mit Ersatz von Hohlspiegeln durch Pixel-Digitaltechnik |
CN107678153B (zh) * | 2017-10-16 | 2020-08-11 | 苏州微景医学科技有限公司 | 光纤束图像处理方法和装置 |
CN107621463B (zh) * | 2017-10-16 | 2024-03-22 | 苏州微景医学科技有限公司 | 图像重建方法、装置及显微成像装置 |
US12053147B2 (en) * | 2017-12-18 | 2024-08-06 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Multi-field miniaturized micro-endoscope |
EP3518017B1 (de) * | 2018-01-24 | 2020-06-17 | Technische Universität Dresden | Verfahren und faseroptisches system zur beleuchtung und detektion eines objekts mit licht |
WO2021055585A1 (en) | 2019-09-17 | 2021-03-25 | Boston Polarimetrics, Inc. | Systems and methods for surface modeling using polarization cues |
MX2022004162A (es) | 2019-10-07 | 2022-07-12 | Boston Polarimetrics Inc | Sistemas y metodos para el aumento de sistemas de sensores y sistemas de formacion de imagenes con polarizacion. |
DE102019132384A1 (de) * | 2019-11-28 | 2021-06-02 | Carl Zeiss Meditec Ag | Verfahren zum Erstellen eines hochaufgelösten Bildes, Datenverarbeitungssystem und optisches Beobachtungsgerät |
KR20230116068A (ko) | 2019-11-30 | 2023-08-03 | 보스턴 폴라리메트릭스, 인크. | 편광 신호를 이용한 투명 물체 분할을 위한 시스템및 방법 |
CN115552486A (zh) | 2020-01-29 | 2022-12-30 | 因思创新有限责任公司 | 用于表征物体姿态检测和测量系统的系统和方法 |
WO2021154459A1 (en) | 2020-01-30 | 2021-08-05 | Boston Polarimetrics, Inc. | Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images |
DE102020107519A1 (de) | 2020-03-18 | 2021-09-23 | Carl Zeiss Meditec Ag | Vorrichtung und Verfahren zum Klassifizieren eines Gehirngewebeareals, Computerprogramm, nichtflüchtiges computerlesbares Speichermedium und Datenverarbeitungsvorrichtung |
US11953700B2 (en) | 2020-05-27 | 2024-04-09 | Intrinsic Innovation Llc | Multi-aperture polarization optical systems using beam splitters |
US12020455B2 (en) | 2021-03-10 | 2024-06-25 | Intrinsic Innovation Llc | Systems and methods for high dynamic range image reconstruction |
US12069227B2 (en) | 2021-03-10 | 2024-08-20 | Intrinsic Innovation Llc | Multi-modal and multi-spectral stereo camera arrays |
US11954886B2 (en) | 2021-04-15 | 2024-04-09 | Intrinsic Innovation Llc | Systems and methods for six-degree of freedom pose estimation of deformable objects |
US11290658B1 (en) | 2021-04-15 | 2022-03-29 | Boston Polarimetrics, Inc. | Systems and methods for camera exposure control |
US12067746B2 (en) | 2021-05-07 | 2024-08-20 | Intrinsic Innovation Llc | Systems and methods for using computer vision to pick up small objects |
US11689813B2 (en) | 2021-07-01 | 2023-06-27 | Intrinsic Innovation Llc | Systems and methods for high dynamic range imaging using crossed polarizers |
WO2023055617A1 (en) * | 2021-09-29 | 2023-04-06 | Corning Research & Development Corporation | Relative mode transmission loss measurement of a connectorized fiber optic cable |
KR102553001B1 (ko) | 2021-10-05 | 2023-07-10 | 한국과학기술연구원 | 인공지능 기반의 이미지의 허니컴 아티팩트 제거 방법 및 장치 |
CN117710250B (zh) * | 2024-02-04 | 2024-04-30 | 江苏无右微创医疗科技有限公司 | 一种消除纤维镜成像蜂窝状结构的方法 |
CN118537273A (zh) * | 2024-07-26 | 2024-08-23 | 深圳大学 | 一种光纤图像中蜂窝样伪影去除和重构方法 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2708507A1 (de) * | 1977-02-26 | 1978-08-31 | Leitz Ernst Gmbh | Bilduebertragungseinrichtung zur untersuchung von unzugaenglichen partien eines objektes |
DE4318140C2 (de) * | 1993-06-01 | 1996-07-18 | Fraunhofer Ges Forschung | Verfahren zur Zuordnung der einkoppelseitigen Enden der einzelnen Lichtleitfasern eines Lichtleiterbündels zu den auskoppelseitigen Enden dieser Lichtleitfasern |
WO1997042600A1 (fr) * | 1996-05-02 | 1997-11-13 | Andromis S.A. | Procede de traitement d'images obtenues par fibres multicoeurs ou multifibres, en particulier d'images endoscopiques |
DE19812285A1 (de) * | 1998-03-20 | 1999-09-23 | Philips Patentverwaltung | Bildgebendes Verfahren für medizinische Untersuchungen |
AU4500100A (en) | 1999-04-30 | 2000-11-17 | Digital Optical Imaging Corporation | Methods and apparatus for improved depth resolution using out-of-focus information in microscopy |
US6663560B2 (en) * | 1999-12-17 | 2003-12-16 | Digital Optical Imaging Corporation | Methods and apparatus for imaging using a light guide bundle and a spatial light modulator |
US6885801B1 (en) * | 2001-12-06 | 2005-04-26 | Clear Image Technology Llc | Enhancement of fiber based images |
FR2842628B1 (fr) * | 2002-07-18 | 2004-09-24 | Mauna Kea Technologies | "procede de traitement d'une image acquise au moyen d'un guide compose d'une pluralite de fibres optiques" |
US20040037554A1 (en) | 2002-08-23 | 2004-02-26 | Ferguson Gary William | Non-coherent fiber optic apparatus and imaging method |
-
2006
- 2006-03-14 DE DE102006011707A patent/DE102006011707B4/de not_active Expired - Fee Related
-
2007
- 2007-03-13 WO PCT/EP2007/002218 patent/WO2007104542A1/de active Application Filing
- 2007-03-13 US US12/282,697 patent/US7801405B2/en not_active Expired - Fee Related
- 2007-03-13 EP EP07723228A patent/EP1994501A1/de not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See references of WO2007104542A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2007104542A1 (de) | 2007-09-20 |
US7801405B2 (en) | 2010-09-21 |
DE102006011707B4 (de) | 2010-11-18 |
DE102006011707A1 (de) | 2007-09-20 |
US20090092363A1 (en) | 2009-04-09 |
Similar Documents
Publication | Title |
---|---|
DE102006011707B4 (de) | Verfahren und Vorrichtung zum Erzeugen einer strukturfreien fiberskopischen Aufnahme |
DE60023495T2 (de) | Verfahren und vorrichtung zur bildübertragung mittels nicht kohärentem optischen faserbündel |
EP1943625B1 (de) | Verfahren und vorrichtung zur rekonstruktion von bildern |
DE602004008681T2 (de) | Mikroskop-System und Verfahren |
DE102017220104A1 (de) | Linsensystem mit variabler Brennweite mit Mehrebenen-Bildverarbeitung mit erweiterter Tiefenschärfe |
DE102007029884A1 (de) | Verfahren und Einrichtung zum Erzeugen eines aus einer Mehrzahl von endoskopischen Einzelbildern zusammengesetzten Gesamtbildes von einer Innenoberfläche eines Körperhohlraums |
EP3293558B1 (de) | Vorrichtung zum Erfassen eines Stereobilds |
DE102013021542A1 (de) | Mikroskop und Verfahren zur SPIM-Mikroskopie |
EP2791896B1 (de) | Verfahren zur Erzeugung von superaufgelösten Bildern mit verbesserter Bildauflösung und Messvorrichtung |
DE112014006672T5 (de) | Bildverarbeitungsvorrichtung, Bildgebungsvorrichtung, Mikroskopsystem, Bildverarbeitungsverfahren und ein Bildverarbeitungsprogramm |
DE102015011427B4 (de) | Bildaufnahmesystem und Bildauswertesystem |
WO2022043438A1 (de) | Verfahren, Bildverarbeitungseinheit und Laserscanningmikroskop zum hintergrundreduzierten Abbilden einer Struktur in einer Probe |
EP2147346B1 (de) | Vorrichtung und Verfahren zur Kompensation von Farbverschiebungen in faseroptischen Abbildungssystemen |
EP2494522B1 (de) | Verfahren zur Bestimmung eines Satzes von optischen Abbildungsfunktionen für die 3D-Strömungsmessung |
DE102018124401A1 (de) | Verfahren zum Aufnehmen eines Bildes mit einem Teilchenmikroskop |
DE102006004006B3 (de) | Verfahren und Vorrichtung zur Erzeugung eines hoch aufgelösten Bildes für faseroptische Systeme |
DE112014006356B4 (de) | Verfahren zur Verbesserung der Bildqualität eines Ladungsträgerteilchen-Rastermikroskops und Ladungsträgerteilchen-Rastermikroskop |
DE10307331B4 (de) | Bildgebendes Verfahren zur rechnergestützten Auswertung computer-tomographischer Messungen durch direkte iterative Rekonstruktion |
DE102020113454A1 (de) | Mikroskop und Verfahren zum Erzeugen eines aus mehreren mikroskopischen Einzelbildern zusammengesetzten Bildes |
DE102020118500A1 (de) | Mikroskop und Verfahren zum Generieren eines aus mehreren mikroskopischen Teilbildern zusammengesetzten Bildes |
DE102009037397B3 (de) | Dreidimensionale Abbildung einer Probenstruktur |
DE102022101527A1 (de) | Messvorrichtung und Messverfahren zum Überprüfen eines Messbildzustandes |
DE69911780T2 (de) | Verfahren zur Modellierung von Gegenständen oder dreidimensionalen Szenen |
EP3712670A1 (de) | Verfahren zur hochauflösenden Scanning-Mikroskopie |
DE102022128079A1 (de) | Verfahren und Vorrichtung zur Lichtfeldmikroskopie |
Legal Events
Code | Title | Description |
---|---|---|
PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
17P | Request for examination filed | Effective date: 20080912 |
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): DE ES FR GB |
17Q | First examination report despatched | Effective date: 20090202 |
RBV | Designated contracting states (corrected) | Designated state(s): DE ES FR GB |
RIN1 | Information on inventor provided before grant (corrected) | Inventor names: SPINNLER, KLAUS; DAUM, VOLKER; ELTER, MATTHIAS; COURONNE, ROBERT; WINTER, CHRISTIAN; RUPP, STEPHAN |
RIN1 | Information on inventor provided before grant (corrected) | Inventor names: COURONNE, ROBERT; DAUM, VOLKER; WINTER, CHRISTIAN; ELTER, MATTHIAS; SPINNLER, KLAUS; RUPP, STEPHAN |
STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
18D | Application deemed to be withdrawn | Effective date: 20110524 |