WO2015157543A2 - Four-dimensional (4D) combustion monitoring using tomographic and endoscopic techniques - Google Patents


Info

Publication number
WO2015157543A2
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate
coordinates
fiber based
projected
fbe
Prior art date
Application number
PCT/US2015/025160
Other languages
English (en)
Other versions
WO2015157543A3 (fr)
Inventor
Lin Ma
Original Assignee
Virginia Tech Intellectual Properties, Inc.
Priority date
Filing date
Publication date
Application filed by Virginia Tech Intellectual Properties, Inc. filed Critical Virginia Tech Intellectual Properties, Inc.
Publication of WO2015157543A2 publication Critical patent/WO2015157543A2/fr
Publication of WO2015157543A3 publication Critical patent/WO2015157543A3/fr


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 - Operational features of endoscopes
    • A61B 1/00057 - Operational features of endoscopes provided with means for testing or calibration
    • A61B 1/00163 - Optical arrangements
    • A61B 1/00194 - Optical arrangements adapted for three-dimensional imaging
    • A61B 1/00165 - Optical arrangements with light-conductive means, e.g. fibre optics
    • A61B 1/00167 - Details of optical fibre bundles, e.g. shape or fibre distribution
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 - Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 - Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B 23/2407 - Optical details
    • G02B 23/2415 - Stereoscopic endoscopes

Definitions

  • Non-intrusive optical diagnostics have been demonstrated as powerful tools for the study of flow and combustion processes.
  • obtaining images of fluids, and especially images of combustion phenomena such as flames, has been a major challenge in the implementation of many optical diagnostic techniques in practical systems.
  • the challenge for optical access is further exacerbated when multidimensional measurements are desired, because multidimensional measurements typically require measurements to be taken at multiple locations or view angles.
  • the present invention concerns the application of fiber-based endoscopes (FBEs) in combustion or flow measurements, especially for multidimensional and quantitative measurements.
  • FBEs fiber-based endoscopes
  • the use of FBEs offers several unique advantages that greatly reduce the implementation difficulty and cost of optical diagnostics.
  • the present invention provides improved methods of registering the locations and orientations of the FBEs for quantitative measurements to prevent the degradation of the spatial resolution of the images transmitted.
  • the present invention provides a view registration process that is accurate within ±0.5°, and the FBEs can resolve spatial features on the order of 0.25 mm.
  • the present invention obtains instantaneous measurements of flame structures at kHz rate using an FBE with the capability of resolving flame features on the order of 0.2-0.3 mm in three-dimensions.
  • Figure 1 illustrates a view registration method of one embodiment of the present invention.
  • Figure 2 is a schematic of a setup for validating the view registration method of one embodiment of the present invention.
  • Figure 3a is an example of calibration results obtained with a calibration plate or object of one embodiment of the present invention.
  • Figure 3b is a close view of the re-projection error in the boxed region shown in Figure 3a.
  • Figures 4a and 4b show the accuracy of the calibration results in terms of Δθ.
  • Figures 5a and 5b show the accuracy of the calibration results in terms of Δφ.
  • Figure 6 shows an exemplary FBE with four inputs and one output of one embodiment of the present invention.
  • Figure 7a provides an image of a test chart obtained with a camera.
  • Figure 7b provides an image of a test chart obtained with an exemplary FBE used in accordance with the present invention.
  • Figure 8 is an image of the test chart along the x direction obtained directly with a camera.
  • Figure 9 is a comparison of the image along the x direction obtained without and with the use of an FBE.
  • Figure 10 is a scatter plot of the re-projection error under various conditions.
  • Figures 11a and 11b are histograms of the re-projection error in the x and y directions.
  • Figure 12a is a schematic of a setup with nine FBEs to obtain instantaneous 3D flame measurements of one embodiment of the present invention.
  • Figure 12b is an illustration of a holder for an FBE and a coordinate system using tomography analysis.
  • Figure 13 depicts nine simultaneous projections of a premixed flame obtained using nine FBEs and three cameras.
  • Figure 14a shows a 3D reconstruction of the flame and Figure 14b shows 2D cross-sectional views of the flame at three y locations.
  • Figure 15a is a cross-sectional view of the flame obtained from the 3D reconstruction.
  • Figure 15b shows flame contours determined from Figure 15a overlaid on top of line-of-sight chemiluminescence measurements.
  • Figure 16a is a 3D reconstruction of a flame with methane as the fuel and Figure 16b provides 2D cross-sectional views of the flame at three y locations.
  • the present invention provides one or more FBEs to circumvent the requirement for direct line-of-sight viewing and to facilitate the capture of multiple projection measurements on the same camera or on one or more cameras. This reduces the implementation difficulty and cost of multidimensional measurements, especially in practical systems.
  • the use of FBEs requires registering the locations and orientations of the FBEs for quantitative measurements. Otherwise, the spatial resolution of the images degrades after the images are transmitted through the FBEs. Under proper conditions, in a preferred embodiment, the view registration process can be accurate within ±0.5° and the FBEs can resolve spatial features on the order of 0.25 mm.
  • the present invention is able to make instantaneous 3D measurements of flame structures at kHz rates, with the ability to resolve sub-millimeter flame features.
  • an open source MATLAB tool was developed for the calibration of cameras for the view registration of multiple FBEs.
  • View registration is the process of determining the location and orientation parameters of FBEs in order to transform from a world coordinate system to an image coordinate system.
  • the view registration process is accomplished by evaluating three sets of coordinates: the 3D world coordinates, the 3D coordinates of the FBEs, and the 2D image coordinates.
  • the 3D world coordinates are first converted to 3D FBE coordinates by translation and rotation. Then, the 3D FBE coordinates are projected to the 2D image coordinates. Once the 2D coordinates of the image are obtained, they can be related back to the 3D world coordinates.
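As a minimal sketch of this two-step transformation, the conversion from world to FBE coordinates is a rotation plus a translation, followed here by an ideal pinhole projection onto the image plane. The rotation, translation, focal lengths, and principal point below are hypothetical values for illustration only:

```python
import math

def world_to_fbe(point_w, R, T):
    """Rigid transform of a 3D world point into an FBE frame:
    p_FBE = R * p_W + T, with R a 3x3 rotation and T a 3-vector."""
    return [sum(R[i][j] * point_w[j] for j in range(3)) + T[i]
            for i in range(3)]

def fbe_to_image(p_fbe, f, c):
    """Ideal pinhole projection (no skew or distortion) of an FBE-frame
    point onto the image plane: x_I = f_x * (x/z) + c_x, similarly for y."""
    x, y, z = p_fbe
    return [f[0] * x / z + c[0], f[1] * y / z + c[1]]

# Hypothetical setup: FBE rotated 10 degrees about Z, 100 mm from the origin.
th = math.radians(10.0)
R = [[math.cos(th), -math.sin(th), 0.0],
     [math.sin(th),  math.cos(th), 0.0],
     [0.0,           0.0,          1.0]]
T = [0.0, 0.0, 100.0]
M = [5.0, 2.0, 0.0]                      # point M in O-XYZ (mm)

p_fbe = world_to_fbe(M, R, T)            # 3D FBE coordinates of M
m_img = fbe_to_image(p_fbe, f=[800.0, 800.0], c=[688.0, 520.0])  # M' in pixels
```

Relating the resulting pixel coordinates of many such points back to their known world coordinates is what the registration analysis inverts.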
  • Figure 1 illustrates these concepts by considering the transformation of an arbitrary point M in the measurement domain through an FBE onto a camera.
  • a 3D world coordinate system O-XYZ 110 is established as shown.
  • the coordinates of M in this system may be denoted as (x, y, z).
  • a coordinate system 112 is also established on one or more FBEs
  • the system on the first FBE 120 is denoted as O¹_FBE - X¹_FBE Y¹_FBE Z¹_FBE.
  • the signal emitted at point M propagates through the FBE and is imaged on the camera at point M' as shown.
  • a third 2D image coordinate system, O_I - X_I Y_I, is established on the image chip as shown.
  • the coordinates of M' in the image coordinate system may be denoted as (x_I, y_I).
  • while O-XYZ is fixed, (x_I, y_I) depends on the locations, orientations, and optical parameters of the FBEs and cameras, i.e., the relative relationship between the three coordinate systems. Therefore, the goal of the view registration analysis is to determine these relative relationships by measuring the projections (e.g., M') of a set of points with known coordinates in the O-XYZ system (e.g., M) in the measurement domain.
  • extrinsic parameters define the location and orientation of the FBEs in the 3D world coordinates (i.e., O- XYZ). When multiple FBEs are used, these extrinsic parameters, once obtained, can also be used to directly calculate the relative locations and orientations between any two FBEs.
  • the transformation from the 3D FBE coordinates to the 2D image coordinates is dependent on the properties of the cameras, and thus the parameters involved in this step are categorized as intrinsic parameters.
  • the intrinsic parameters include the focal length vector (f), principal point vector (c), skew coefficient (α), and distortion vector (k) of the lens system used on the camera. Both the extrinsic and intrinsic parameters must be considered in order to quantify the transformation from the world coordinate system to the 2D image coordinates.
  • the view registration method analyzes the projection of M onto the 2D image as shown in Figure 1.
  • the transformation from the 3D FBE coordinate system to the 2D image coordinate system is illustrated by considering the projection of point M through an FBE, for example the first FBE; the coordinates of M in O¹_FBE - X¹_FBE Y¹_FBE Z¹_FBE are denoted (x¹_FBE, y¹_FBE, z¹_FBE).
  • the projection of point M onto the 2D image plane depends on the intrinsic parameters (i.e., f, c, α, and k).
  • f is the 2×1 vector storing the focal lengths in pixels
  • c is the 2×1 vector storing the principal point coordinates
  • α is the skew coefficient describing the angle between the x and y pixel axes
  • k is the 5×1 vector storing the image distortion coefficients for both radial and tangential distortions. Decentering or imperfect centering of the lens components and other manufacturing defects in a compound lens are common causes of tangential distortion.
  • the final pixel coordinates (x_I, y_I) of the projection of M on the 2D image plane are related to the distorted normalized coordinates p_d by: x_I = f(1)·(p_d(1) + α·p_d(2)) + c(1) and y_I = f(2)·p_d(2) + c(2).
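The intrinsic mapping can be sketched as follows, assuming the radial-plus-tangential (Brown) distortion model that standard camera-calibration toolboxes use; the distortion coefficients, focal lengths, and principal point below are illustrative, not values from the patent:

```python
def distort(pn, k):
    """Apply radial + tangential (Brown) distortion to normalized
    coordinates pn, using the 5-element coefficient vector k."""
    x, y = pn
    r2 = x * x + y * y
    radial = 1.0 + k[0] * r2 + k[1] * r2 ** 2 + k[4] * r2 ** 3
    dx = 2.0 * k[2] * x * y + k[3] * (r2 + 2.0 * x * x)   # tangential terms
    dy = k[2] * (r2 + 2.0 * y * y) + 2.0 * k[3] * x * y
    return [radial * x + dx, radial * y + dy]

def to_pixels(pd, f, c, alpha):
    """Final pixel coordinates: x_I = f(1)*(pd(1) + alpha*pd(2)) + c(1),
    y_I = f(2)*pd(2) + c(2); the skew alpha couples the pixel axes."""
    return [f[0] * (pd[0] + alpha * pd[1]) + c[0],
            f[1] * pd[1] + c[1]]

# A point near the optical axis, with small hypothetical distortion.
pd = distort([0.01, 0.02], k=[-0.2, 0.05, 0.001, 0.001, 0.0])
xy = to_pixels(pd, f=[800.0, 800.0], c=[688.0, 520.0], alpha=0.0)
```

With all distortion coefficients set to zero, `distort` returns its input unchanged, recovering the ideal pinhole case.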
  • equations (1) through (4) complete the transformation from the 3D FBE coordinate system to the 2D image coordinate system by relating (x¹_FBE, y¹_FBE, z¹_FBE) and (x_I, y_I).
  • this transformation depends on the intrinsic parameters, i.e., f, c, α, and k.
  • the rotation matrix and translation vector for the first FBE may be denoted as R₁ and T₁, and then the coordinates of point M (which are (x, y, z) in the O-XYZ system) are related to its coordinates in the O¹_FBE system by (x¹_FBE, y¹_FBE, z¹_FBE)ᵀ = R₁·(x, y, z)ᵀ + T₁.
  • the projection of a point (e.g., M) from the world coordinates to the 2D image coordinates depends on both the intrinsic and extrinsic parameters. Therefore, a goal of the view registration analysis is to determine these parameters by measuring the projections (e.g., M') of a set of points with known coordinates in the O-XYZ system (e.g., M) in the measurement domain.
  • a calibration object which may be a calibration plate is fabricated with known patterns (a black and white chess board pattern for example) and installed with a known position and orientation in the measurement domain.
  • the calibration object is used to establish the coordinates of each point on the plate in the O-XYZ system as shown in Figure 1.
  • a calibration plate as described above may be used.
  • other calibration objects that provide reference points may also be used.
  • For measuring a fluid flow or a combustion such as a flame, the calibration object may be designed to match, mimic, or be based on the general shape of the fluid and/or combustion to be imaged.
  • the calibration object may be circular in nature so as to mimic the configuration of the stream, with points of reference located around the circumference.
  • One example would be a calibration object that is cylindrical in nature, although other configurations may be used as well. In general, it is desirable to configure the calibration object so that it provides a reference point for each pixel.
  • the present invention obtains images of the plate for each FBE to be registered using the configuration shown in Figure 2.
  • the coordinates of each point on calibration plate 200 were transformed into the image coordinate systems following Eqs. (1) through (5) by assuming a set of the intrinsic and extrinsic parameters.
  • the projected images obtained from such calculation were then compared against the measured images to determine the intrinsic and extrinsic parameters iteratively.
  • the pattern on the calibration plate such as the locations of the intersection points between the white and black squares, facilitates the comparison between the calculated and measured images. If the calculated and measured images agree within a preset error limit, then the iteration may be terminated and the view registration program outputs the assumed parameters as the results. Otherwise, an improved estimation of the parameters may be obtained based on an estimate, the degree of disagreement between the calculated and measured images, and the Jacobians of the equations involved in the transformation. The improved estimation is then used in the subsequent iteration.
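The iterative loop just described, in which projections computed from assumed parameters are compared against measured images and the parameters are updated from the disagreement and the Jacobian, can be sketched on a deliberately tiny problem with a single unknown (a scalar focal length). The full method fits many intrinsic and extrinsic parameters simultaneously; the numbers below are hypothetical:

```python
def project(f, xs):
    """Project normalized x coordinates to pixels with focal length f."""
    return [f * x for x in xs]

def refine_focal(f0, xs, measured, iters=50, tol=1e-9):
    """Gauss-Newton style refinement: compute projections from the assumed
    focal length, compare against measured pixel positions, and update
    using the Jacobian (here d(residual_i)/df = xs[i])."""
    f = f0
    for _ in range(iters):
        proj = project(f, xs)
        residual = [proj[i] - measured[i] for i in range(len(xs))]
        if max(abs(r) for r in residual) < tol:   # preset error limit met
            break
        num = sum(xs[i] * residual[i] for i in range(len(xs)))
        den = sum(x * x for x in xs)
        f -= num / den                            # Gauss-Newton step
    return f

# "Measured" points generated by a hypothetical true focal length of 805.
f_est = refine_focal(790.0, [0.1, 0.2, 0.3], [80.5, 161.0, 241.5])
```

Because this toy residual is linear in the unknown, the loop converges in a single step; the real multi-parameter fit iterates until the preset error limit is met.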
  • FIG. 2 shows a setup schematically for performing image acquisition, which is denoted as step 100, view registration, which is denoted as step 102 and calibration of results and validation, which is denoted as step 104.
  • a calibration plate 200 with a chess-board pattern (evenly spaced black and white squares of size 6 mm × 6 mm) may be used as the registration target, in the place of the actual flame or flow to be measured.
  • the plate was fixed on a rotational stage, so that the orientation of the plate relative to the 3D world coordinate system (O-XYZ) can be adjusted in a controlled way. This orientation was quantified by φ, the angle formed between the calibration plate and the OXY plane.
  • the angle φ can be adjusted and measured by a digital level from -15° to +15° with an accuracy of 0.1 degree.
  • Both the calibration plate and the rotational stage were fixed on an optical bench.
  • Two FBEs 210 and 220 and their corresponding lenses 212 and 222 were mounted on a solid aluminum optical breadboard, which was mounted perpendicular to the same optical bench as the calibration plate.
  • one of the FBEs and its corresponding lens were mounted on a rotational and translational stage 230, so that the position and orientation of this FBE relative to the calibration plate and the other FBE can be adjusted in a controlled manner.
  • the relative angle between the two FBEs was quantified by Δθ as shown; experimentally, Δθ was determined by the readings on the rotation mount, which had an accuracy of 0.05 degrees.
  • Each of the FBEs consisted of an array of 350×350 individual single mode fibers, resulting in a total of 122,500 image elements per FBE. Each individual fiber, or image element, has a 17 μm core diameter, and the overall length of the FBE is 1.35 m.
  • a right-angle prism mirror was used to combine the output from both FBEs onto the same CCD camera 250 (SensiCam with 1376×1040 pixels and a pixel size of 6.45 μm).
  • the prism mirror had reflective coatings on both of its surfaces, and provided a clear aperture extending across its 90° angle between the coated surfaces.
  • a set of lenses may be used in front of the input of the FBEs and also on the CCD camera, and these lenses were designed in such a way that the signal from each individual fiber in the FBE was approximately collected by one pixel on the CCD camera, resulting in an approximate one to one correspondence between the image element of the FBE and the pixel of the CCD.
  • the combined images from both FBEs were then acquired by the CCD camera 250 at various angles of Δθ and Δφ. These images were then analyzed by the view registration algorithm described above to output the location and orientation of the FBEs. Two of the outputs from the view registration analysis were Δθ and Δφ, which can then be compared to their experimentally measured values as discussed above to validate the view registration algorithm.
  • Figures 3a and 3b show a set of example calibration results for projecting the 3D world coordinates in the object domain (in this case, the calibration target) through an FBE onto a camera.
  • the image shown was the image acquired by the camera.
  • the crosses in the circles represent the detected intersection points of grey and black squares from the image.
  • the pattern of the grey and black squares was known a priori and was used as input for the calibration method.
  • the circles represent the re-projection of the same intersection points determined from the computed calibration parameters (output).
  • the difference between a cross and its circle represents the re-projection error.
  • the re-projection errors are typically less than two pixels, as shown in Figure 3b, which zooms in on the boxed region in Figure 3a.
  • the re-projection error was 0.6 pixels in both the x and y direction.
  • Figure 3a does not faithfully visualize the re-projection error because of its scaling and resolution.
  • the re-projection error depends on errors in the view registration process and also other factors such as the degradation of the spatial resolution.
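A small sketch of how the re-projection error could be quantified from detected and re-projected corner locations; the coordinates below are hypothetical, chosen only to mimic the sub-pixel offsets reported above:

```python
def reprojection_error(detected, reprojected):
    """Per-corner pixel offsets between detected intersection points and
    the points re-projected from the calibrated parameters, plus the RMS
    error over all corners."""
    diffs = [(rx - dx, ry - dy)
             for (dx, dy), (rx, ry) in zip(detected, reprojected)]
    rms = (sum(ex * ex + ey * ey for ex, ey in diffs) / len(diffs)) ** 0.5
    return diffs, rms

# Hypothetical corner locations with roughly 0.6-pixel offsets in x and y.
detected = [(100.0, 200.0), (150.0, 200.0), (100.0, 250.0)]
reprojected = [(100.6, 200.6), (150.6, 199.4), (99.4, 250.6)]
diffs, rms = reprojection_error(detected, reprojected)
```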
  • Figures 4a and 4b and Figures 5a and 5b compare the angles measured experimentally versus those computed by a view registration program.
  • Figures 4a and 4b show the computed Δθ values when the relative orientations of the FBEs were adjusted.
  • the straight dashed line is a 45° line showing where the computed values would ideally match with the measured values.
  • Figure 4b also shows the residual of these results, defined as the difference between the measured value and the ideal match as shown in Figure 4a.
  • the residual is within ⁇ 0.5°.
  • Figures 5a and 5b show another set of validation results in terms of Δφ while Δθ is fixed at 20.13°. In this case, the angle φ was adjusted and its value measured. Figures 5a and 5b compare the measured values with those computed from the view registration program.
  • Spatial resolution defines how sharply features in an image can be resolved and is a key aspect of imaging measurements in flows and flames.
  • Optical fiber bundles are known to degrade and distort images.
  • sub-millimeter spatial resolution is possible with FBEs.
  • distortion can be accounted for with an accuracy of ±2 pixels in the operation range.
  • FBE 600 has four input bundles 602- 605 as shown and all four input bundles were combined into one output head 610.
  • the purpose of FBE 600 is to allow the registration of four images from the four input bundles on one camera chip without the need of additional optics (e.g., the prism mirror used in Figure 2).
  • One such measurement is shown in Figure 7b.
  • FBE 600 has an overall length of 2 m and a diameter of 2 cm.
  • FBE 600 also has an array of 470x470 single mode optical fibers in each input bundle, resulting in a total of 883,600 (470x470x4) imaging elements from the output end.
  • the view registration of FBE 600 was studied using the setup shown in Figure 2 (with two input bundles at a time), and results with the same accuracy were obtained as those described already.
  • Figures 7a and 7b show a set of images of the test chart with and without the use of an FBE.
  • Figure 7a shows the image acquired by the camera directly without the use of an FBE.
  • Figure 7b shows the images of the test chart captured using FBE 600.
  • each view from each FBE on the camera occupied a region of about 400x400 pixels because the imaging optics were designed such that the signal from each individual fiber was projected approximately to one pixel.
  • the boxed region on the test chart shown in Figure 7a (i.e., Group 1, Element 1 on the test chart) was examined.
  • the pattern in this region consists of vertical black bars with a width of 0.25 mm separated from each other by 0.25 mm.
  • Figure 8 shows the signal in Figure 7a, again captured directly with the camera, along the x direction, illustrating that the 0.25 mm feature can be sharply resolved.
  • Figure 9 compares the measurements shown in Figure 8 (captured directly with the camera) with the measurements transmitted through an FBE to the camera.
  • the data without FBE in Figure 9 are the same as those shown in Figure 8, and the data with FBE were taken from view 1 shown in Figure 7b by reading the image of Group 1, Element 1 pixel by pixel.
  • FBEs (together with the same imaging optics used here) may be used to resolve flame features on the order of 0.2-0.3 mm.
  • Re-projection error is defined as the difference between the image recreated using the parameters determined through the view registration process and the original image. Such quantification is facilitated by using test targets with known patterns like the chess board pattern shown in Figure 3 or the test chart shown in Figures 7a and 7b. Because the captured images are used as inputs in the view registration process and the spatial resolution of the captured images is already degraded, the re-projection error represents a combined and overall error.
  • Figure 10 summarizes the re-projection errors. Different symbols represent tests performed under different conditions. The re-projection error was determined by calculating the difference of a feature (such as the distance between the edges of a black bar) between its known value and the value obtained from the re-projected image.
  • the re-projection error was within ⁇ 1.25 pixels in the x direction and within ⁇ 2.0 pixels in the y direction.
  • the rectangular box in the center of Figure 10 illustrates the regime where the error in both directions is within ±0.5 pixel (i.e., 1 pixel of absolute error), which encompassed the dominant majority of the data points.
  • Figures 11a and 11b show the histograms of the distribution of the re-projection error obtained from Figure 10, and confirm that the dominant majority (~94%) of the errors occur within ±0.5 pixel in both the x and y directions.
  • each pixel corresponds to about 0.05 mm of physical size. Therefore an absolute re -projection error of 1 pixel results in an uncertainty of 0.05 mm.
  • FIG. 1 A block diagram illustrating an exemplary computing environment in accordance with the present invention.
  • Figure 12a shows a schematic of an embodiment of the present invention configured to image a flow 1200, which for illustrative purposes is a flame. However, it should be understood that other fluid flows may be imaged using the teachings of the present invention as well.
  • Figure 12b shows the details of a holder designed to align the FBEs and illustrates the coordinate systems.
  • Because the target fluid, a flow of flame 1200 in this example, is not stable, the projection measurements need to be made from various orientations simultaneously and instantaneously.
  • the present invention provides a system having 9 FBEs 1211-1219 that are spaced around flow 1200. In other embodiments, more FBEs and cameras may be used and spaced around flame 1200, or fewer may be used as well. In a preferred embodiment, the FBEs are equally spaced around the entire circumference of a fluid stream to be imaged.
  • the FBEs are connected to one or more CMOS cameras 1223-1225 (Photron SA4) as shown in Figure 12a to accomplish projection measurements.
  • the projections measured by FBE 1211 through 1214 and 1215 through 1218 were captured on the first and second CMOS cameras 1223 and 1224, respectively; and that measured by FBE 1219 was captured by the third CMOS camera 1225.
  • Figure 13 shows a set of example projections taken by FBEs 1211-1219 and three cameras 1223-1225. These measurements were taken on a premixed propane-oxygen flame, stabilized by a 2 mm rod on a 5 mm diameter jet, with an exposure time of 0.5 ms and a frame rate of 2,000 frames per second.
  • the fuel flow rate was 1.1 SLPM (standard liter per minute), and oxygen flow rate was 2.44 SLPM, resulting in a Reynolds number of 1,190 based on the jet diameter.
  • the 3D world coordinate system (O-XYZ) was defined in such a way that the X-axis was perpendicular to the rod, the Y-axis was along the rod, the Z-axis was along the direction of the flow, and the origin O was fixed at the center of the jet exit. All the cameras were synchronized by a network router 1300, and the control and data acquisition were centralized by a computer 1310.
  • Figure 12b shows a holder 1350 designed to align the FBEs and also further illustrates the coordinate systems.
  • the holder includes an optical rail that was used to align a lens with an FBE.
  • the orientation and location of the lens and FBE combination can be defined by a distance (r) and two angles (θ and φ) as shown in Figure 12a and Figure 12b, where r specifies the distance between O and the center of the lens, θ specifies the angle formed by the optical axis of the lens-FBE combination relative to the Y axis, and φ specifies the angle formed by the optical axis of the lens-FBE combination relative to the Z axis.
  • the optical axis of the lens-FBE combination does not have to be aligned to pass through O as shown in Figure 12b.
  • Figure 12b shows it this way only to avoid cluttering the plot.
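The (r, θ, φ) placement of a lens-FBE combination can be converted to a Cartesian position in O-XYZ. The convention below (θ measured to the Y axis, φ to the Z axis, with the sign of the x component choosing one of the two mirror placements) is an assumption for illustration, not necessarily the one used in the patent:

```python
import math

def fbe_position(r, theta_deg, phi_deg):
    """Hypothetical placement of a lens-FBE centre at distance r from O,
    where the unit direction u satisfies u.y = cos(theta) and
    u.z = cos(phi), and u.x follows from |u| = 1 (positive root chosen)."""
    ct = math.cos(math.radians(theta_deg))
    cp = math.cos(math.radians(phi_deg))
    ux2 = 1.0 - ct * ct - cp * cp
    if ux2 < 0.0:
        raise ValueError("inconsistent angle pair")
    return [r * math.sqrt(ux2), r * ct, r * cp]

# An FBE placed 150 mm from the origin, 90 deg to the Y axis, 60 deg to Z.
pos = fbe_position(150.0, 90.0, 60.0)
```

A table like Table 1 could then be checked for consistency by verifying that each recovered position lies at distance r from the origin.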
  • the FBEs are bendable, therefore circumventing the need for direct line-of-sight as required in the use of cameras. They allow the capture of measurements from multiple locations, and the views may be captured by a single camera.
  • Advantages of this embodiment include reducing the requirement on optical access, physical space, and hardware cost compared to tomographic implementation using cameras directly.
  • the present invention is capable of obtaining images from arbitrary view angles (rather than view angles dictated by line-of-sight or physical space), which have been shown to be advantageous to enhance the fidelity of tomographic reconstructions.
  • each FBE input (i.e., r, θ, and φ)
  • Table 1 lists the values of r, θ, and φ for the 9 flame projections shown in Figure 13, determined from the view registration method.
  • the 3D imaging of flame topography is based on CH* chemiluminescence.
  • the 3D instantaneous concentration of CH* is denoted as F(x, y, z), and F is discretized into cubic voxels in the O-XYZ coordinate system.
  • the projection (P) of F obtained from an FBE is a 2D line-of-sight-integrated image of F on the camera. The relationship between P and F is shown in Eq. (6) below, in which the point spread function (PSF) weights the contribution of each voxel to each pixel: P(x_I, y_I) = Σ over voxels of PSF(x_I, y_I; x, y, z) · F(x, y, z) (6)
  • the PSF does not depend on the specific physical processes involved (e.g., chemiluminescence, emission, or droplet scattering).
  • the PSF depends on the image system and its position and orientation relative to O-XYZ. Therefore, equations (6) and (7) can be applied to other types of optical tomography measurements based on other signal generation mechanisms. Numerous algorithms have been proposed to solve Eq. (7).
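A minimal sketch of the discretized forward model in Eq. (6): each pixel of a projection P is a PSF-weighted sum over the voxels of F. The 4-voxel, 3-pixel geometry below is a toy example with binary weights:

```python
def forward_project(F, PSF):
    """Discretized line-of-sight projection P[p] = sum_v PSF[p][v] * F[v]:
    each pixel p integrates the voxels v it sees, weighted by the point
    spread function."""
    return [sum(PSF[p][v] * F[v] for v in range(len(F)))
            for p in range(len(PSF))]

# Toy domain: 4 voxels of CH* concentration seen by 3 pixels of one view.
F = [0.0, 1.0, 2.0, 0.5]
PSF = [[1.0, 1.0, 0.0, 0.0],   # pixel 0 integrates voxels 0 and 1
       [0.0, 1.0, 1.0, 0.0],   # pixel 1 integrates voxels 1 and 2
       [0.0, 0.0, 1.0, 1.0]]   # pixel 2 integrates voxels 2 and 3
P = forward_project(F, PSF)
```

Stacking this relation for every pixel of every FBE view gives the linear system that Eq. (7) inverts.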
  • An algorithm based on simulated annealing, code-named TISA (Tomographic Inversion by Simulated Annealing), may be used to solve Eq. (7).
  • the algorithm has been demonstrated to solve other types of inversion problems in combustion measurements.
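A toy sketch in the spirit of simulated-annealing inversion (not the actual TISA code): voxel values are perturbed one at a time, improvements are always accepted, and worse moves are accepted with a Boltzmann probability that shrinks as the temperature cools. All sizes, schedules, and step widths are illustrative:

```python
import math
import random

def anneal_invert(P_meas, PSF, n_vox, steps=20000, t0=0.1, seed=1):
    """Recover voxel values F from measured projections P_meas by
    simulated annealing on the squared projection mismatch."""
    rng = random.Random(seed)
    F = [0.0] * n_vox

    def cost(F):
        err = 0.0
        for p in range(len(PSF)):
            proj = sum(PSF[p][v] * F[v] for v in range(n_vox))
            err += (proj - P_meas[p]) ** 2
        return err

    c = cost(F)
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-12          # linear cooling
        v = rng.randrange(n_vox)
        old = F[v]
        F[v] = max(0.0, old + rng.uniform(-0.2, 0.2))  # keep non-negative
        c_new = cost(F)
        # accept improvements always; worse moves with Boltzmann probability
        if c_new <= c or rng.random() < math.exp((c - c_new) / t):
            c = c_new
        else:
            F[v] = old
    return F, c

# Toy problem: recover 4 voxel values from 3 line-of-sight sums.
PSF = [[1.0, 1.0, 0.0, 0.0],
       [0.0, 1.0, 1.0, 0.0],
       [0.0, 0.0, 1.0, 1.0]]
F_rec, c_final = anneal_invert([1.0, 3.0, 2.5], PSF, n_vox=4)
```

With only three measurements for four unknowns the toy problem is underdetermined, which mirrors why real reconstructions combine many simultaneous FBE views.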
  • Figure 14a shows the 3D iso-surface with the highest CH* concentration obtained from the tomographic reconstruction to visualize the flame structures.
  • Figure 14b shows 2D slices of the CH* concentration from the reconstruction at three different y locations. Note that the volume imaged by the FBEs had an approximate dimension of 9.5×9.5×9.5 mm³. However, the flame existed only in portions of this volume, as reflected by the black regions seen in Figure 13 where no chemiluminescence signals are present. These black regions were cropped off before the tomographic reconstruction to reduce the computation cost. After the cropping, the effective measurement volume had a dimension of approximately 9.5×5×6 mm³, as shown in Figure 14a.
  • the open-tip feature combined with the use of the rod created 4 distinct flame fronts, as seen from the view provided by the fourth FBE as shown in Figure 13.
  • the 3D reconstructions shown in Figure 14a captured all four flame fronts as labeled accurately, including the size and position of each flame front. Also, the thickness of each flame front was estimated to be 0.25 mm based on the axial view provided by the fourth FBE as shown in Figure 13, and all the flame fronts were resolved as shown in Figure 14b, consistent with the validation of the FBE's ability to resolve features on the order of 0.25 mm.
  • Figures 15a and 15b compare the flame fronts determined from the 3D reconstruction against those determined by the traditional line-of-sight chemiluminescence measurements.
  • Figure 15b overlays the contours of the flame front (shown as black line) determined from Figure 15a on top of the flame projection from the fourth FBE 1214 as shown in Figure 13. This comparison shows that the 3D reconstruction based on FBE measurements correctly captured the shape and size of the flame both qualitatively and quantitatively, demonstrating the feasibility of using FBEs for instantaneous and high-speed imaging measurements.
  • Figures 16a and 16b show another set of 3D measurements using the FBEs on a methane-oxygen premixed flame. Similar to the propane flame discussed above, this flame was also created by placing a rod (with a diameter of 3.125 mm) as the stabilizing bluff body at the exit of a jet (with a diameter of 8 mm). The flame shown had flow rates of 4.5 and 4.7 SLPM for methane and oxygen, respectively, resulting in a Reynolds number of 2,920. Due to the different fuel and burner configuration, the flame generated exhibited closed tips and an overall V-shape with two flame fronts. The overall dimension of the probed volume was 14×14×6 mm³.
  • Figure 16a shows the 3D iso-surfaces of the highest CH* concentration from the 3D reconstruction to visualize the flame, showing the overall V-shape of the flame and the two flame fronts.
  • Figure 16b shows three 2D slices taken out of the 3D reconstruction at three different y locations. The thickness of the flame fronts was estimated to be around 0.30 mm based on the line-of-sight chemiluminescence images, which was clearly resolved by the 3D reconstructions shown in Figure 16b.
  • the present invention provides a method of displaying a three-dimensional object in a four-dimensional presentation using one or more fiber based endoscopes (FBE), comprising: providing a calibration object having a plurality of reference points with known coordinates; for each FBE, imaging the plurality of reference points located on the calibration object to generate re-projected coordinates representing images of the reference points; comparing the coordinates of the reference points of the calibration object with the re-projected coordinates; and based on the comparison, adjusting the re-projected coordinates to create a calibration image to create an accurate four-dimensional presentation of the three-dimensional object.
  • the comparison may be performed pixel by pixel with an accuracy of better than 1 pixel for the calibration image.
  • one or more of the calibration images may be processed simultaneously, or all of the calibration images may be processed simultaneously. Simultaneous processing of the images may be performed using a rotation matrix and a translation vector for each FBE. The method may be performed using an algorithm based on Equations 1-5 as described above.
  • the numerical aperture of the FBE matches the numerical aperture of an imaging lens connected to the FBE to maximize the signal level and image quality.
  • the optical transmission of the FBEs may also be improved by using a length designed to minimize transmission losses.
  • the present invention provides a system for imaging combustion comprising: a plurality of fiber based endoscopes; a calibration object having a plurality of reference points; and computer software code stored on a computer readable medium and configured for execution by a processor to generate re-projected coordinates representing images of the reference points obtained by the fiber based endoscopes, compare the coordinates of the reference points of the calibration object with the re-projected coordinates, and, based on the comparison, adjust the re-projected coordinates of a combustion to create a calibration image of the combustion.
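The calibration loop described above (image known reference points, compare the re-projected coordinates against the known coordinates pixel by pixel, then adjust) can be sketched numerically. The patent's Equations 1-5 are not reproduced in this excerpt, so the pinhole projection model, function names, and numerical values below are illustrative assumptions rather than the patented algorithm:

```python
import numpy as np

def project_points(pts3d, R, t, f, c):
    """Pinhole projection: world coordinates -> camera frame -> pixel coordinates."""
    cam = pts3d @ R.T + t                    # rotate/translate into the FBE's camera frame
    return f * cam[:, :2] / cam[:, 2:3] + c  # perspective divide plus principal point

def reprojection_error(observed_uv, pts3d, R, t, f, c):
    """Per-point pixel distance between observed and re-projected coordinates."""
    return np.linalg.norm(observed_uv - project_points(pts3d, R, t, f, c), axis=1)

# Synthetic calibration object: a 3x3 grid of reference dots at known coordinates (mm).
grid = np.array([[x, y, 0.0] for x in (0, 5, 10) for y in (0, 5, 10)])
R_true = np.eye(3)                       # FBE viewing the target head-on (assumed pose)
t_true = np.array([0.0, 0.0, 50.0])      # 50 mm standoff (assumed)
f, c = 800.0, np.array([320.0, 240.0])   # focal length (px) and principal point (assumed)

observed = project_points(grid, R_true, t_true, f, c)

# With the correct rotation matrix and translation vector, the re-projection
# reproduces the observed reference points exactly (error well under 1 pixel).
err = reprojection_error(observed, grid, R_true, t_true, f, c)
print(err.max())  # 0.0

# A 0.1 mm error in the assumed FBE position already shifts the re-projection
# by more than 1 px, which the pixel-by-pixel comparison would flag for adjustment.
err_bad = reprojection_error(observed, grid, R_true,
                             t_true + np.array([0.1, 0.0, 0.0]), f, c)
print(err_bad.min())
```

In a full view-registration step, the rotation matrix and translation vector for each FBE would be iteratively adjusted (e.g., by least squares over all reference points) until the residual falls below the better-than-1-pixel accuracy the method targets.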

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • General Health & Medical Sciences (AREA)
  • Radiology & Medical Imaging (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Endoscopes (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Exemplary embodiments relate to systems and methods for imaging an object, a combustion, or a fluid. In a method for obtaining a three-dimensional image of an object using one or more fiber based endoscopes, a calibration object having a plurality of reference points with known coordinates is used. The reference points located on the calibration object are imaged to generate re-projected coordinates representing images of those reference points. The coordinates of the reference points are compared with the re-projected coordinates and then adjusted to create an accurate image of an object.
PCT/US2015/025160 2014-04-09 2015-04-09 Four-dimensional (4D) combustion monitoring using tomographic and endoscopic techniques WO2015157543A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461977156P 2014-04-09 2014-04-09
US61/977,156 2014-04-09

Publications (2)

Publication Number Publication Date
WO2015157543A2 true WO2015157543A2 (fr) 2015-10-15
WO2015157543A3 WO2015157543A3 (fr) 2016-01-14

Family

ID=54288544

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/025160 WO2015157543A2 (fr) 2014-04-09 2015-04-09 Four-dimensional (4D) combustion monitoring using tomographic and endoscopic techniques

Country Status (1)

Country Link
WO (1) WO2015157543A2 (fr)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6009189A (en) * 1996-08-16 1999-12-28 Schaack; David F. Apparatus and method for making accurate three-dimensional size measurements of inaccessible objects
US7889905B2 (en) * 2005-05-23 2011-02-15 The Penn State Research Foundation Fast 3D-2D image registration method with application to continuously guided endoscopy
WO2012156873A1 (fr) * 2011-05-18 2012-11-22 Koninklijke Philips Electronics N.V. Endoscopic segmentation correction for 3D-2D image overlay

Also Published As

Publication number Publication date
WO2015157543A3 (fr) 2016-01-14

Similar Documents

Publication Publication Date Title
Kang et al. Fiber-based endoscopes for 3D combustion measurements: view registration and spatial resolution
CN103649674B (zh) Measurement apparatus and information processing apparatus
JP4873485B2 (ja) Shape measurement method and shape measurement apparatus using multiple reference planes
WO2013136620A1 (fr) Method and device for phase-distribution analysis of a fringe image using high-dimensional luminance information, and associated program
JP2016128816A (ja) Estimation of surface attributes using a plenoptic camera
CN109767425B (zh) Machine vision light source uniformity evaluation apparatus and method
Su et al. Refractive three-dimensional reconstruction for underwater stereo digital image correlation
Huang High precision optical surface metrology using deflectometry
Wang et al. Towards self-calibrated lens metrology by differentiable refractive deflectometry
Traxler et al. Experimental comparison of optical inline 3D measurement and inspection systems
Rachakonda et al. Sources of errors in structured light 3D scanners
Sun et al. Lens distortion correction for improving measurement accuracy of digital image correlation
JP5645963B2 (ja) Method for determining a set of optical imaging functions for three-dimensional flow measurement
Hinz et al. Metal forming tool monitoring based on a 3d measuring endoscope using cad assisted registration
TWI553291B (zh) System for measuring transparent objects using fringe projection
WO2015157543A2 (fr) Four-dimensional (4D) combustion monitoring using tomographic and endoscopic techniques
TWI739851B (zh) Apparatus for measuring moiré patterns of optically inspected objects
Zhu et al. Three-dimensional measurement of fringe projection based on the camera response function of the polarization system
CA2955391A1 (fr) Method and apparatus for measuring optical systems and surfaces with optical radiation metrology
Sheng Precise measurement for line structure light vision sensor with large range
JP2012150018A (ja) Shape measurement method
MacKinnon et al. Lateral resolution challenges for triangulation-based three-dimensional imaging systems
JP2007315865A (ja) Three-dimensional displacement measuring instrument and measurement method
JP5843179B1 (ja) Inspection apparatus and wavefront aberration correction method
JP2020060480A (ja) Eccentricity measurement method

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 15776451

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 15776451

Country of ref document: EP

Kind code of ref document: A2