WO2002025209A1 - Lidar with streak-tube imaging system


Info

Publication number
WO2002025209A1
Authority
WO
WIPO (PCT)
Prior art keywords
objects
fan
lidar
craft
water
Prior art date
Application number
PCT/US2000/024098
Other languages
English (en)
Inventor
Stephen C. Lubard
John W. McLean
David N. Sitter, Jr.
J. Kent Bowker
Anthony D. Gleckler
Original Assignee
Arete Associates
Priority date
Filing date
Publication date
Application filed by Arete Associates filed Critical Arete Associates
Priority to AU2001229026A priority Critical patent/AU2001229026A1/en
Priority to PCT/US2000/024098 priority patent/WO2002025209A1/fr
Publication of WO2002025209A1 publication Critical patent/WO2002025209A1/fr


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4811 Constructional features, e.g. arrangements of optical elements common to transmitter and receiver

Definitions

  • This invention relates generally to optoelectronic systems for imaging objects from an elevated or slightly elevated observing instrument.
  • imaging systems include, but are not limited to, mast-mounted systems for obtaining warning of shallow hazards ahead of a water craft, aircraft-carrier landing aids, and refinements in airborne imaging platforms.
  • a related aspect of the invention provides intensity equalization across a fan-shaped probe beam, and has general industrial applications.
  • Shallow-angle marine observation systems. A particular difficulty of all marine observational systems, even visual systems, is the problem of interference by the water surface. Reflections at the surface, whether of ambient radiation or of probe beams, tend to be confused with signals or signatures of the hazards or other objects of interest. Another noteworthy problem with such systems is the limited range of known apparatus and methods. In the past, short range has been seen as essentially an inherent limitation of mast-mounted or other only-slightly-elevated equipment. It is known to use light detection and ranging ("LIDAR") for such purposes.
  • the Anderson paper describes use of grazing-incidence LIDAR for detection of shallow objects.
  • the group detected a target of diameter about 80 centimeters (2½ feet), to depths of nearly 5 meters (15 feet), at a range of 130 meters (400 feet) from a pier.
  • the experimental demonstration used a narrow-beam LIDAR and a photomultiplier-tube detector.
  • the laser L (Fig. 1) and receiver R were mounted in a hut-like enclosure E on a pier structure S in the ocean, at distance F of about 330 m (1100 feet) forward from the beach.
  • the LIDAR transceiver L-R was at a height H of about 13 m (40 feet) above the ocean surface O.
  • Ulich et al. use a streak tube for time-resolved fluorescence (wavelength vs. time), not imaging (angle vs. time).
  • their text particularly cites use of a streak tube in a nonimaging mode.
  • they use a laser blocking filter to specifically reject the in-band response.
  • Utilization of a slit by the Ulich group is for spectral dispersion, not imaging.
  • the '589 Ulich patent, "Imaging LIDAR System", makes one reference to a ship-based application, but does not develop the idea further. The system is described only with reference to gated, intensified cameras.
  • Airborne-hazard alert for water craft. LIDAR is also usable for obtaining information about airborne objects, whether threatening hostile objects or otherwise.
  • a separate system for such purposes is costly and occupies significant space in the command center of a water craft.
  • Aircraft-carrier operations. In addition to detection of floating and airborne obstacles (e.g. mines and other hazards), another marine-related problem that would benefit from visibility aids is that of aircraft-carrier landing. This problem is particularly acute at night, and in fog or other turbid-atmosphere conditions. The difficulty of such operations is compounded by the high speeds involved, and by the fact that not only the aircraft but also the carrier is in motion. A further complication sometimes is the need for a degree of discreet or covert character in the traffic. Radio guidance may be of limited practicality in such circumstances.
  • Airborne surveillance. Still another use of LIDAR systems that has been developed heretofore is airborne surveillance of objects submerged in the ocean or in other bodies of water.
  • U.S. Patent 5,467,122, commonly owned with the present document, sets forth many details of a surveillance system that is particularly aimed at monitoring relatively large areas of the ocean. In that system, imaging is typically limited to detection from altitudes of at least 160 m (500 feet), looking straight down into the water with the center of the probe beam. Still, there is some off-axis detection for positions well away from the track of the airborne platform.
  • Wave noise and distortion represent one of the severest limitations for airborne surveillance, even in the clearest ocean waters. These concerns have not been adequately addressed with existing airborne LIDAR systems. According to a comparative-evaluation field test in 1997, object-classification capability and the ability to reject false alarms in hazard detection have yet to be achieved to the satisfaction of the United States government. Both the shapes and the positions of submerged objects are distorted by uncorrelated refractions of different parts of the probe/return beam, due to irregularity of the water surface. Heretofore no effort has been directed to overcoming either the positional error or the relative vagueness of object shapes obtained with this technology.
  • Such systems resort to a statistical model, based on a single estimate of sea state, to estimate how many passes over the same patch of water are necessary to assure proper coverage.
  • This model is hard to validate — and the estimate of sea state may or may not be accurate or timely. Errors in the sea-state estimate force present systems to make either too many passes over the same area, which results in poor effective area-coverage rates, or too few passes, which may leave the area inadequately sampled and so unsafe for ship transit.
  • the propagation distance at each end of the fan is greater than that at the center by a factor equal to the secant of the fan half-angle.
  • the beam divergence over this greater propagation distance proportionately reduces the beam brightness at the water surface; the return reflection must also travel farther, additionally aggravating the brightness reduction at the detector.
  • the added travel distance also increases the return time at the fan-beam extremes. This delay, however, is wholly geometrical and therefore readily compensated in software; the accompanying brightness reduction cannot be resolved so easily.
  • the added return distance may produce either another factor of the secant, or instead another factor of the square of the secant.
  • the incident beam angles in all directions should be approximately maintained in the return light, and this implies that the same proportional decrease in energy should develop again.
  • the distribution of energy in the reflection process could instead be omnidirectional. If it were distributed equally in all directions (not usual, but only a limiting case that can help in understanding likely actual behavior), then the spatial distribution would follow a familiar inverse-square law, leading to attenuation of light from the fan-beam extremes by the square of the secant.
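  The combined effect of these factors can be summarized in a rough sketch. This expression is offered only as an illustration of the bookkeeping just described, not as a formula from the patent; the exact exponent depends on the reflection behavior of the particular scene. For a ray at fan half-angle φ, the returned energy relative to the fan center falls roughly as

$$\frac{E(\varphi)}{E(0)} \;\approx\; \cos^{2}\varphi \;\;\text{to}\;\; \cos^{3}\varphi ,$$

  that is, one inverse-secant factor for the longer outgoing path in the diverging fan, plus one or two more for the return, depending on whether the reflection preserves the incident geometry or is effectively omnidirectional.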
  • a laser beam generally varies in beam position, as well as energy distribution, from pulse to pulse.
  • the positional wandering takes the center of the beam off the center of the reassembled double-half-lens structure described above.
  • preferably not simply the geometrical center of the beam, but rather its effective center, in terms of maximum energy flux (or in terms of optimized energy flux over the entire fan-beam span), should be centered upon the reassembled lens structure.
  • alignment becomes a matter of attempting to place the drifting effective center — of a beam of inhomogeneous and varying energy distribution, and varying position too — at the centerline of the reassembled lens structure. This is challenging.
  • the present invention introduces such refinement, and thus importantly advances the art.
  • the invention has several facets, or aspects, that are capable of use independently. For greatest enjoyment of their benefits and advantages, however, all or several of these facets are best employed in combination together.
  • the invention is a system for detecting objects from an elevated position.
  • the system includes a LIDAR subsystem, mounted at such a position.
  • the LIDAR subsystem emits thin fan-beam light pulses at a shallow angle, and detects reflected portions of the fan-beam pulses at a like shallow angle.
  • the system also includes a streak-tube subsystem for imaging successive reflected fan-beam pulse portions.
  • the combination of streak tube with pulsed fan beam offers imaging capabilities and spatial resolution far in advance of all shallow-angle systems known heretofore.
  • the invention is preferably practiced in conjunction with additional features or characteristics.
  • the shallow angle is a vertical angle. The vertical angle may be defined either relative to the horizontal or relative to such a craft or its path.
  • the streak-tube subsystem preferably images successive reflected fan-beam pulse portions at corresponding successive positions on a display screen.
  • the streak-tube subsystem thereby forms on the screen a representation of such objects as a function of distance from the craft.
  • the system includes a mast or high bridge on such craft, for providing such elevated position for mounting of the LIDAR subsystem.
  • the system also includes such craft itself.
  • another preference is that the shallow angle approximate grazing incidence with a water surface near the craft.
  • the thin fan beam illuminates a swath on the order of sixty centimeters (two feet) wide, measured generally in the propagation direction along the water surface.
  • the shallow angle is in a range of approximately one to fifteen degrees. Still more preferably the shallow angle is in a range of approximately two to ten degrees, and ideally it is roughly five degrees.
  • preferably the thin fan beam is on the order of 2.5 centimeters (one inch) thick.
  • it is also preferable that the system further include some means for applying a compensation for reduced energy near lateral ends of the fan beam.
  • the compensation-applying means include a lenslet array or other spatially variable amplitude compensator for variations due to the fan-beam propagation distances, in conjunction with the inverse radial dependence of energy density in the diverging beam.
  • a lenslet array is preferable as it can render the fan angle substantially independent of input-beam position and size; and such an array also tends to homogenize the fan beam.
  • the invention is a system for detecting objects near a water craft.
  • the system includes a LIDAR subsystem, mounted to the water craft at an elevated position.
  • the LIDAR subsystem is for emitting thin fan-beam light pulses at a shallow angle, and for detecting reflected portions of the fan-beam pulses at a like shallow angle.
  • the system includes some means for imaging successive reflected fan-beam pulse portions. For purposes of breadth and generality in discussing this second aspect of the invention, these means will be called simply the "imaging means".
  • the imaging means include some means for imaging successive reflected fan-beam pulse portions at corresponding successive positions on a display screen.
  • the imaging means form on the screen a representation of such objects as a function of distance from the water craft. It is particularly preferable in this case, when the system is used with a craft that is in motion, that the imaging means further include means for scrolling the successive lines generally synchronously with such motion.
  • Other important preferences include those mentioned earlier for the first independent aspect of the invention, relating to inclusion of a mast or high bridge, a water craft, specific shallow-angle ranges, and specific angles and beam dimensions, etc.
  • the invention is a system for detecting objects near a water craft.
  • the system includes some means for emitting thin fan-beam light pulses at a shallow angle, and for detecting reflected portions of the fan-beam pulses at a like shallow angle. Again for breadth and generality, these means will be called simply the "emitting and detecting means".
  • the emitting and detecting means are mounted to the water craft at an elevated position.
  • the system also includes a streak-tube subsystem for imaging successive reflected fan-beam pulse portions.
  • the emitting and detecting means are not necessarily a LIDAR subsystem as such. Nevertheless the application of thin fan-beam pulses projected at a shallow angle — and also detected at a like angle — provides enhanced geometry with improved range and depth resolution in object detection for water craft.
  • although the third major aspect of the invention thus significantly advances the art, nevertheless to optimize enjoyment of its benefits the invention is preferably practiced in conjunction with certain additional features or characteristics.
  • certain preferences outlined for the first two facets of the invention are applicable here as well, including for instance use of a display screen to image successive reflected fan-beam pulse portions at corresponding successive positions on the screen, use of a mast or high bridge for the apparatus, and operation at grazing incidence with a water surface.
  • the invention is a system for detecting objects near a water craft.
  • the system includes a LIDAR subsystem, mounted to such a craft, for emitting thin fan-beam light pulses toward such objects and for detecting reflected portions of the fan-beam pulses.
  • the system also includes some means for imaging successive reflected fan-beam pulse portions. Again for generality these means will be called simply the "imaging means".
  • the imaging means perform this function in a way that tightly localizes reflection from a water surface near such objects. In this way the imaging means facilitate detection of such objects despite proximity to the water surface.
  • the foregoing may represent a description or definition of the fourth aspect or facet of the invention in its broadest or most general form. Even as couched in these broad terms, however, it can be seen that this facet of the invention importantly advances the art. In particular, whereas all the aspects of the invention have favorable performance in regard to the problem of surface-reflection interferences, this fourth facet of the invention is particularly addressed to that problem. By tightly localizing surface reflection, the fourth independent aspect of the invention enables discrimination between return signals due to that reflection and signatures of the objects of interest. Although the fourth major aspect of the invention thus significantly advances the art, nevertheless to optimize enjoyment of its benefits it is preferable to practice the invention with additional features or characteristics.
  • the imaging means also include some means for displaying successive reflected pulse-portion images at corresponding successive different portions of a display screen.
  • the imaging means image the surface reflection from water, near such objects, in a narrow range of closely adjacent portions of the screen.
  • the invention is a system for detecting objects submerged, or partially submerged, relative to a water surface.
  • This system includes a LIDAR subsystem for emitting thin fan-beam light pulses from above such water surface toward such objects and for detecting reflected portions of the fan-beam pulses.
  • the system includes some means for analyzing the reflected pulse portions to determine water-surface orientations.
  • these means operate to derive, from these water-surface orientations, submerged-object images corrected for refractive distortion.
  • preferably the invention is practiced with additional features or characteristics.
  • the analyzing means include some means for using precise range resolution of the reflected pulse portions to determine the water-surface orientations.
  • the invention can also be made greatly superior to imaging as taught by the previously mentioned '493 patent, which employs iterations at a different stage in the imaging process, namely in the finding of a surface map.
  • the present invention is much faster; sufficiently fast, in fact, that it can be operated in real time from a fixed-wing aircraft overflying an ocean region.
  • the analyzing means include some means for applying Snell's Law, in conjunction with the determined water-surface orientations, to develop corrections for refraction at such water surface. In this way the invention operates directly and straightforwardly on information in the reflected pulses to determine the desired details about submerged or partly submerged objects.
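  To make the role of Snell's Law concrete, the following is a minimal sketch of refracting a LIDAR ray through a locally estimated surface normal. It is an illustration only; the function name, interface, and the vector form of Snell's law used here are assumptions, not the patent's algorithm.

```python
import numpy as np

N_AIR, N_WATER = 1.000, 1.333

def refract(incident, normal, n1=N_AIR, n2=N_WATER):
    """Return the refracted unit ray, or None on total internal reflection."""
    i = incident / np.linalg.norm(incident)
    n = normal / np.linalg.norm(normal)
    cos_i = -np.dot(n, i)
    if cos_i < 0:                 # normal pointed the wrong way; flip it
        n, cos_i = -n, -cos_i
    r = n1 / n2
    k = 1.0 - r * r * (1.0 - cos_i * cos_i)
    if k < 0:
        return None               # total internal reflection (not expected air to water)
    return r * i + (r * cos_i - np.sqrt(k)) * n

# Example: a ray 5 degrees off vertical meets a surface facet tilted 8 degrees by a wave.
ray = np.array([np.sin(np.radians(5.0)), 0.0, -np.cos(np.radians(5.0))])
facet_normal = np.array([np.sin(np.radians(8.0)), 0.0, np.cos(np.radians(8.0))])
print(refract(ray, facet_normal))
```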
  • the LIDAR subsystem preferably includes a deflection device that sweeps a succession of the thin fan-beam light pulses across the objects and the water surface rapidly.
  • the deflection may operate starting generally at the horizon and sweeping downward to some shallow downward angle looking into the water, and then returning to the initial generally horizontal position to sweep again.
  • the pulsing and the sweep are preferably rapid enough to capture substantially all the water-surface orientations in a consistent common configuration. Ordinarily this calls for completing the entire sweep within a moderately small fraction of a second, such as for example roughly one five-hundredth to one one-hundredth of a second.
  • deflection is achieved by movement of an aircraft that carries the LIDAR subsystem bodily along, above the water surface.
  • the center of the fan beam preferably is directed vertically toward the water, and the narrow dimension of the beam spans a distance on the order of one meter to a few meters at the water surface.
  • the consistent common configuration of the water surface holds only over that distance.
  • the sweep in this case is not cyclical but rather continuous, moving progressively forward with the position of the aircraft over the water.
  • the rapidity of pulsing and the velocity of the aircraft are preferably selected to provide a generally continuous advance with nearly consistent common configuration of the water surface in adjacent or overlapping measurement swaths.
  • the data-analysis system then preferably sorts out the progressive movement of the surface itself from the progression of data in the LIDAR snapshots.
  • a larger fraction of the water surface is disposed for specular reflection of the LIDAR beam. In airborne surveillance the glint problem is therefore ordinarily more severe.
  • a sixth major independent aspect or facet of the invention is related to the fifth.
  • the sixth is a method of putting into operation a LIDAR system that corrects for refraction in LIDAR imaging through waves in a water surface.
  • the method includes the step of defining simulated images of submerged objects as seen through waves in a water surface with a LIDAR system. It also includes the step of preparing an algorithm for applying a three-dimensional image of the water surface in refractive correction of LIDAR imaging through waves.
  • the method also includes the step of modeling application of the algorithm to the simulated images, using an assumed or actual three-dimensional image of the surface.
  • This step is conducted in such a way as to determine requirements of range and pixel resolution for successful operation of the LIDAR system.
  • Another step, based upon the determined range and pixel resolution requirements, is preparing optics for the LIDAR system.
  • the sixth facet of the invention as thus generally defined has the advantage of eliminating iteration from the hardware-specification stage; the needed resolution is built into the hardware the first time. Nevertheless this facet of the invention is advantageously performed incorporating certain preferences.
  • the modeling is performed using a broad range of simulated images, in particular simulated images prepared using a broad variation of assumed water-surface and atmospheric conditions, as well as assumptions about the submerged objects.
  • Another preference is preparing a second algorithm for capturing a three-dimensional image of the water surface based on ranging data obtained with a LIDAR system over a generally horizontal grid of positions, and modeling application of the second algorithm to actual ranging data obtained with a LIDAR system.
  • This preference operates to verify adequate performance of the second algorithm as to the critically needed resolution in the ranging direction and in horizontal grid directions.
  • the invention is a system for detecting objects from an elevated position.
  • the system includes a scanning-spot LIDAR subsystem, mounted at such a position. This subsystem performs three functions: (1) emitting a series of narrow light pulses at a shallow angle and successively displaced in an arc, (2) repeating the emitting to form successive arcs, and (3) detecting reflected portions of the pulses at a like shallow angle.
  • the system includes a streak-tube subsystem for imaging reflected pulse patterns from the successive arcs. The streak-tube subsystem images the reflected pulse patterns as successive lines on a display screen.
  • although the seventh major aspect of the invention thus significantly advances the art, nevertheless to optimize enjoyment of its benefits it is preferable to practice the invention with additional features or characteristics.
  • the streak-tube subsystem preferably further comprises some means for scrolling the successive lines generally synchronously with such motion.
  • the invention is a system for detecting small exposed objects such as floating debris, at ranges on the order of tens of kilometers. This system includes a LIDAR subsystem.
  • the LIDAR subsystem is for performing two functions: (1) emitting nearly horizontal, thin fan-beam light pulses to illuminate such exposed objects at ranges on the order of tens of kilometers, and (2) detecting nearly horizontal reflected portions of the fan-beam pulses returned from such exposed objects.
  • the system also includes a streak-tube subsystem for imaging successive reflected fan-beam pulse portions.
  • the shallow-angle or "nearly horizontal" condition imposes extreme demands upon the time-resolution capability of any system that is called upon to discriminate between various objects, or between such objects and a surface return.
  • this facet of the invention exploits the extraordinary time-resolution capability of the streak tube to obtain hitherto unheard-of range performance, despite the nearly horizontal projection and recovery of the fan beam.
  • although the eighth major aspect of the invention thus significantly advances the art, nevertheless to optimize enjoyment of its benefits preferably the invention is practiced with additional features or characteristics.
  • the light pulses are made eye-safe.
  • the light pulses are in the near infrared.
  • One ideal wavelength is approximately 1.54 microns.
  • the invention is a landing-aid system for use in facilitating aircraft landings on an aircraft carrier.
  • the system includes a LIDAR subsystem, mounted to such a carrier, for performing two functions: (1) emitting light pulses to illuminate such aircraft, and (2) detecting reflected portions of the pulses returned from such aircraft.
  • This system also includes a streak-tube subsystem for imaging successive reflected pulse portions.
  • the invention is an integrated system for detecting objects submerged, or partially submerged, relative to a water surface near a water craft, and also for detecting airborne objects.
  • This system includes a LIDAR subsystem, mounted to the craft, for emitting thin fan-beam light pulses from above the water surface and for detecting reflected portions of the fan-beam pulses. The pulses are emitted both toward submerged, or partially submerged, objects and also toward exclusively airborne objects.
  • the integrated system also includes dual means for analyzing the reflected and detected portions of the pulses.
  • the dual means include, first, some means for analyzing returns of pulse portions emitted toward submerged, or partially submerged, objects. This analysis determines water-surface orientations, and therefrom derives submerged-object images corrected for refractive distortion.
  • the dual analyzing means include, second, some means for separately analyzing the reflected and detected portions of pulses emitted toward the exclusively airborne objects. This second part of the analysis derives airborne-object images.
  • the foregoing may represent a description or definition of the tenth aspect or facet of the invention in its broadest or most general form. Even as couched in these broad terms, however, it can be seen that this facet of the invention importantly advances the art.
  • this integrated system makes maximal use of the hardware, and most of the software, used in the screening for obstacles or weaponry in the water — to check also, concurrently, for weapons or other features that are airborne.
  • the integrated system achieves an extremely high cost efficiency, as well as space efficiency in the typically crowded quarters of a water-craft bridge or surveillance room.
  • the invention is preferably prac- ticed in conjunction with certain additional features or characteristics.
  • the first analyzing means also recognize and image low-altitude airborne objects.
  • Objects of this sort are within the narrow vertical range between the horizon and the water surface as seen from, for instance, a mast-mounted LIDAR subsystem.
  • Such recognition is advantageously based, e.g., upon discrimination of pulse returns that precede any surface-flash return for adjacent regions in the field of view.
  • the LIDAR subsystem includes means for sweeping a sequence of the light pulses across such submerged, or partially submerged, objects and such airborne objects in a substantially continuous succession.
  • by "sweeping across" it is intended to include sweeping the pulse sequence across the scene vertically, particularly since this is a favored configuration.
  • the sequence of pulses typically proceeds without interruption from scanning above the horizon to scanning below the horizon — or vice versa.
  • This uninterrupted sequencing is adopted even though the analytical stage, the analyzing means, must apply algorithms for the above-horizon returns that are different from those for the below-horizon returns.
  • the benefits of uninterrupted sequencing include full coverage, and relative simplicity of the apparatus. It might be reasoned that this preference pays a modest penalty in wasted optical energy and wasted pulse time, since there are some portions of the sweep that are below the horizon but too close to the horizon to return useful information about submerged objects. As suggested earlier, however, this zone (as well as the region where submerged objects are seen easily) may contain airborne objects of interest.
  • the invention is a LIDAR system for imaging objects.
  • the system includes a pulsed light source. It also includes an array of lenslets receiving light pulses from the source and forming from those pulses a pulsed fan-shaped beam of the light for projection toward such objects to be imaged.
  • the system additionally includes light detectors receiving portions of the pulsed fan-shaped beam reflected from such objects, and developing corresponding image signals in response to the reflected portions.
  • the system further includes some means for analyzing the signals to determine characteristics of such objects or to display successive images of such objects. For generality in this document these means will be called the "analyzing means".
  • the use of a lenslet array offers the beam-homogenizing benefits previously known only in somewhat remote fields such as lighting for photolithography. For present purposes this represents an incidental benefit.
  • although the eleventh major aspect of the invention thus very significantly advances the art, nevertheless to optimize enjoyment of its benefits the invention is preferably practiced in conjunction with certain additional features or characteristics.
  • the LIDAR system analyzing means include a streak tube for generating the images.
  • the array of lenslets, in forming the fan-shaped beam, modifies the angular distribution of light with respect to a long cross-sectional dimension of the fan shape; in this case ideally the array of lenslets increases the energy at lateral extremes of the fan shape.
  • a particularly beneficial application of this aspect of the invention is for a system in which the pulsed light source is a laser.
  • Other preferences include combination of this facet of the invention with others of the independent aspects or facets under discussion. Among these are the first ten facets discussed above, as well as shaping of the lenslet surfaces, and matching of refractive and diffractive properties, as introduced below.
  • the invention is a light-projection system.
  • the system includes a light source, and an array of lenslets receiving light from the source and forming therefrom a fan-shaped beam of the light for projection.
  • the lenslets have surfaces that modify the angular distribution of light with respect to a long cross-sectional dimension of the fan shape. More particularly the array of lenslets increases the energy at lateral extremes of the fan shape.
  • this facet of the invention importantly advances the art.
  • this aspect of the invention enables an extremely valuable equalization of the light-beam intensity at lateral extremes. This is especially useful in flattening the energy distribution along a laterally extended track such as the intersection of the beam with a generally planar surface, e.g. an ocean surface, the primary application for most of the other aspects of the invention.
  • This aspect of the invention is by no means limited to providing such equalization.
  • this facet of the invention is far more broadly applicable for a wide range of special situations, including, for example, either equalizing or deliberately introducing a variation in the intensity along an intersection of the beam with surfaces of other shapes.
  • Such shapes for instance may be spherical, cylindrical, ellipsoidal, planar but angled relative to the beam front, etc., or entirely arbitrary. Even this, however, is not the full extent of this facet of the invention. It can also be used to modify illumination patterns along a track that is not defined by a physically demarcated surface. For instance this aspect of the invention in principle can be used to uniformly illuminate a rectilinear path within a fluid medium, as for measurement of scattering, fluorescence, Raman excitation or the like, either by the medium or by particles suspended in it.
  • the invention is preferably practiced in conjunction with certain additional features or characteristics.
  • the light source is a laser.
  • the invention is a light-projection system that includes a light source. It also includes an array of lenslets receiving light from the source, and forming a fan-shaped beam of that light for projection.
  • the lenslets have refractive characteristics, as is commonplace with lenses generally.
  • the array of lenslets also has diffractive characteristics.
  • the refractive and diffractive characteristics are matched for performance at specified projection angles. Advantages of such matching, in relation to the earlier-presented limitations of prior-art systems, will now be self-evident.
  • the invention is a light-projection sys- tem.
  • the system includes a high-power light source. It also includes an array of negative cylindrical lenslets receiving light from the source and forming therefrom a fan-shaped beam of the light for projection toward such objects.
  • the negative cylindrical lenslets form virtual line images, rather than real high-power images, of the source.
  • air breakdown is avoided by the absence of real high-power images of the source.
  • Such benefits are not limited to the extremely high-power pulsed laser beams needed in large-scale top-down ocean surveillance or extended-range shallow-angle hazard monitoring, or aircraft-carrier operational aids, etc. discussed above. Rather they are potentially of value in any application requiring a very high illumination level emanating, as a concentrated fan beam, in an expanding geometry from a small source.
  • Fig. 1 is a side elevational diagram, after Anderson _______, showing a prior-art partly submerged stationary experimental deployment
  • Fig. 2 is a like diagram but for one basic arrangement according to the present invention, and particularly for a water craft that is in motion
  • Fig. 3 is a like diagram, but enlarged, showing only geometrical relationships of a probe beam with the water surface
  • Fig. 4 is a corresponding plan view for the arrangement of Figs. 1 and 2, particularly with a fan beam
  • Fig. 5 is a like view but with a scanning spot
  • Fig. 6 is a diagram like Fig. 2 but somewhat less schematic and showing some key dimensions;
  • Fig. 7 is a representation of a dual display which is one form of informational output such as can be generated by apparatus of the present invention: the upper portion of the display representing a forward elevational view of a scene developed by the apparatus; and the lower portion representing a top plan view of the same scene, particularly as developed by a flying-spot form of the invention such as the previously described sixth aspect;
  • Fig. 8 is a like representation of an alternative to the lower portion of Fig. 7, i.e. a top-plan view display such as developed by a fan-beam form of the invention;
  • Fig. 9 is a graph showing exemplary performance estimates for both nighttime and daytime conditions, more particularly the square of the signal-to-noise ratio (in dB) as a function of range (in meters) for a craft operating at ten knots, with a swath width of four hundred meters;
  • Fig. 10 is a graph showing safe operating distance as a function of swath width (both in meters), also for both night and day, and for three different craft velocities;
  • Fig. 11 is a block diagram of apparatus according to one preferred embodiment of the invention;
  • Fig. 12 is an elevational side view generally like Figs.
  • Fig. 13 is a set of three displays generated by the Fig. 12 embodiment for the information of aircraft-carrier operations personnel;
  • Fig. 14 is a view like Fig. 12 but showing another preferred embodiment as used in scanning a laterally extended fan beam vertically to locate both waterborne and airborne objects in at least one direction from a monitoring position;
  • Fig. 14A is a highly schematic or conceptual block diagram showing a LIDAR subsystem (in two primary variants) for use in the Fig. 14 object-locating system;
  • Fig. 15 is a view like Fig.
  • Fig. 16 is a set of simulated images of a submerged object as imaged through ocean wave surfaces having various characteristics such as occur within a span of 15 seconds, the actual size of the object being shown as a white circle in the upper left image, and the parameters used in the simulation being: sea state 2-3, water Jerlov IB (very clear), object at depth 10 m (38 feet) with diameter 1 m (35 inch) and reflectivity 8%;
  • Fig. 17 is a set of related diagrams showing several ways in which ocean waves disrupt LIDAR imaging of submerged objects from above the wave surface, views (a) and (b) being coordinated elevations respectively showing distortion zones and objects positioned relative to those zones, and view (c) being a top plan coordinated with view (b);
  • Fig. 18 is a single streak-tube LIDAR frame taken from above the wave surface;
  • Fig. 19 is a flow chart representing a method of preparing a LIDAR system that corrects for refraction in LIDAR imaging through waves in a water surface;
  • Fig. 20 is a graph of light return versus depth for both water-volume backscatter and water-surface glints;
  • Fig. 21 is a view like Fig.
  • Fig. 22 is a diagram illustrating reconstruction of a submerged-object image by means of correction for ocean-surface wave refraction
  • Fig. 23 is a somewhat schematic cross-sectional side elevation of a preferred embodiment of the invention that is a positive lenslet array (shown with related optical rays) that can be used to produce a fan beam from an incident collimated beam such as a laser beam
  • Fig. 24 is a like view, but enlarged, of a single lens element of the Fig. 23 array, showing certain focal relations for the lens
  • Fig. 25 is a view like Fig. 23 but with negative lenslets
  • Fig. 26 is a like view showing a relationship between beam angle at the back of the lens and ray heights above the optical axis on the front of the lens
  • Fig. 27 is a like view of a lens optimized to increase the irradiance at the edge of the fan beam
  • Fig. 28 is a like view of a fan beam generated by the lenslet array shown in Fig. 23 or 25.
  • Certain preferred embodiments of the invention use streak-tube imaging LIDAR (or STIL) as a "ship shield".
  • a LIDAR beam 11 is projected over the water and its return detected, by a forward-looking, low grazing angle LIDAR unit 12.
  • the LIDAR unit is mounted on a mast 13 of a water craft 14 that is in forward motion 15 along the water surface 16.
  • the LIDAR beam is generally refracted to form a downward-propagating continuing beam 21 within the water.
  • the down-beam interacts with submerged (or floating) objects — thereby modifying the return beam.
  • the system thus enables the water-craft operator to find and avoid mines and other obstacles, or to locate desired objects.
  • the emitted beam and the return-collection beam geometry both have the shape of a vertically thin fan.
  • such a beam provides a wide (left to right) detection swath and precise temporal resolution, for detection of even submerged targets while rejecting surface clutter.
  • the use of a very thin fan beam is therefore highly desirable.
  • Theoretical analysis, and scaling from experimental data to a mast-mounted LIDAR, suggest that swath width in excess of 400 m, at ranges exceeding 500 m, is achievable with available technology. As noted above, it is contemplated to operate the present invention into the range of kilometers. The result is an important expansion of on-board threat avoidance for high-value assets, including civilian operations, as well as mine countermeasures and the like.
  • two important parameters are the differential time of arrival Δr (Fig. 3) and the total ambiguity.
  • the latter is a measure of overall spatial resolution for the system, in the direction of beam propagation.
  • the differential time of arrival Δr at the surface is actually not in units of time, but rather is a ranging uncertainty in terms of distance. It is found from the fan-beam thickness w and the beam angle θ off the vertical.
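  A plausible geometric relation, offered only as an illustrative assumption rather than the patent's own formula, takes the ranging uncertainty as the footprint of the beam thickness on the surface:

$$\Delta r \;\approx\; \frac{w}{\cos\theta} \;=\; w\,\sec\theta ,$$

  which for w ≈ 2.5 cm and θ ≈ 85 degrees (a five-degree grazing angle) gives roughly 30 cm, consistent with the resolution figures discussed next.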
  • Detecting objects such as mines calls for resolution of roughly 30 cm (one foot) .
  • a swath width of 400 m should map to about 1300 pixels. This value is readily accommodated by streak-tube geometry, which images along the axis of the fan beam.
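  As a quick check of that figure (simple arithmetic on the numbers quoted above, not a separate value from the patent):

$$\frac{400\ \mathrm{m}}{0.30\ \mathrm{m}} \;\approx\; 1333\ \text{pixels across the swath.}$$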
  • the alternative scanning-spot system, using e.g. a single-pixel spot, would require a pulse repetition frequency 1300 times higher than the STIL system, and also requires the scanning capability and associated return-data processing. Nevertheless the scanning-spot facet of the invention does represent a viable alternative for at least some special conditions.
  • the STIL can actually form an image of the surface waves (from the precise range resolution). Therefore the invention further contemplates deterministic removal of subsurface target-image distortion, through using this surface information in conjunction with Snell's Law.
  • a gated-camera approach (wide beams), as mentioned above, will not serve well.
  • the reason is the differential time of arrival, with respect to both the fore/aft dimension and the lateral (horizontal) dimension.
  • a planar fan beam 11 of projected light 31 leaves the mast 13 of a craft 14 that is traveling forward 15.
  • the beam, preferably 1¼ to 2½ cm (one-half to one inch) thick, intersects the water in a straight swath 17 that is thirty to sixty centimeters (one to two feet) "wide", i.e. in the fore-to-aft direction relative to motion 15 of the craft 14.
  • the transverse dimension 23 of the beam/water intersection 17 is about 400 m (1300 feet) "long", i.e. in the lateral direction relative to motion 15 of the craft 14.
  • the intersection is not precisely straight; its departures from precise rectilinearity vary with the weather and other factors that affect the surface contours.
  • the entire transverse dimension of the beam is reasonably uniform in energy density and is projected simultaneously.
  • Return information from the wings (left and right corners) of the pattern is reduced in intensity by virtue of the inverse radial dependence of energy in the diverging beam, in conjunction with the greater travel distance to the straight intersection 17 with the water surface. It is also retarded, due to the greater travel distance.
  • the beam is refracted by the water surface O (Fig.
  • Submerged objects 22 such as mines (and also turbidity of the water itself) form a return up-beam 35, reverse-refracted to constitute a collected reflection beam 32 for detection at the receiver R.
  • the round-trip time of flight to the surface flash yields horizontal range, while the round-trip relative time from the surface flash to the object yields depth of the object.
  • the system thus yields range (very nearly equal to, and directly proportional to, exact horizontal range), bearing, and depth on each single shot, i.e. laser pulse.
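  A minimal sketch of this timing-to-geometry conversion follows. It is an illustration under simple assumptions (known grazing angle, locally flat surface, standard refractive index), not the patent's processing chain; the function name and interface are hypothetical.

```python
import numpy as np

C = 2.998e8          # speed of light in air (m/s)
N_WATER = 1.333      # approximate refractive index of sea water

def range_and_depth(t_surface, dt_object, grazing_deg=5.0):
    """Convert STIL timing to horizontal range and object depth (illustrative only).

    t_surface : round-trip time (s) from laser fire to the surface flash
    dt_object : additional round-trip time (s) from surface flash to the object return
    grazing_deg : beam grazing angle above the horizontal (assumed known)
    """
    slant_range = C * t_surface / 2.0                      # one-way distance to the surface
    horizontal_range = slant_range * np.cos(np.radians(grazing_deg))

    # Refraction at the surface (Snell's law) steepens the in-water path.
    theta_air = np.radians(90.0 - grazing_deg)             # angle from vertical in air
    theta_water = np.arcsin(np.sin(theta_air) / N_WATER)   # angle from vertical in water
    in_water_path = (C / N_WATER) * dt_object / 2.0        # one-way path length in water
    depth = in_water_path * np.cos(theta_water)            # vertical component

    return horizontal_range, depth

# Example: surface flash at 3.33 microseconds, object return 40 ns later.
print(range_and_depth(t_surface=3.33e-6, dt_object=4.0e-8))
```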
  • Attenuation and retardation at the wings tend to disrupt slightly the regularity of the display, making objects at the wings appear both darker and lower than objects at the same depth nearer the center. These irregularities can be compensated in various ways if desired.
  • the attenuation is advantageously compensated, as set forth later in this document, by an array of lenslets that sculpts the amplitude of the beam, redirecting energy from central portions of the outgoing and return fan beams to the wings.
  • the retardation is best corrected by software in the image-analytical stage.
  • the scanning-spot LIDAR system projects a beam 11' (Fig. 5) that is not fan shaped but instead is tightly constrained in the horizontal as well as the vertical direction.
  • the resulting individual spot or pixel 17' at the beam/water interface is swung back and forth 33 to produce a circular arc at the interface.
  • This system is not subject to the range-irregularity disruption just discussed; however, the additional electronics and optics for this form of the invention appear to render it somewhat less desirable.
  • Such a flying-spot scanner could time resolve, but the system would be relatively complex.
  • Exemplary numerical values for both forms of the invention include range R (Fig. 6) of about 500 m (1600 feet) and a nominal vertical angle of about five degrees.
  • the system works over a fairly wide range in vertical angles from the mast.
  • as to the vertical angle there is an important tradeoff. Smaller angles yield greater range but lower signal-to-noise ratio for subsurface detail. This limitation is due to poor Fresnel transmission through the beam/water interface at small angles.
  • standoff range R can be traded off against detection depth. Accordingly the five degrees (off horizontal) mentioned above is only illustrative, but offers a good compromise for a standoff range of operational utility.
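  The Fresnel effect cited here can be made concrete with the standard textbook expression (a general optics result, not a figure from the patent). For unpolarized light incident from air at angle θ_i from the vertical, with refraction angle θ_t given by Snell's law, the transmitted power fraction is

$$T \;=\; 1 - \tfrac{1}{2}\left[\left(\frac{\sin(\theta_i-\theta_t)}{\sin(\theta_i+\theta_t)}\right)^{2} + \left(\frac{\tan(\theta_i-\theta_t)}{\tan(\theta_i+\theta_t)}\right)^{2}\right],$$

  which falls rapidly toward zero as θ_i approaches 90 degrees, i.e. as the grazing angle shrinks.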
  • the mast height H is about 35 m (110 feet) .
  • the mast setback 18 from the bow may be about 80 m (250 feet) .
  • the complementary "plan view", or "top view", or "range-azimuth view" 48 (for the straight fan beam of Fig. 4) displays range 46 vertically, and azimuth 44 — exactly matching the azimuth display at top of Fig. 7 — to left and right. It is a view as taken from above the scene with the water craft 114 at the bottom of the picture, and the present scan 41' — the altitude-azimuth view 41 compressed to an incremental strip of height 47 — at the top of the picture.
  • this plan view is computer-generated by vertical compression of many instances of the elevational view, and successive placement of those instances as an array 46 from bottom toward top of the screen.
  • a range-azimuth view may appear as in the lower portion of Fig. 7. This type of view corresponds to that of Fig. 8 discussed above, except for the arc shape of each of the incremental slices 41", 46' formed from the vertically compressed altitude-azimuth views — again with the water craft 114' at the bottom of the view.
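  The compress-and-scroll bookkeeping just described can be illustrated with a short sketch. This is not the patent's display software; the array shapes, the reduction rule (brightest return per azimuth column), and the function name are assumptions for illustration.

```python
import numpy as np

def update_plan_view(plan_view, alt_az_frame, reduce="max"):
    """Append one vertically compressed altitude-azimuth frame to a scrolling
    range-azimuth ("plan") display.

    plan_view    : 2-D array, rows = successive scans (range), cols = azimuth
    alt_az_frame : 2-D array, rows = altitude/depth, cols = azimuth
    """
    # Compress the elevational view to a single strip (one value per azimuth column).
    strip = alt_az_frame.max(axis=0) if reduce == "max" else alt_az_frame.mean(axis=0)
    # Scroll: drop the oldest row and place the newest strip at the top of the view.
    return np.vstack([strip, plan_view[:-1]])

# Example: 200 accumulated scans, 1300 azimuth pixels, 500 altitude bins per frame.
plan = np.zeros((200, 1300))
frame = np.random.rand(500, 1300)
plan = update_plan_view(plan, frame)
```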
  • signal-to-noise ratio (SNR) is also related to range and to overall angle of view; the data graphed in Fig. 9 are for ten knots and a 400 m swath width. They suggest that the standoff range can be traded off against accuracy of imaging. Solar background is an additional noise source that limits detectability. Thus the tradeoff is in any event more favorable at night, when interfering ambient light is minimal.
  • Fig. 10 takes into account two effects: if the craft moves more slowly, (1) detection can be at a greater range (from the SNR analysis above), and (2) the craft can stop in a shorter distance (thus providing a greater safety distance even if detection range were the same). Both these effects yield a greater standoff range at lower speeds.
  • the double-ended arrow at the right of the drawing summarizes the significance of the "safety distance" scale at the left. Positions 51 in the graph above the zero- safety-distance line are safe, whereas positions 52 below that line may entail possible contact with mines.
  • upon operator command entered at a console 94 (Fig. 11), a computer 92 develops a control pulse 91 to fire the laser transmitter 61.
  • the laser is advantageously a diode-pumped solid state (Nd:YAG) unit transmitting in the blue-green (532 nm).
  • a beam splitter 62 picks off a small (less than one percent) fraction of the transmitted pulse, which is sent 64 to a fast photodiode 65 for precise measurement of the time at which the pulse leaves the transmitter.
  • the main portion of the beam passes through an anamorphic lens (diverging optic) 63 to form a fan beam 68; the beam remains collimated in the elevation axis.
  • the backscattered light 84 is imaged with a precisely coaligned receiver.
  • the receiver optical train includes a bandpass filter (BPF) 83 to reject solar radiation.
  • a conventional imaging lens 82 focuses the scene onto a slit 81, whose width is chosen to match the width of the transmitted fan beam.
  • the image at the slit is relayed by a first fiber optic (FO) 77 to the photocathode of the streak tube 76.
  • the fiber optic may be tapered, allowing a wider receiver aperture for increased collection efficiency.
  • Another alternative is to use multiple receivers to span the fan beam; in this case each receiver could use a longer-focal-length lens, and a larger aperture.
  • the electron beam is swept using a high-voltage ramp generator 74. This sweep is synchronized with the transmit time, using a programmable delay generator. The consequent range/azimuth image is formed on phosphor at the back (left, as illustrated) end of the streak tube.
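  For illustration, the sweep synchronization can be reduced to a few timing quantities. The helper below is hypothetical (the patent specifies only that the sweep is synchronized to the measured transmit time through a programmable delay); it shows how a trigger delay, sweep duration, and range scale per streak row might be derived for a chosen range gate.

```python
C = 2.998e8   # speed of light (m/s)

def sweep_timing(range_min_m, range_max_m, n_range_rows):
    """Return (trigger_delay_s, sweep_duration_s, metres_per_row) for a range gate."""
    trigger_delay = 2.0 * range_min_m / C          # wait for the earliest return of interest
    sweep_duration = 2.0 * (range_max_m - range_min_m) / C
    metres_per_row = (range_max_m - range_min_m) / n_range_rows
    return trigger_delay, sweep_duration, metres_per_row

# Example: gate from 100 m to 500 m mapped onto 1000 streak rows.
print(sweep_timing(100.0, 500.0, 1000))
```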
  • a second fiber-optic coupling 75 relays the image to a CCD 73 for digitization and readout by the computer 92.
  • the same computer preferably both acquires the data and performs system control. Forward motion of the platform sweeps out the in-track dimension.
  • the returns may be spatially registered into geodetic coordinates, forming, if desired, a full three-dimensional image of the scene in front of the ship.
  • the computer system includes automatic target-detection algorithms to detect and precisely localize hazards to navigation.
  • the operator display also preferably includes a scrolling map display 46', 46 (Figs. 7 and 8), such as previously discussed, to provide situational awareness of the area in front of the ship.
  • the grazing-incidence LIDAR can also be used for detection of floating debris (and other hazards to navigation), periscopes, cruise missiles and other threats.
  • the blue-green laser is preferably replaced with a near-IR laser (e.g., 1.54 microns) to provide eye-safe detection.
  • the fan beam can be projected at near-horizontal angles, resulting in considerably increased standoff ranges.
  • the standoff range will be controlled by the clear line of sight, which is ultimately limited by the curvature of the earth. Detection ranges of ten to twenty kilometers accordingly may be possible, depending upon transceiver height above the water.
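  The line-of-sight limit can be estimated with the usual horizon-distance approximation (a textbook relation, not a figure from the patent):

$$d \;\approx\; \sqrt{2\,R_{e}\,h}\,, \qquad R_{e}\approx 6.37\times 10^{6}\ \mathrm{m},$$

  so a transceiver at h ≈ 35 m (the mast height cited earlier) sees a sea-level horizon at roughly 21 km, in line with the ten-to-twenty-kilometer detection ranges suggested here.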
  • the search area would include both sides and aft, to cover a 360-degree search volume.
  • another use of the mast-mounted LIDAR, as mentioned earlier, is as a landing aid for carrier operations.
  • a forward-looking, eye-safe system would provide real-time measurement of the aircraft 122 (Fig. 12) relative to the aircraft carrier 114, and relative to the desired approach path, as well as range and range-rate to guide both pilot and controllers in an operations control center 118.
  • the additional range and range-rate data represent a considerable benefit for aircraft safety.
  • the data from mast-mounted LIDAR units could be of considerable utility for unmanned air vehicle (UAV) operations.
  • the system includes a LIDAR transceiver 112 mounted to a mast 113 or directly to a tower on the carrier 114, and also includes a computer-controlled streak-tube module 117 disposed for viewing in the operations center 118. Controls of the computer are also advantageously disposed for use by the controllers there.
  • the radiated LIDAR pulses 131, like those discussed earlier, may be in the form of a thin fan beam directed toward an expected or determined altitude of, for example, an incoming aircraft 122; if desired the fan beam may be scanned vertically to encompass a greater range of possible or permissible altitudes.
  • Such a view may display an image 222 of the aircraft in relation to desired glide-path indicia.
  • indicia may for example include a vertical line 261h representing a desired instantaneous horizontal position, taking into account the known range and calculated range-rate, i. e. velocity; and a horizontal line 261v representing a like desired vertical position.
  • Such instantaneous images can be compressed and accumulated to provide numerous strip-shaped slices 246 of a plan view, as in Fig. 13(c) .
  • This view, closely analogous to Fig. 8, shows an image 222 of the aircraft 122 in relation to an image 214 of the carrier 114, and may also show superposed indicia 261h, 261v as in Fig. 13(b), or alternatively may show safe glide-path boundaries 262.
  • the system computer can also be programmed straightforwardly to assemble a side elevation which simu- lates the original scene of Fig. 12.
  • This elevational view can be given various helpful characteristics, as for instance a glide-path indicium 261, and as illustrated a greatly expanded vertical scale, since safe landings are particularly sensitive to correct elevation of the aircraft relative to the carrier deck.
  • LIDAR is known to be useful for obtaining information about airborne objects. In terms of cost and space requirements, however, it is objectionable to provide a separate system for such purposes.
  • the present invention contemplates an integrated system for detecting both waterborne objects 322s, 322f (Fig. 14) and airborne objects 322a, 322b from a water craft 314.
  • a LIDAR subsystem 312, which is mounted to the water craft, emits thin fan-beam light pulses 311z, 311a, 311b, 311n from above the water surface 316 toward submerged objects 322s or floating objects 322f, and also toward airborne objects 322a, 322b.
  • the LIDAR subsystem 312 also detects reflected portions of the emitted pulses.
  • the LIDAR subsystem 312 includes a module 331 (Fig. 14A) which is the portion mounted well above water level to a mast, flying bridge etc.
  • although the laser beam in principle can be piped to the LIDAR module 331, that module more typically includes the transmitting laser 373 itself, with electronic control circuits 333-345 and power 372 for pulsing the laser.
  • Also included in the LIDAR module 331 are associated mechanical and optical components 371, 374, 375 devoted to support and orientation of the laser, sweep of its beam, and transmission of the swept beam 376 toward objects and regions of interest 322 (Figs. 14, 15).
  • Although the optics 375 may partially precede or encompass the beam sweep 374, for simplicity's sake the sweep 374 and optics 375 are shown as sequential.
  • the optics 375 preferably include the collecting and transmitting stages — integrated or at least intimately associated, as suggested in the drawing.
  • The streak tube 378, for receiving the collected return beam 377, is advantageously also housed in the LIDAR module 331.
  • Adequate implementation in general is processing-intensive. Both pulse generation and data interpretation are performed in portions 330, 331 of one or more programmed processors that at present are typically digital electronic devices but may instead be implemented in optical circuits for greater throughput.
  • The processing may be equivalently performed by dedicated firmware in a general-purpose processor, or by an application-specific integrated circuit (ASIC) 331 in the LIDAR module — or by software 332 operating in a processor that is part of a raster image processor (RIP) or general-purpose computer 330.
  • the control software may be regarded as a "driver" for the LIDAR/streak-tube system.
  • A raster image processor is nowadays highly favored for managing (in a well-optimized manner) the intensive data manipulation typically associated with images — while freeing both the specialized hardware 331 and the general-purpose computer for other tasks.
  • Programming for the LIDAR subsystem is preferably held in onboard memory 333 as discussed above, and supplied to (or, in the case of an ASIC, integrally incorporated in) 334, 335 the functional modules.
  • A pulse/sweep control block 337 controls pulse power 372 for the laser 373, and also develops synchronized control signals 381, 382 for the laser sweep 374 and streak tube 378 (with its internal sweep).
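The timing relationship implied by such synchronization can be illustrated with a small calculation. The sketch below is illustrative only (it is not the patent's control firmware); the range gate, the numerical values, and the function name are assumptions introduced here.

```python
# Illustrative sketch: deriving a streak-tube trigger delay and sweep window
# from an assumed range gate, so the internal sweep is synchronized with the
# laser pulse. Values and names are hypothetical.
C = 299_792_458.0  # speed of light, m/s

def streak_trigger(range_min_m: float, range_max_m: float):
    """Return (delay_s, window_s): the sweep starts after the round-trip time
    to the nearest range of interest and lasts long enough to span the gate."""
    delay = 2.0 * range_min_m / C
    window = 2.0 * (range_max_m - range_min_m) / C
    return delay, window

# Example: gate returns between 100 m and 2000 m after each pulse.
delay_s, window_s = streak_trigger(100.0, 2000.0)
print(f"trigger delay {delay_s * 1e6:.3f} us, sweep window {window_s * 1e6:.3f} us")
```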
  • The pulse and sweep processing 337 provides for emission of pulses toward both a region 341 in which submerged (or partially submerged) objects can exist — i.e., below the horizon — and also a region 342 in which exclusively airborne objects can be present, i.e. above the horizon.
  • the continuous-succession function 344 or interrupted-succession function 345 will be taken up below.
  • The pulse and sweep timing and other parameters are passed 346 to the detection module 347, which receives data 379 from the streak tube 378 — representative of the returned beam 377.
  • The outputs of the detection module 347, namely detected-reflection data 351, proceed to the image-processing block 360.
  • This latter block 360 may include a horizon-recognition function 354, which may be particularly critical if the horizon-interrupted-succession alternative 345 is operating.
  • The horizon-recognition function 354 is shown conceptually as encompassed within the image-processing block 360. In practice, however, for purposes of the interrupted-succession sweep function 345 the LIDAR subsystem may acquire horizon data from a separate data train that is not physically integrated into the illustrated primary optical componentry and possibly not even integrated into the main processors. It should be noted, however, that a horizon-recognition function 354 — at least one derived preliminarily from the optical returns themselves — is desirable even in the continuous-succession-sweep operating mode 344. This is because horizon information is needed to route data to the processing channels 362-366 discussed below.
  • The principal function within the image-processing block 360 is analysis 361 of the reflected and detected portions of the laser pulses.
  • In substance three parallel processing channels 362-364, 365 and 366 can be recognized.
  • The first two of these operate separately upon the returns of pulses emitted toward the submerged, or partially submerged, objects — i.e., upon the return of laser pulses directed below the horizon.
  • the goal is to derive images of submerged (or partly submerged) objects.
  • This goal requires preliminary determination 362 of water-surface orientations — based upon the same returns from the water, but upon the ranging component of those returns rather than the image content as such.
  • the surface orientations in turn enable the overall system to develop a correction 363 for refractive distortion.
  • this refractive correction is applied to the image content to provide submerged-object images 364 with greatly reduced distortion.
  • The second channel 365 is able to determine whether any portion of the return from below-horizon laser pulses is airborne, rather than waterborne.
  • This information too can be extracted from ranging data: a return that is greatly advanced relative to that from the highest waves must be above the surface.
  • The concept of "the highest wave" requires assumptions of at least some minimal degree of continuity in the wave surface.
  • the refractive correction 363 developed in the first processing channel 362-364 is omitted in the second channel 365.
  • The third channel 366 separately analyzes the reflected and detected portions of pulses emitted toward airborne objects exclusively — i.e., the return of pulses directed above the horizon.
  • the objective here is again parallel, namely to derive airborne-object images from the detected pulse data.
  • the image processing in this region must proceed differently — and more simply, without attempting to determine anything about any intervening surface. There is no need to apply either (1) any refractive correction or (2) any preliminary ranging or dynamic-motion analysis, for in this region there is no need to distinguish airborne from waterborne objects.
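To make the three-channel organization concrete, the following sketch shows one plausible way the returns could be routed once the horizon has been recognized. The data structure, the thresholds, and the function names are assumptions introduced for illustration; they are not elements of the patent.

```python
# Hypothetical routing of streak-tube returns into the three processing
# channels described above, keyed on beam elevation relative to a recognized
# horizon and on ranging relative to the highest detected wave.
from dataclasses import dataclass
from typing import Iterable, List, Tuple

@dataclass
class Return:
    elevation_deg: float  # beam elevation of this return, relative to horizontal
    range_m: float        # slant range from time of flight
    intensity: float

def route_returns(returns: Iterable[Return],
                  horizon_elevation_deg: float,
                  highest_wave_range_m: float) -> Tuple[List[Return], List[Return], List[Return]]:
    submerged, low_airborne, high_airborne = [], [], []
    for r in returns:
        if r.elevation_deg > horizon_elevation_deg:
            high_airborne.append(r)   # channel 3: above horizon, imaged directly
        elif r.range_m < highest_wave_range_m:
            low_airborne.append(r)    # channel 2: below horizon but nearer than any wave, so airborne
        else:
            submerged.append(r)       # channel 1: surface mapping plus refraction correction
    return submerged, low_airborne, high_airborne
```

In this sketch the "highest wave" range would itself be estimated from the ranging returns, consistent with the continuity assumption noted above.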
  • The LIDAR subsystem sweeps a sequence 311z-a-b-n (Fig. 14) of the light pulses across such submerged, or partially submerged, objects and such airborne objects in a substantially continuous succession.
  • By "across" here it is intended to encompass use of a vertical sweep.
  • The sweep may proceed downward from a position 311z near the zenith, through the horizontal 300, toward and to the nearest desired location 311n to be monitored within the water; or instead upward from such a closest desired location 311n, past the horizontal 300 and onward to the near-zenith position 311z.
  • A particularly preferred form of the sweep does both — i.e., proceeds alternately in both directions.
  • the invention nevertheless encompasses another preferred embodiment also discussed above, though less highly preferred, in which the pulsing is interrupted 345 to skip past objects in a narrow range of elevations at and just below the horizon.
  • The sweep itself may also be made discontinuous, slewing more rapidly past the same region.
  • This embodiment 345 has an advantage: it avoids wasting time intervals in which the system could otherwise be receiving and processing real data.
  • the entire LIDAR module 331 — or the laser/optics elements 371, 373-376 and streak tube 378 — may be mounted 371 in a gimbal box that is continuously servocontrolled to hold the fan beam 311 accurately parallel to the horizon.
  • Certain preferred embodiments of this invention can provide undistorted, or less distorted, images of underwater objects when looking through the ocean surface from above. Correct classification of objects, and in particular fewer false alarms, result from capturing both the distorted image of the underwater object and the shape of the ocean surface above that object.
  • Image-distorting effects at the ocean surface can be removed from the image (without any a priori knowledge of the target) by image-reconstruction algorithms.
  • the surface map contains an image of the surface.
  • The system can directly identify whitecaps and other surface phenomena that are a significant source of false alarms.
  • The invention encompasses surface-finding algorithms that are used to generate the ocean surface map.
  • The surface-mapping algorithm can itself be tested using existing streak-tube LIDAR data, associated with known surface characteristics.
  • The invention can also produce an accurate area-coverage estimate at different depths, i.e. a measurement of how well — how uniformly — the system is illuminating and imaging each water layer.
  • The previously mentioned gaps in coverage, due to wave focusing and defocusing of probe rays, can now be estimated with much greater precision as to both size and number, and deterministic feedback thereby derived for the system operators.
  • The present invention includes a subsurface-image reconstruction algorithm. This algorithm can be modeled to determine what spatial sampling and signal-to-noise ratio (SNR) are needed to enable improvement of the images in any given environment — combination of weather, water type, etc.
  • any new system specification and fabrication of optics necessary to meet the requirements for spatial sampling and SNR are defined from the modeling efforts.
  • The optics are then straightforwardly integrated with an existing streak-tube LIDAR system such as those provided by Arete Associates, of Sherman Oaks, California. Ocean flights can then be performed over actual or (if further validation is desired) realistic simulations of marine obstacles or weapons, and the image-reconstruction algorithm used to correct the images.
  • Performance, rather willy-nilly in prior-art systems, now conforms deterministically to the desired characteristics built into the system by virtue of the advance modeling.
  • FIG. 16 Ocean surface estimation and underwater image reconstruction — More specifically, prior airborne LIDAR systems have demonstrated that considerable degradation of images (Fig. 16) of objects, including small targets such as submerged weapons, can occur due to the distorting effects of the ocean surface. This distortion can degrade the probability of detection and classification, as well as the ability to reject false alarms.
  • The illustration represents a sequence of simulated images of an object with wave distortion. The actual size of the object is shown as a white circle in the upper-left image. This series of views represents changes that occur in such an image in less than fifteen seconds. Parameters for this illustration included sea state 2 to 3, water Jerlov IB (very clear), object diameter approximately one meter (35 inches) and reflectivity eight percent, at depth of ten meters (33 feet).
  • the pulsed laser-transmitter light is spread into a wide-angle fan beam that forms a line image in the water.
  • The streak-tube receiver has the ability to capture this line image every nanosecond (or faster) up to 1,024 times for a single laser pulse (with spatial sampling of up to 1,024 pixels).
  • The streak-tube LIDAR system can measure the reflected light from the transmitter pulse every 15 cm (six inches) in air, or — allowing for the different refractive index — 11 cm (4.4 inches) in water.
  • Reliable performance of subpixel localization in range to less than 2.5 cm (one inch) has been demonstrated on underwater targets. All this means that the streak tube can accurately characterize the height of the ocean, with the resolution necessary for meaningful image reconstruction.
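The range-bin figures quoted above follow directly from the sampling interval and the refractive index; the short check below uses assumed round numbers (1 ns sampling, n of about 1.33) and is only an illustration.

```python
# Quick check of the range-sampling figures: ~15 cm per nanosecond in air,
# ~11 cm in water once the slower in-water speed of light is accounted for.
c = 3.0e8        # m/s, speed of light in air (approximate)
dt = 1.0e-9      # s, assumed streak-tube sampling interval
n_water = 1.33   # assumed refractive index of seawater

bin_air = c * dt / 2.0         # two-way travel, so divide by 2: 0.15 m
bin_water = bin_air / n_water  # about 0.113 m
print(bin_air, bin_water)
```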
  • the system not only measures the ocean surface but also simultaneously images underwater objects.
  • View (a) shows the zones where LIDAR data is degraded due to illumination gaps 107 intimately adjoining a crest focus region 106 — or ray mixing in the crest focus region 106 itself (to which rays from the gaps 107 are redirected), or ray separation in a crest defocus region 108.
  • View (b) shows several hypothetical, potentially detectable articles 101-105 positioned in relation to the identified regions in view (a).
  • View (c) shows the corresponding images 101'-105' seen from an airborne system for each of those numbered articles 101-105 respectively. As the drawing shows, objects 101 in the converging-ray region above the crest focus 106 produce distorted images 101'.
  • FIG. 18 shows a single frame of STIL data in which both the ocean surface 401 and the bottom are visible. This frame shows the strong signatures 401, 404 from the water surface and water volume respectively, as well as the return 403 from the bottom. The evident confusion as between these two components will be discussed shortly. (The seeming curvature in the surface is caused by the increase in lateral range due to larger nadir angles.) The water surface is shown overexposed.
  • This surface detection and mapping capability has other advantages: (1) coverage estimate at depth, (2) direct sea-state measurement, and (3) false-alarm mitigation. Coverage estimate at depth: Because the system reconstructs the surface, the positions of absent data or low-SNR gaps 107, 108 (Fig. 17) are deterministically known in all three dimensions. Again, this does not mean that it is possible to know what was in those gaps, but the system can provide a precise measurement of how much of the water volume was sampled at appropriate SNR.
  • The requirement for additional passes over an area can thus be quantitatively defined by the sensor itself, as opposed to depending on model-based statistical coverage estimates, which in turn depend on an estimate of the sea state.
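A minimal sketch of such a coverage estimate follows. It assumes a depth-resolved SNR grid as input; the data layout, the threshold, and the function name are hypothetical, introduced only to illustrate the bookkeeping.

```python
# Given per-pixel SNR on a depth-resolved grid, report the fraction of each
# water layer that was sampled at adequate SNR (a deterministic coverage
# estimate, in contrast with model-based statistical estimates).
import numpy as np

def coverage_by_depth(snr_cube: np.ndarray, snr_threshold: float) -> np.ndarray:
    """snr_cube shape: (n_depth_layers, n_cross_track, n_along_track).
    Returns the adequately sampled fraction for each depth layer."""
    adequate = snr_cube >= snr_threshold
    return adequate.reshape(adequate.shape[0], -1).mean(axis=1)

# Synthetic example: coverage typically falls off with depth.
rng = np.random.default_rng(0)
cube = rng.exponential(scale=8.0, size=(5, 64, 256)) / (1 + np.arange(5))[:, None, None]
print(coverage_by_depth(cube, snr_threshold=6.0))
```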
  • Direct sea-state measurement: The streak-tube system can make a direct measurement of the sea state. These data can be provided to all involved parties that are dependent on this — ranging from amphibious landing forces to recreational small-boat operators.
  • False-alarm mitigation: The system can also make an image of the surface. This image can be used to examine surface objects, such as whitecaps, that sometimes resemble weapons and thereby represent a significant source of false alarms in existing LIDAR systems.
  • The relationship between system performance and the various operating conditions can then be studied and understood most meaningfully.
  • the reconstruction algorithm can then be run and evaluated 615-616 very objectively, since the well-defined input (from 611) to the image simulator can be compared 616 directly with the output from the reconstruction.
  • The robustness validation at 617 determines the effects upon the output image of variation in resolution and noise in the assumed surface map. Rigorous subjection of the system to and beyond known real-life extremes of sea-surface condition is essential at this point.
  • The image-reconstruction algorithm itself need not follow any specific form, other than a systematic application 614 of Snell's law in tracing of rays through the visible water surface to correctly define the positions of objects with the confusing effects of the water surface removed.
  • A satisfactory starting point 613 for the algorithm is any of the operationally sound prior-art algorithms (see discussions following), modified in a straightforward way to perform these ray tracings. This effort can be guided by the principles set forth in the previously mentioned patent 5,528,493, for observation of vessels and landmarks from a submerged station, as well as other technical literature that is discussed below.
  • A pivotal difference is that for present purposes it is not necessary to at first crudely estimate and then iteratively, dynamically refine a mathematical representation of the instantaneous sea surface. Instead, as pointed out earlier, that surface is simply measured directly by ranging at each pixel — and it is this capability which is the object of the second-algorithm development 623. Therefore all the iterative dynamic-development portions of the process discussed in the '493 patent may be simply omitted 624 from the algorithm used in the present invention. With the second algorithm specified 623 and coded, it is verified 625, as described above, against a known surface used in this simulation.
  • This later validation should confirm the insensitivity of the system to spatial resolution and noise in the surface-position estimate, previously verified through variation 617-618 in verification of the first algorithm.
  • the latter portions of at least the first-algorithm modeling are then, as illustrated, used to derive 621 the resolution requirements for the hardware.
  • Further benefit, however, can be obtained by working through the second, surface-mapping algorithm development 623-625 before finalizing the optical specifications.
  • Development 623 of the surface-mapping algorithm too can be based on prior working systems (see discussion below) — but omitting the surface-determination iterations needed in those earlier efforts.
  • This second algorithm can be tested 625 preliminarily using existing STIL data collected for an ocean patch, but eventually should be cross-compared 626, 628 using measurement of at least a small surface that is independently known.
  • Flight performance verification: The streak-tube LIDAR system is compact and lightweight, and readily mounted in an aircraft — a small fixed-wing aircraft is entirely adequate.
  • the system is then flown 626 over the ocean, preferably near a coastline where varying benthic depth can provide many naturally occurring test characteristics. Because it is difficult to get good "ground truth" on the ocean surface height, it is helpful to place 627 known test objects in the water at different, but well-surveyed or otherwise well-defined, depths.
  • One kind of target advantageously included in such tests is a large grid structure, preferably at least roughly six meters (twenty feet) in each direction.
  • An artificial, strongly contoured surface may be constructed and emplaced 628, and the LIDAR system flown over that surface — i.e., with the surface exposed in the air.
  • the ability of the system to measure and reconstruct the surface is estimated from the quality of the reconstructed object images.
  • A system may be put into operation and may create a surface map, and the map may even be accurate, yet the system will not perform satisfactorily if it does not adequately resolve adjacent elements of the ocean surface — or resolve the continuum of heights of the surface.
  • The measured positions and heights are inserted into the ray tracing function, and the only real test of their adequacy is whether the ray tracing produces correct pictures of submerged objects.
  • Even if an operating system gives high spatial resolution in all three dimensions, if the measurements in those dimensions are imprecise or inaccurate then again the ray-tracing results will be deficient.
  • The ocean reflects light 404 from the molecules and particles within the volume (i.e., water volume backscatter), in addition to the specular return 401 from the surface itself, called "glints".
  • The difficulty that these two different returns cause in estimating the surface position is that the volume backscatter return 404 reaches its peak value (Fig. 20) only after the laser pulse is entirely within the water volume, while the glint signal 401 reaches its peak when the peak of the pulse arrives at the water surface (see Fig. 20).
  • The distance between these two peaks 401, 404 is on the order of the width of the laser pulse.
  • This width, for a typical 4- to 5-nanosecond laser, corresponds to more than a half meter (20 inches) in position difference.
  • A half meter is a huge distance. Such uncertainty would render futile any effort to establish a surface-height map for use in reconstructing undersea images.
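The half-meter figure follows from the pulse width alone; the small check below uses the two-way (round-trip) convention and assumed pulse widths of 4 and 5 ns.

```python
# Arithmetic behind the "more than a half meter" peak separation.
c = 3.0e8                     # m/s, speed of light (approximate)
for tau in (4e-9, 5e-9):      # assumed laser pulse widths, seconds
    dz = c * tau / 2.0        # two-way convention: apparent range offset
    print(f"{tau * 1e9:.0f} ns pulse -> about {dz:.2f} m between the two peaks")
# roughly 0.60 m and 0.75 m respectively
```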
  • The data and exploratory algorithms developed by Guenther's group mentioned earlier can be used in generating a robust surface-finding algorithm able to differentiate between glints and volume backscatter.
  • The generated algorithm can provide a good surface map from each laser pulse, or each succession of pulses.
  • The resultant surface-mapping algorithm is then best applied to existing STIL data taken during other tests; these data are exemplified by Fig. 21.
  • adequate spatial resolution for performing image reconstruction must be maintained while both imaging a reasonably wide swath and holding a reasonable airspeed.
  • The reconstruction algorithm is a deterministic, noniterative ray-tracing program.
  • The complete algorithm proceeds in simple steps: from a known position of the LIDAR subsystem 12 (Fig. 22), fan-beam pulses probe the wave surface 16. For each pixel 17' the three-dimensional orientation of a corresponding ray 31' is known by the apparatus inherently. The distance to the surface along the ray 31' is found by analysis of the pulse returns. Combined with the observing position and ray orientation, the system computer straightforwardly obtains the three-dimensional position of the pixel 17'.
  • Applying Snell's law at the pixel 17', using the local surface orientation and the index of refraction n for the water, the system has the true disposition 21' of the ray.
  • The system can also take into account the further return delay to an apparent object point 22a' seen behind the same pixel 17'.
  • Although that apparent point 22a' is apparently positioned along the apparent or rectilinear extension 21a of the same ray 31', it is known to lie instead along the refracted, true ray disposition 21'.
  • The further return delay, considered in conjunction with the speed of light within the water, yields the additional range or distance within the water to the true object point 22'.
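The steps above amount to a short, noniterative calculation per pixel. The sketch below is a minimal illustration of that calculation, assuming a locally flat surface facet at each pixel and n = 1.33; the vector form of Snell's law, the variable names, and the example numbers are assumptions introduced here, not the patent's code.

```python
# Per-pixel ray tracing: surface point from measured range, refracted ray from
# Snell's law, then the in-water distance from the additional return delay.
import numpy as np

N_WATER = 1.33
C_AIR = 3.0e8  # m/s

def refract(d, normal, n1=1.0, n2=N_WATER):
    """Vector Snell's law: unit incoming direction d, unit surface normal
    pointing up out of the water; returns the refracted unit direction."""
    d = d / np.linalg.norm(d)
    normal = normal / np.linalg.norm(normal)
    cos_i = -float(np.dot(normal, d))
    r = n1 / n2
    cos_t_sq = 1.0 - r * r * (1.0 - cos_i * cos_i)
    t = r * d + (r * cos_i - np.sqrt(cos_t_sq)) * normal
    return t / np.linalg.norm(t)

def true_object_point(sensor_pos, ray_dir, surface_range_m, extra_delay_s, surface_normal):
    c_water = C_AIR / N_WATER
    pixel = sensor_pos + surface_range_m * ray_dir   # 3-D position of pixel 17'
    t = refract(ray_dir, surface_normal)             # true ray disposition 21'
    extra_range = c_water * extra_delay_s / 2.0      # one-way distance below the surface
    return pixel + extra_range * t                   # true object point 22'

# Example: sensor 20 m up, looking 60 degrees below horizontal at a flat facet.
sensor = np.array([0.0, 0.0, 20.0])
ray = np.array([0.0, np.cos(np.radians(60)), -np.sin(np.radians(60))])
print(true_object_point(sensor, ray, 23.0, 2.0e-8, np.array([0.0, 0.0, 1.0])))
```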
  • The goal is to produce a nearly one-dimensional fan beam 711 (Fig. 23) from an incident laser beam 709 of conventional character, i.e. most typically of a very generally circular cross-sectional envelope, but roughly Gaussian across that section.
  • laser beams typically have both hot (bright) spots and dark specks within the beam cross-section — and these features as well as the overall energy distribution and indeed the effective centerline of the beam itself vary greatly during pulsed operation.
  • Preferred embodiments of the present invention (1) produce a fan beam of prescribed angular width independent of size, energy distribution, and position of the input laser beam 709; (2) enable shaping of the energy distribution across the fan beam; and as a kind of bonus also (3) homogenize the laser beam to remove any nonuniform spatial energy distributions in the laser beam.
  • the input laser beam 709 is incident on an array 721 of small lenses 722, which in this document are sometimes called "lenslets".
  • This document refers to the individual elements 722 as "cylindrical", but it is to be understood that this term also encompasses elements that are not truly cylindrical.
  • The spatial profile of the laser beam is homogenized by the mixing of the light after the array.
  • Each of the lenslets 722 expands the portion of the beam incident on it. Since the beam 709 is incident on many lenslets, there is considerable averaging in the angular distribution leaving the lens array. Because the lenslets are overfilled with illumination, the numerical aperture of the lenses defines the angular divergence θ — here the fan half-angle — of the light. The sine of the fan half-angle equals the numerical aperture NA, which in turn is established by the relationship between the focal properties of each lenslet surface and the physical aperture.
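As a numerical illustration of the relation just stated (the sine of the fan half-angle equals NA), the values below are arbitrary example apertures, not figures taken from the patent.

```python
# Fan half-angle implied by the lenslet numerical aperture: sin(theta) = NA.
import math

for na in (0.17, 0.34, 0.50):   # example numerical apertures (assumed values)
    half_deg = math.degrees(math.asin(na))
    print(f"NA = {na:.2f} -> half-angle {half_deg:.1f} deg, full fan {2 * half_deg:.1f} deg")
```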
  • Fig. 24 illustrates one lens in an array of positive cylindrical lenslets.
  • The array may be fabricated either by casting or otherwise forming a unitary article as suggested by the illustration, or by forming individual lenslet elements and placing them side by side.
  • Each lenslet 722 nominally produces a line focus 723 of the input laser beam 709 after (i.e., downstream of) the lenslet.
  • Another potential drawback of highly consistent divergence angle is that all the optical energy passes through the foci of the several lenslets. In other words the lenslets form real images of the collimated input beam, and all the energy of the input beam passes through the array of these images. A highly compensated system therefore concentrates in a minute, roughly planar zone — defined by the array of line foci of the several lenslets — all the power of the source beam.
  • The angle θ (Fig. 26) of a refracted ray leaving the lens is related to the height y of the ray above the optical axis 725.
  • The relationship between the exit angle θ and the ray height y may be expressed as an integral over the lenslet aperture. The corresponding angular weighting function is the derivative of sin(θ), that is, cos(θ) when the derivative is taken with respect to θ; it can be approximated from geometrical ray-tracing data (see the sketch below).
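One plausible numerical reading of that approximation is sketched below: estimate the relative energy per unit exit angle from tabulated ray-trace output (ray heights mapped to exit angles), assuming uniform illumination across the lenslet. The array names and the finite-difference form are assumptions, not the patent's expression.

```python
# Relative energy per unit exit angle estimated from ray-trace data:
# with uniform illumination in y, energy per angle ~ |dy/dtheta|.
import numpy as np

def energy_per_angle(y: np.ndarray, theta: np.ndarray) -> np.ndarray:
    """Finite-difference estimate of |dy/dtheta| along the traced rays."""
    return np.abs(np.gradient(y, theta))

# Example: a lenslet with sin(theta) proportional to y gives a cos(theta)-like
# weighting, highest on axis and falling gently toward the fan edges.
theta = np.linspace(-0.3, 0.3, 13)               # exit angles, radians
y = (0.5e-3 / np.sin(0.3)) * np.sin(theta)       # ray heights, metres
w = energy_per_angle(y, theta)
print(np.round(w / w.max(), 3))
```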
  • Fig. 27 shows a lens optimized to increase the irradiance at the edge of the fan beam — perceptible in the drawing in terms of the density of rays 726, 727c, 728c — to four times that at the center of the fan beam.
  • the lens was optimized by varying the surface profile 722c of the refracting lens surface.
  • Fig. 28 illustrates the angular ray distribution far from the lens. Again, the progressively higher ray density 726, 727c at the edge of the field of view corresponds to higher energy density than does the density 727c, 728c progressively closer to the axis 725.
  • θm is the diffracted angle for the m-th order
  • D is the width of the lens element.
  • The NA of the lens defines the largest value of the left-hand side of Eq. (5); therefore we can rewrite the equation to give the total number N of diffracted orders across the full width of the fan beam as follows.
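The equations themselves did not survive the text extraction; the following is a hedged reconstruction from the stated definitions, on the assumption that Eq. (5) is the ordinary grating relation for a lenslet of width D.

```latex
% Hedged reconstruction (assumed form of Eq. (5) and of the order count):
\[
  \sin\theta_m = \frac{m\,\lambda}{D}, \qquad |\sin\theta_m| \le \mathrm{NA}
  \quad\Longrightarrow\quad
  N \approx \frac{2\,D\,\mathrm{NA}}{\lambda}.
\]
```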

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention concerns a system for detecting objects from an elevated position, comprising a LIDAR subsystem (331) mounted at an elevated position and intended to emit light pulses in the form of a thin fan beam at a very shallow angle, and to detect reflected portions of the fan-beam pulses at the same very shallow angle. The invention also concerns a streak-tube subsystem (378) for imaging successive reflected portions of the fan-beam pulses.
PCT/US2000/024098 2000-09-01 2000-09-01 Lidar avec systeme d'imagerie a tube a balayage de fente WO2002025209A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2001229026A AU2001229026A1 (en) 2000-09-01 2000-09-01 Lidar with streak-tube imaging
PCT/US2000/024098 WO2002025209A1 (fr) 2000-09-01 2000-09-01 Lidar avec systeme d'imagerie a tube a balayage de fente

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2000/024098 WO2002025209A1 (fr) 2000-09-01 2000-09-01 Lidar avec systeme d'imagerie a tube a balayage de fente

Publications (1)

Publication Number Publication Date
WO2002025209A1 true WO2002025209A1 (fr) 2002-03-28

Family

ID=21741734

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/024098 WO2002025209A1 (fr) 2000-09-01 2000-09-01 Lidar avec systeme d'imagerie a tube a balayage de fente

Country Status (2)

Country Link
AU (1) AU2001229026A1 (fr)
WO (1) WO2002025209A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5231480A (en) * 1990-10-24 1993-07-27 Kaman Aerospace Corporation Airborne imaging lidar system employing towed receiver or transmitter
US5243541A (en) * 1991-10-11 1993-09-07 Kaman Aerospace Corporation Imaging lidar system for shallow and coastal water

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011091726A1 (fr) * 2010-01-29 2011-08-04 哈尔滨工业大学 Procédé à haute résolution pour la détection d'ondes à échelle microscopique de la houle marine sur la base d'une imagerie laser
CN103905799B (zh) * 2014-04-23 2017-04-19 中国海洋石油总公司 浅水测试场的监控结构
US9753140B2 (en) 2014-05-05 2017-09-05 Raytheon Company Methods and apparatus for imaging in scattering environments
CN113167857A (zh) * 2018-12-10 2021-07-23 Abb瑞士股份有限公司 雷达传感器以及使用雷达传感器的机器人
CN113167857B (zh) * 2018-12-10 2023-03-24 Abb瑞士股份有限公司 雷达传感器以及使用雷达传感器的机器人
CN109752727A (zh) * 2019-01-11 2019-05-14 山东科技大学 一种机载LiDAR测深海气界面折射改正方法
CN109752727B (zh) * 2019-01-11 2022-03-04 山东科技大学 一种机载LiDAR测深海气界面折射改正方法
CN115639571A (zh) * 2022-10-31 2023-01-24 哈尔滨工业大学 条纹管成像激光雷达图像坐标校正方法及装置

Also Published As

Publication number Publication date
AU2001229026A1 (en) 2002-04-02

Similar Documents

Publication Publication Date Title
US6836285B1 (en) Lidar with streak-tube imaging,including hazard detection in marine applications; related optics
CN109557522B (zh) 多射束激光扫描仪
EP2866051B1 (fr) Dispositif de détection et de télémétrie laser pour détecter un objet sous une surface d'eau
EP2866047B1 (fr) Système de détection pour détecter un objet sur une surface d'eau
US5087916A (en) Method of navigation
CA2437897C (fr) Systeme et procede base sur le lidar
US20210055180A1 (en) Apparatuses and methods for gas flux measurements
Pfeifer et al. Laser scanning–principles and applications
Cossio et al. Predicting small target detection performance of low-SNR airborne LiDAR
Steinvall et al. Experimental evaluation of an airborne depth-sounding lidar
Steinvall et al. Airborne laser depth sounding: system aspects and performance
Filisetti et al. Developments and applications of underwater LiDAR systems in support of marine science
LaRocque et al. Airborne laser hydrography: an introduction
KR101678124B1 (ko) 전방향 라이다 장치를 이용한 라이다 데이터 모델링 방법
Sakib LiDAR with Pulsed Time of Flight
Cossio et al. Predicting topographic and bathymetric measurement performance for low-SNR airborne lidar
WO2002025209A1 (fr) Lidar avec systeme d'imagerie a tube a balayage de fente
Hu Theory and Technology of Laser Imaging Based Target Detection
Guenther et al. Laser applications for near-shore nautical charting
Hyyppä et al. Airborne laser scanning
Mandlburger et al. Feasibility investigation on single photon LiDAR based water surface mapping
Devereux et al. Airborne LiDAR: instrumentation, data acquisition and handling
Niemeyer et al. Opportunities of airborne laser bathymetry for the monitoring of the sea bed on the Baltic Sea coast
Niemeyer et al. Airborne laser bathymetry for monitoring the german baltic sea coast
US11519997B1 (en) Broad-area laser awareness sensor

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP