WO2016164021A1 - Automatic marine flare or seep detection from echosounder data - Google Patents

Automatic marine flare or seep detection from echosounder data

Info

Publication number
WO2016164021A1
WO2016164021A1 (PCT/US2015/025068)
Authority
WO
WIPO (PCT)
Prior art keywords
marine
flares
data
flare
echosounder
Prior art date
Application number
PCT/US2015/025068
Other languages
French (fr)
Inventor
Jin-Hyeong Park
Ti-Chiun Chang
Xuefei GUAN
Shaohua Kevin Zhou
Original Assignee
Siemens Healthcare Gmbh
Siemens Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Healthcare GmbH and Siemens Corporation
Priority to PCT/US2015/025068
Publication of WO2016164021A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V9/00Prospecting or detecting by methods not provided for in groups G01V1/00 - G01V8/00
    • G01V9/007Prospecting or detecting by methods not provided for in groups G01V1/00 - G01V8/00 by detecting gases or particles representative of underground layers at or near the surface

Definitions

  • Figure 3 shows an example filter shape for directional filtering
  • echosounder data is obtained.
  • a multibeam echosounder is mounted on a vessel.
  • the echosounder data is obtained by sonar pinging and receiving echoes.
  • the echosounder data is obtained by data transfer and/or loading from memory.
  • the data is from any stage of processing, such as raw beamformed data, filtered data, motion corrected data, scan converted data, or displayed image data.
  • the possible flares are identified as a whole or by voxel.
  • the identification process identifies a collection of voxels
  • any identification process may be used, such as template matching, edge detection, data quality measures, and/or correlation between data.
  • the identification is by directional filtering. Additional, different, or fewer acts may be provided for identification of the possible marine flares.
  • directional filtering to enhance elongated structures over a range within a limited angle (e.g., +/- 20 degrees) relative to the depth dimension may be used.
  • the directional filtering is repetitively performed for elongated structures over any number of different orientations distributed over any range (e.g., all orientations with 360-degree ranges over three dimensions with any angle step size).
  • an elliptical cylinder may approximate the shape of the flare. Any elliptical, cylindrical, and/or other flare approximating shapes may be used.
  • a three-dimensional anisotropic filter that has the shape of a tube or cylinder when the iso-surface of the filter kernel is rendered is used.
  • the major axis of the shape is along the depth dimension or other angle within the search range for filtering.
  • anisotropic or directional filtering provides for all pass or low pass filtering along a direction or range of directions and band or high pass filtering along other directions.
  • low pass filtering is performed along the depth dimension and filtering as a function of a Gaussian kernel is performed along dimensions orthogonal to the depth dimension (i.e., ping and vessel travel directions). Any three-dimensional bandpass filtering may be used.
  • FIG. 3 is an iso-surface plot for one example three-dimensional bandpass filter.
  • the iso-surface plot of the filter is elongated in the depth direction y (i.e., vertical in the physical world).
  • the filter shape follows the second derivative of the Gaussian function as depicted in Figure 4.
  • This second derivative of the Gaussian function is a ridge detector, but other functions may be used.
  • lowpass filtering is applied, as depicted in Figure 5.
  • with low-pass filtering along the depth orientation y, the second-order derivative of the anisotropic Gaussian in the ping plane orientation x and the ship orientation z is represented as g(x, y, z) = C · (2 − x²/σx² − z²/σz²) · exp(−x²/(2σx²) − y²/(2σy²) − z²/(2σz²)), where C is a normalization constant and σx, σy, σz are, respectively, the standard deviations in the x, y, and z directions.
  • Other Gaussian or non-Gaussian filter kernels may be used.
  • Figure 6 shows an example output of convolution of this example filter with the echosounder data represented in Figure 1.
  • the filtering enhances the flare structure as indicated generally at 32. While the drawings are two-dimensional, this enhancement occurs over three dimensions.
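The tube-shaped directional filtering described in the bullets above can be sketched as follows. This is a minimal illustration, not the patent's exact kernel: it assumes a second-derivative-of-Gaussian (ridge detector) profile across the tube in x and z, a Gaussian lowpass along the depth dimension y, and arbitrary sigma values.

```python
import numpy as np
from scipy.ndimage import convolve

def flare_kernel(sigma_x=1.5, sigma_y=4.0, sigma_z=1.5):
    """Tube-shaped anisotropic kernel: second derivative of a Gaussian
    (a ridge detector) across the tube in x (ping plane) and z (ship
    track), Gaussian lowpass along the depth dimension y."""
    x = np.arange(-int(3 * sigma_x), int(3 * sigma_x) + 1)
    y = np.arange(-int(3 * sigma_y), int(3 * sigma_y) + 1)
    z = np.arange(-int(3 * sigma_z), int(3 * sigma_z) + 1)
    X, Y, Z = np.meshgrid(x, y, z, indexing="ij")
    gauss = np.exp(-(X**2 / (2 * sigma_x**2)
                     + Y**2 / (2 * sigma_y**2)
                     + Z**2 / (2 * sigma_z**2)))
    # Ridge-detector cross-section in (x, z), lowpass along y.
    ridge = (2.0 - X**2 / sigma_x**2 - Z**2 / sigma_z**2) * gauss
    return ridge - ridge.mean()  # zero-mean so flat regions respond near 0

def enhance_flares(volume):
    """Convolve an (x, y=depth, z) echosounder volume with the kernel."""
    return convolve(volume, flare_kernel(), mode="nearest")
```

A vertical bubble column produces a strong positive response along its axis, while uniform background responds near zero, which is the enhancement effect indicated at 32 in Figure 6.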
  • a threshold is applied to the results of the filtering.
  • the processor identifies the voxels or locations associated with possible marine flares by thresholding the filter output. Any location having an intensity above the threshold or at and above the threshold is a possible marine flare. The locations for the possible marine flares are found as having high intensities after the directional filtering.
  • other processes than thresholding are used, such as performing edge detection, random walker, segmentation, or other data process for finding locations or groups of locations having greater intensity than other locations after directional filtering.
  • the resulting locations or voxels may be low pass filtered to remove outliers or voxels not located near other possible flare voxels.
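The thresholding and outlier-removal steps above might be sketched as below; the neighbor-count criterion stands in for the low pass filtering of isolated voxels, and the threshold and neighbor values are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def candidate_flare_mask(response, threshold, min_neighbors=2):
    """Threshold the directional-filter response, then drop isolated
    voxels that have too few candidate neighbors in their 3x3x3 box."""
    mask = response >= threshold
    neighbors = convolve(mask.astype(int), np.ones((3, 3, 3)),
                         mode="constant") - mask
    return mask & (neighbors >= min_neighbors)
```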
  • the processor categorizes the possible marine flares as either detected (i.e., actual) marine flares or non-flares (i.e., false positives).
  • the possible marine flares are assigned to binary classes.
  • a third class may be provided, such as non-determinative.
  • the categorization is for each voxel individually or for groups of voxels.
  • the processor categorizes without user input, but input may be provided for a semi-automated categorization. Any process may be used for categorization.
  • Acts 22 and 24 are two examples of processes used for categorizing in act 20. Different processes or combinations of multiple of the different processes may be used, such as voting or weighted voting schemes.
  • the possible marine flares are categorized with a machine-trained classifier.
  • a learning-based approach may avoid having to fine-tune a parameter to avoid false positives, especially where tuning parameters for one situation may not be appropriate for another situation (e.g., different image quality or noise levels).
  • the machine-learnt classifier distinguishes between false positives and actual marine flares, avoiding having to set the threshold to do so.
  • a machine-learnt classifier may handle variance in quality, noise, or other situational factors while accurately categorizing.
  • the machine-learnt classifier is a binary classifier trained to classify each voxel into two classes: marine flare voxel or non-flare voxel.
  • Any machine-learning algorithm may be used, such as a deep learning algorithm, support vector machine, neural network, sparse auto-encoding, Bayes network, decision stump, or other classifiers may be trained and applied.
  • the machine training is supervised or semi-supervised in learning a relationship between input features and the output binary classification. Unsupervised learning of the features to use may be provided. Hierarchal or other approaches may be used.
  • Training data (marine ultrasound volumes) is collected.
  • the volumes are filtered to identify the possible marine flare voxels.
  • the true flare candidates which correspond to the ground truth, are used as positive training data.
  • the possible flares that are false positives in the training volumes are used as negative training data.
  • features are extracted to train the binary voxel classifier to separate each possible flare voxel into true flare or non-flare.
  • the features are extracted from the original training volume data (e.g., echosounder data prior to anisotropic filtering), the filter response map (e.g., output of the directional filtering), identified locations of possible flares, and/or other information.
  • both the source echosounder data for the volume and the thresholded information after identification are used.
  • any type or types of features may be used.
  • a three-dimensional patch or kernel around a voxel is used to extract the features for the voxel.
  • Any three-dimensional features such as Haar-like features, steerable features, and/or scale-invariant feature transform (SIFT) features are extracted for training.
  • a high-dimensional feature space is defined as the pattern or voxels in the patch or kernel.
  • the voxels output by the filter response are used to create the high-dimensional feature space.
  • the filter response map is treated as a representation of a volume for extracting features. This high-dimensional feature space is used to train the voxel classifier.
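The patch-based feature extraction above can be illustrated with a simple flattened-neighborhood vector; this is a stand-in for the Haar-like, steerable, or SIFT features named earlier, and the channel pairing (raw volume plus filter response map) follows the bullets above, with the patch half-width as an assumed parameter.

```python
import numpy as np

def voxel_features(channels, voxel, half=2):
    """Stack the (2*half+1)^3 neighborhood of a candidate voxel from each
    channel (e.g., the raw echosounder volume and the directional-filter
    response map) into one high-dimensional feature vector."""
    x, y, z = voxel
    patches = [c[x - half:x + half + 1,
                 y - half:y + half + 1,
                 z - half:z + half + 1].ravel() for c in channels]
    return np.concatenate(patches).astype(float)
```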
  • the processor inputs values for extracted features derived from an echosounder volume and outputs the classification.
  • the machine-learnt classifier categorizes voxel by voxel.
  • the voxels being categorized are the voxels identified as possible marine flares.
  • the machine-learnt classifier outputs the binary classification as either actual marine flare or false positive for each voxel.
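As a toy example of the binary voxel classification above, here is a decision stump, one of the classifier families listed earlier. A real system would train a deep network, SVM, or boosted ensemble on the extracted features; this exhaustive one-feature threshold search only illustrates the flare/non-flare separation.

```python
import numpy as np

def train_stump(X, y):
    """Fit a one-feature threshold classifier (decision stump) by
    exhaustive search over features, thresholds, and polarities.
    X: (n_voxels, n_features); y: 1 for flare, 0 for non-flare."""
    best_acc, best = -1.0, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = (sign * X[:, j] >= sign * t).astype(int)
                acc = float((pred == y).mean())
                if acc > best_acc:
                    best_acc, best = acc, (j, t, sign)
    return best

def predict_stump(stump, X):
    """Classify each voxel's feature vector as flare (1) or non-flare (0)."""
    j, t, sign = stump
    return (sign * X[:, j] >= sign * t).astype(int)
```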
  • the processor performs connected component analysis on the locations of the possible marine flares.
  • the voxels identified as possible marine flares are grouped. Any possible flare voxels directly adjacent along the three dimensions and/or other directions are connected. These connections cluster or group the voxels labeled as possible marine flares. Other distances than directly adjacent may be used in clustering.
  • Any number of groups may be created. Two or more groups may be part of a same flare, such as where there is a gap in the flare. Low pass filtering or further directional filtering may be applied to the grouping or identified possible locations to remove such gaps. Alternatively, further clustering connects the gaps across any separation of a maximum distance. In other embodiments, the groups are left separate.
  • the processor analyzes the groups.
  • the member voxels or locations for each group or cluster are classified together.
  • the connected possible flare voxels are categorized as being detected marine flares or false positives.
  • the processor calculates cluster properties, such as area, orientation, bounding box, and/or eccentricity. Any characteristic of the cluster or group may be used. For example, the expected flare shape, size, and/or orientation are used. Additional, different, or fewer characteristics may be used.
  • the processor calculates the value or values of the property or properties of the group.
  • the processor determines the value for the size, orientation (e.g., angle of the longest distance in the cluster), or other property.
  • the processor detects whether each group is a non-flare or actual marine flare. For example, interesting or actual flares should be elongated in the y direction (depth) with size (e.g., number of voxels or volume of cluster) bigger than a certain value. Those clusters that do not satisfy these criteria are indicated as false positives.
  • Figure 8 illustrates an initial false detection that has a pixel cluster 34 elongated horizontally. The voxels of this group are categorized as non-flare. The eccentricity property is used to categorize.
  • the processor detects the clusters as non-flares, leaving other clusters as the actual flares. Alternatively or additionally, groups with the expected size and/or orientation are categorized as actual marine flares.
  • the processor detects the clusters as actual marine flares, leaving the other clusters as non-flares.
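The connected component analysis and cluster-property categorization described above can be sketched as follows; the size and elongation thresholds are illustrative assumptions, not values from the patent.

```python
import numpy as np
from scipy.ndimage import label

def flare_clusters(mask, min_size=20, min_vertical_ratio=2.0):
    """Label 26-connected groups of candidate voxels and keep clusters
    that are large enough and elongated along the depth axis (axis 1 = y);
    horizontally elongated or tiny clusters are rejected as non-flares."""
    labeled, n = label(mask, structure=np.ones((3, 3, 3)))
    kept = []
    for k in range(1, n + 1):
        xs, ys, zs = np.nonzero(labeled == k)
        vertical = np.ptp(ys) + 1
        horizontal = max(np.ptp(xs), np.ptp(zs)) + 1
        if xs.size >= min_size and vertical >= min_vertical_ratio * horizontal:
            kept.append(k)
    return labeled, kept
```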
  • the categorization uses three-dimensional information.
  • the filtering, extracted features, and/or clustering are performed in three dimensions rather than just two. In other embodiments, the filtering operations and other processing are performed in two dimensions.
  • the processor automatically analyzes a given fan or sector scan representing a plane to detect the marine flare or flares.
  • the two-dimensional processing may be useful when the processor is limited in memory and/or computational power.
  • Real-time, on-vessel flare detection and ping image analysis may be more rapidly and/or easily provided with two-dimensional analysis.
  • an unmanned vehicle (UMV) or remotely operated vehicle (ROV) may be dispatched to explore the details of the marine area.
  • the detection is output in act 26.
  • the location or locations of the detected marine flare or flares are output. Where clustering is used, the locations are output as a flare. Where voxel-by-voxel categorization is used, the locations may be output by voxel. Since the voxels are likely adjacent to each other, the result is still output as a flare.
  • the output is an image.
  • a three-dimensional rendering of the intensities of the flare voxels is generated from any viewing angle.
  • a surface or volume rendering may be used.
  • the non-flare voxels, whether from possible flares or not, may or may not be included in the rendering.
  • the detected flare voxels are further emphasized by increasing intensity, color, lighting, or other effect to stand out in the rendering of the volume.
  • the image is rendered just from the flare voxels.
  • one or more planar cross-section images of the volume through the detected flare are displayed.
  • Figure 11 shows one embodiment for imaging the detected marine flares.
  • many detected flares exist within the volume.
  • the flares from the water column data are rendered without data from non-flares.
  • the seabed is also included in the rendering, such as a rendering of the bathymetry data.
  • Lines are fit to the detected flares. Overlapping or substantially (e.g., within an average distance of five or fewer voxels) collocated lines may be combined or used to define a flare to which the line is fit. Any line fitting or extrapolation of the flares to the seabed may be used.
  • the intersection of the lines with the seabed is highlighted, such as with a dot in the rendering.
  • Location information for the seeps may be output as well.
  • the output may be of a geographic location (i.e., global positioning coordinates) of the seep on the seafloor or location of the detected marine flare.
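The line fitting and seabed extrapolation above can be sketched as below. This illustration fits the flare's principal axis and extrapolates it to a given seabed depth; the coordinate convention (x, y=depth, z) and the single-depth seabed are simplifying assumptions, and converting the result to global positioning coordinates would use the vessel's navigation data.

```python
import numpy as np

def seep_location(flare_voxels, seabed_depth):
    """Fit a straight line to flare voxel coordinates (x, y=depth, z) via
    the principal axis of the point cloud, then extrapolate the line to
    the seabed depth to estimate the seep position."""
    pts = np.asarray(flare_voxels, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]                  # dominant axis of the flare
    if abs(direction[1]) < 1e-12:
        raise ValueError("flare has no vertical extent to extrapolate")
    t = (seabed_depth - centroid[1]) / direction[1]
    return centroid + t * direction
```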
  • Figure 12 shows a system for marine flare detection from echosounder data.
  • the system includes an imaging system 80, a memory 84, a processor 82, and a display 86 in a boat 88. Additional, different, or fewer components may be provided. For example, a network or network connection is provided, such as for networking from the imaging system 80 to a remote computer or server. In another example, a user interface is provided. As another example, the boat 88 and/or imaging system 80 are not provided. Instead, a computer or server detects the marine flares from received or loaded echosounder data.
  • the processor 82, memory 84, and display 86 are part of the imaging system 80.
  • the processor 82, memory 84, and display 86 are part of a server, computer, and/or image processing system separate from the imaging system 80.
  • the processor 82, memory 84, and display 86 are a personal computer, such as desktop or laptop, a workstation, a server, a network, or combinations thereof.
  • the processor 82, display 86, and memory 84 may be provided without other components for acquiring data by scanning the water column.
  • the imaging system 80 is an echosounder.
  • a multi-beam and/or digital echosounder may be used.
  • the echosounder transmits sonar pings into the water column.
  • echoes from a fan shaped or other planar region are received in response to a given ping.
  • Beamforming or other processing is applied to sample the echoes from different locations (e.g., sample along the depth of each of various azimuthally spaced beams).
  • the resulting echosounder data represents the fan or sector shaped planar region.
  • additional planar regions are sampled.
  • echosounder data representing a volume is acquired.
  • the echosounder scans in three dimensions rather than a planar region to acquire data representing a volume.
  • the imaging system 80 may include a global positioning system antenna and receiver.
  • the location of the boat 88 is determined so that the locations being scanned by the echosounder are known.
  • Other sensors may be included, such as an altimeter, gyroscope, compass, or gyrocompass, in order to adjust for differences in pitch, yaw, or roll caused by waves during volume scanning.
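The pitch/roll/yaw adjustment mentioned above amounts to rotating the beamformed sample coordinates from the vessel frame into a level world frame. This sketch assumes an (x across-track, y depth, z along-track) axis convention and a roll-pitch-yaw rotation order, neither of which is specified by the source.

```python
import numpy as np

def level_points(points, roll, pitch, yaw):
    """Rotate (n, 3) sample coordinates from the vessel frame into a
    level world frame using roll/pitch/yaw angles in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    return np.asarray(points, dtype=float) @ (Rz @ Ry @ Rx).T
```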
  • the memory 84 may be a graphics processing memory, a video random access memory, a random access memory, system memory, cache memory, hard drive, optical media, magnetic media, flash drive, buffer, database, combinations thereof, or other now known or later developed memory device for storing data.
  • the memory 84 is part of the imaging system 80, part of a computer associated with the processor 82, part of a database, part of another system, or a standalone device.
  • the memory 84 stores echosounder data, such as water column data and/or bathymetry data.
  • the echosounder data represents a plurality of planes and/or a volume.
  • the memory 84 may alternatively or additionally store data during processing, such as storing filtered data, possible marine flare locations, extracted feature values, actual marine flare locations, cluster labels, imaging data, and/or other information discussed herein.
  • the memory 84 or other memory is alternatively or additionally a non-transitory computer readable storage medium storing data representing instructions executable by the programmed processor 82 for detecting marine flares from echosounder data.
  • the instructions for implementing the processes, methods and/or techniques discussed herein are provided on non-transitory computer-readable storage media or memories, such as a cache, buffer, RAM, removable media, hard drive or other computer readable storage media.
  • Non-transitory computer readable storage media include various types of volatile and nonvolatile storage media.
  • the functions, acts or tasks illustrated in the figures or described herein are executed in response to one or more sets of instructions stored in or on computer readable storage media.
  • processing strategies may include multiprocessing, multitasking, parallel processing, and the like.
  • the instructions are stored on a removable media device for reading by local or remote systems.
  • the instructions are stored in a remote location for transfer through a computer network or over telephone lines.
  • the instructions are stored within a given computer, CPU, GPU, or system.
  • the processor 82 is a general processor, central processing unit, control processor, graphics processor, digital signal processor, three-dimensional rendering processor, image processor, application specific integrated circuit, field programmable gate array, digital circuit, analog circuit, combinations thereof, or other now known or later developed device for detecting marine flares.
  • the processor 82 is a single device or multiple devices operating in serial, parallel, or separately.
  • the processor 82 may be a main processor of a computer, such as a laptop or desktop computer, or may be a processor for handling some tasks in a larger system, such as in an imaging system.
  • the processor 82 is configured by instructions, design, hardware, and/or software to perform the acts discussed herein.
  • the processor 82 is configured to perform the acts discussed above.
  • the processor 82 is configured to interpolate echosounder data to a regular (e.g., isotropic Cartesian coordinate) three-dimensional grid, apply one or more directional filters to the echosounder data, threshold the filtering results, apply a machine-learnt matrix and/or connected component analysis to distinguish false positives from actual marine flares (e.g., distinguish whether each possible flare voxel is part of a false positive or actual marine flare), and output the detected actual marine flares.
  • the output is of a location of a seep, a line fit to the voxels of the flare, the locations/coordinates of the flares, and/or an image of the flare.
  • the processor 82 may be configured to generate a user interface for receiving corrections or verification of classification results.
  • the processor 82 may be configured to surface or volume render an image representing the detected marine flares in a volume from one or more viewing directions.
  • the processor 82 may be configured to generate one or more graphics, such as location text or highlighting of a flare or seep.
  • the display 86 is a monitor, LCD, projector, plasma display, CRT, printer, or other now known or later developed device for outputting visual information.
  • the display 86 receives images, graphics, text, quantities, or other information from the processor 82, memory 84, or imaging system 80.
  • One or more images are displayed.
  • the images are of the water column, the seabed, text, or other representation of actual marine flares or associated seeps.
  • the image includes a location based on the classification, such as a seep location extrapolated by the processor 82 from the detected marine flare.
  • the quantity may be displayed as the image without the image representation of the marine flare or seep.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Geophysics (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

For marine flare detection from echosounder data, the echosounder data is combined to form a volume. Volume processing may be applied to identify (14) marine flares automatically. By first applying a directional filter (16), possible marine flares are identified (14) in the volume. Machine-learnt classification (22) and/or connected component analysis (24) is performed to distinguish (20) actual marine flares from non-flares in the group of possible marine flares. Any of machine-learnt classification, connected component analysis, and/or volume processing may be used alone or in various combinations. The seep location is determined by the intersection of extrapolated flares and the seafloor.

Description

AUTOMATIC MARINE FLARE OR SEEP DETECTION FROM
ECHOSOUNDER DATA
BACKGROUND
[0001] The present embodiments relate to marine flare and seafloor seep detection. When gas or oil leaks from the sea floor, bubbles are generated. The bubbles move upward from the sea floor and appear similar to a "flare" in a multibeam echosounder image. Figure 1 shows an example echosounder image with an added oval graphic 30 around a marine flare. To find a seep (i.e., root of the leaking on the sea floor) for oil and gas exploration, the flare shape (i.e., band of bubbles) is identified and located, and then extrapolated to the seafloor. To identify seepage in the sonar images, the typical workflow starts with integrated visualization of water column data and bathymetry data or other echosounder image. Then, a user scans through each echosounder image visually to find groups of bubbles, which macroscopically look like "flares" originating from the seafloor. It is, however, a tedious and time-consuming task to manually explore the hundreds of thousands of echosounder images of a survey cruise to identify the flares and locate the seeps.
[0002] Automatic algorithms that locate candidate flares can significantly improve efficiency. Such tools may save time and avoid human errors (e.g., mis-identification) and/or serve as a complementary procedure for minimizing any mistakes. In one approach, a received data quality indicator is used to display any potential flares to facilitate flare visualization. However, the data quality may be a result of many unknown factors, and the signals that pass through the flares might not be worse than the average quality, resulting in highlighting of any bad data region or in missing true flares. In another approach, high correlation spots between pings or images are considered as potential flare candidates. This method is very computationally intensive and, more importantly, is likely to fail when the acquisition resolution is coarse and the flares among different pings do not have similar geometrical properties, such as orientation and length.
BRIEF SUMMARY
[0003] By way of introduction, the preferred embodiments described below include methods, systems, instructions, and non-transitory computer readable media for marine flare detection from echosounder data. The echosounder data is combined to form a volume. Volume processing may be applied to identify marine flares automatically. By first applying a directional filter, possible marine flares are identified in the volume. Machine-learnt
classification and/or connected component analysis is performed to
distinguish actual marine flares from non-flares in the group of possible marine flares. The seep location is determined by the intersection of extrapolated flares and the seafloor. Any of machine-learnt classification, connected component analysis, and/or volume processing may be used alone or in various combinations.
[0004] In a first aspect, a method is provided for marine flare or seep detection from echosounder data. The echosounder data is obtained as a representation of a volume. A processor identifies locations for a plurality of possible marine flares or seeps represented in the volume with three-dimensional imaging processing. The processor categorizes the possible marine flares or seeps into a first group of detected marine flares or seeps and a second group of non-flares or non-seeps. The locations of the detected marine flares or seeps are output.
[0005] In a second aspect, a method is provided for marine flare detection from echosounder data. A processor identifies locations indicative of a marine flare from the echosounder data. The processor performs connected component analysis on the locations. The connected component analysis provides a value of a property of each of a plurality of clusters of the locations. The processor detects at least one of the clusters as the marine flare as a function of the value. The processor outputs the detection of the at least one of the clusters as the marine flare.
[0006] In a third aspect, a method is provided for marine flare detection from echosounder data. One or more features are extracted from the echosounder data. The features are applied to a machine-trained binary classifier. The machine-trained binary classifier outputs locations represented by the echosounder data corresponding to one or more marine flares upon positive detection.
[0007] The present invention is defined by the following claims, and nothing in this section should be taken as a limitation on those claims.
Further aspects and advantages of the invention are discussed below in conjunction with the preferred embodiments and may be later claimed independently or in combination.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] The components and the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. Moreover, in the figures, like reference numerals designate corresponding parts throughout the different views.
[0009] Figure 1 illustrates one example marine flare represented in an echosounder image;
[0010] Figure 2 is a flow chart diagram of one embodiment of a method for marine flare detection from echosounder data;
[0011] Figure 3 shows an example filter shape for directional filtering;
[0012] Figure 4 shows a filter magnitude in ping and ship movement directions;
[0013] Figure 5 shows a filter magnitude in depth and ping directions;
[0014] Figure 6 is an example result from directional filtering;
[0015] Figure 7 shows examples of possible marine flares detected from directional filtering;
[0016] Figure 8 shows an example non-flare ruled out by orientation and/or size;
[0017] Figure 9 illustrates training of a machine-learnt classifier for marine flare detection;
[0018] Figure 10 illustrates application of a machine-learnt classifier for marine flare detection;
[0019] Figure 11 is an example output showing locations of detected marine flares; and
[0020] Figure 12 is one embodiment of a system for marine flare detection from echosounder data.
DETAILED DESCRIPTION OF THE DRAWINGS AND PRESENTLY PREFERRED EMBODIMENTS
[0021] Seep detection uses directional filtering of multi-beam echosounder data representing a volume. A 3D anisotropic filter is applied to the volume data to generate candidates containing true and false flare voxels. Connected component analysis and/or learning-based detection detect the marine flares from outputs of the filtering. The connected component analysis and/or machine-learnt classifier distinguish the true flare voxels from the false positive voxels.
[0022] Figure 2 is a flow chart diagram of one embodiment of a method for marine flare detection from echosounder data. The method is implemented by the system of Figure 12, a computer, a server, or another system. For example, a boat with a multi-beam echosounder performs act 12. A computer or processor associated with a marine flare detection service or with the survey boat performs the other acts. In other embodiments, the method is implemented in a computer network.
[0023] The method includes three-dimensional imaging processing. The echosounder data represents a volume rather than separate analysis on each two-dimensional plane or ping in a series of planes or pings. The filtering, detection, or other process includes operations in three dimensions rather than repeating two-dimensional operations. For example, the filter kernel covers more than one voxel in each of three dimensions. As another example, extracted features used for classifying extend in three dimensions. In yet another example, locations connected together in connected
component analysis are distributed in three dimensions. As yet another example, the possible marine flares and/or the detected marine flares are represented as three-dimensional objects rather than two-dimensional objects (e.g., voxels distributed in three dimensions rather than just two). In alternative embodiments, one or two-dimensional processing is used in combination with three-dimensional processing or without any three- dimensional processing. For example, marine flares are detected using a single ping image or two-dimensional fan representing a plane.
[0024] The acts are performed in the order shown (e.g., top to bottom) or other orders. Additional, different, or fewer acts may be provided. For example, the method is performed without outputting the image in act 26. As another example, acts 22 and 24 are alternatives or not provided at all. In another example, acts 16 and 18 are not provided, but other identification in act 14 is performed. As yet another example, act 22 or act 24 is provided with or without any of the other acts.
[0025] In act 12, echosounder data is obtained. A multibeam echosounder is mounted on a vessel. The echosounder data is obtained by sonar pinging and receiving echoes. In other embodiments, the echosounder data is obtained by data transfer and/or loading from memory. The data is from any stage of processing, such as raw beamformed data, filtered data, motion corrected data, scan converted data, or displayed image data.
[0026] The echosounder data is water column data. The echosounder data may or may not also include bathymetry data. The water column data and bathymetry data may be combined into one representation or frame of data representing both response from the water and from the seabed or floor. Water column and bathymetry data may be maintained separately in other embodiments. Additional, different, or fewer types of echosounder data may be provided for representing acoustic response of water or objects in water.
[0027] The echosounder data represents a volume. A volume scan may be performed, such as beamforming in response to a ping or pings in three dimensions. Alternatively, the volume is formed by combining two or more two-dimensional scans of different planes. For example, as a boat moves or scans from different locations for a survey, echosounder data representing different two-dimensional fan or sector scans is acquired. Assuming straight movement of the boat, the fans represent acoustic responses from parallel planes. Non-straight boat movement may be used.
[0028] The planar scans and locations of the boat during or for each scan are used to assemble the echosounder data representing the volume. The locations represented by the planar scans as arranged based on scan location provide the volume. In other embodiments, the locations are used to interpolate to a three-dimensional or other regularly spaced grid. In either case, the data provides voxels distributed in three dimensions.

[0029] During acquisition, the vessel moves with the ocean waves arbitrarily in three dimensions. Altitude, compass, gyrocompass, gyroscope, global positioning, and/or other sensors are used to measure the vessel motion. The measurements provide pitch, roll, and yaw information. The echosounder operation may lead to varying geometry as well, such as the frequency used and/or angle of scanning. The various factors that vary between scans are accounted for in assembling the volume. The scan plane location and location of beamformed samples represented within the plane are adjusted to account for differences between the scans.
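As a rough illustration of this assembly step, the following sketch bins motion-corrected ping samples into a regular voxel grid. The point format, voxel size, and averaging of co-located samples are illustrative assumptions, not details taken from the disclosure.

```python
import numpy as np

def assemble_volume(pings, voxel_size=1.0):
    # pings: list of (N, 4) arrays with columns (x, y, z, intensity),
    # where positions are already corrected for vessel pitch/roll/yaw
    # and georeferenced using the boat location during each scan.
    pts = np.vstack(pings)
    xyz, intensity = pts[:, :3], pts[:, 3]
    origin = xyz.min(axis=0)
    idx = np.floor((xyz - origin) / voxel_size).astype(int)
    shape = tuple(idx.max(axis=0) + 1)
    total = np.zeros(shape)
    count = np.zeros(shape)
    np.add.at(total, tuple(idx.T), intensity)  # accumulate echoes per voxel
    np.add.at(count, tuple(idx.T), 1)
    # Average where samples landed; empty voxels stay zero (background).
    volume = np.divide(total, count, out=np.zeros(shape), where=count > 0)
    return volume, origin
```

A regular grid of this kind is what the subsequent three-dimensional filtering operates on; interpolation to an isotropic grid could replace the simple binning shown here.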
[0030] The three-dimensional echosounder data may represent any size water volume. In order to perform three-dimensional operations on the data, the volume encompasses the pings acquired in a given acquisition period, such as over minutes, hours, or days. The volume may be assembled from data acquired during different surveys or separated by hours or days. The location information provides the relative position for assembling the echosounder representation of the volume.
[0031] In act 14, a processor identifies possible marine flares represented in the volume by the echosounder data. The processor identification avoids or limits user or manual interaction. Rather than having a person scan through volume renderings or the component two-dimensional images, the processor performs the identification based on image processing or other data analysis of the echosounder data. The user may activate the process, but the identification is performed by the processor without user input of location information for any specific possible marine flare. User input of flare location is provided in other embodiments.
[0032] None, one, or more possible marine flares may be represented in the volume. The processor identifies these possible marine flares. Various artifacts from scanning, objects that are scanned, or other sources may result in the echosounder data representing a structure that appears as a flare. The identification is of various possible marine flares, not necessarily only actual marine flares. The identification process is tuned to distinguish as much as possible between actual marine flares and non-flares. To avoid false negatives, the identification is tuned to be overly inclusive, resulting in false positives (i.e., non-flares) being included with actual marine flares in the group of possible marine flares.
[0033] The possible flares are identified as a whole or by voxel. For example, the identification process identifies a collection of voxels
representing the flare. Edge detection, template fitting, or other global flare identification may be used. As another example, the identification process identifies each voxel that is possibly part of the flare. Each voxel is examined to determine whether the voxel is part of a possible flare. The examination may include surrounding voxels.
[0034] Any identification process may be used, such as template matching, edge detection, data quality measures, and/or correlation between data. In one embodiment represented by acts 16 and 18, the identification is by directional filtering. Additional, different, or fewer acts may be provided for identification of the possible marine flares.
[0035] In act 16, the echosounder data is directionally filtered. Filtering that emphasizes elongated or directional objects is performed. Emphasis is provided by reducing other information and/or increasing the desired information. Any now known or later developed directional filtering may be used. The echosounder data for the volume is filtered for structure at a given orientation. For example, structures having a vertical orientation are enhanced or identified. As another example, structures over a range of orientations are enhanced or identified, such as within a cone of 10 degrees from vertical. Because gas bubbles originate from the seafloor and rise to the sea surface, virtually all flares (i.e., bubble flares) are elongated along the vertical direction, with small orientation deviation. Since marine flares typically extend along the depth direction of the echosounder data, directional filtering to enhance elongated structures within a limited angle (e.g., +/- 20 degrees) of the depth dimension may be used. In yet other examples, the directional filtering is repeated for elongated structures over any number of different orientations distributed over any range (e.g., all orientations over 360-degree ranges in three dimensions with any angle step size).

[0036] In three dimensions, an elliptical cylinder may approximate the shape of the flare. Any elliptical, cylindrical, and/or other flare-approximating shapes may be used. In one embodiment, a three-dimensional anisotropic filter is used that has the shape of a tube or cylinder when the iso-surface of the filter kernel is rendered. The major axis of the shape is along the depth dimension or another angle within the search range for filtering. In general, anisotropic or directional filtering provides all pass or low pass filtering along a direction or range of directions and band or high pass filtering along other directions.
For example, low pass filtering is performed along the depth dimension and filtering as a function of a Gaussian kernel is performed along dimensions orthogonal to the depth dimension (i.e., ping and vessel travel directions). Any three-dimensional bandpass filtering may be used.
[0037] Since fan shapes result from the echosounder scanning, the volume may not be rectangular. For computer processing, rectangular structure may be desired. The background locations for which echosounder data is not available in the volume may be padded, such as by mirroring about the scan boundary. Other padding, such as applying random noise or speckle or template scan information without flare indication, may be used. The filter is convolved with the volume including the mirror or other padded information. The convolution is performed in the data or object domain, but may be performed in the frequency domain using a fast Fourier transform to accelerate computation.
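A minimal sketch of this padding and frequency-domain convolution, using SciPy; the reflect-mode padding and odd-sized kernel are assumptions chosen so the filtered output aligns with the input volume.

```python
import numpy as np
from scipy.signal import fftconvolve

def filter_volume(volume, kernel):
    # Mirror-pad the volume so the fan-shaped scan boundary does not
    # produce edge artifacts, then convolve with the directional kernel
    # in the frequency domain (faster than direct 3-D convolution).
    # Assumes odd kernel dimensions in each axis.
    pad = [(s // 2, s // 2) for s in kernel.shape]
    padded = np.pad(volume, pad, mode='reflect')
    return fftconvolve(padded, kernel, mode='valid')
```

With `mode='valid'` on the padded volume, the response has the same shape as the original volume, so voxel indices in the response map correspond directly to voxel indices in the data.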
[0038] The convolution enhances any image structures that are elliptical, cylindrical, or other elongated shape. One such example filter is illustrated in Figure 3. Figure 3 is an iso-surface plot for one example three-dimensional bandpass filter. The iso-surface plot of the filter is elongated in the depth direction y (i.e., vertical in the physical world). For the cross-section parallel to the ping plane, the filter shape follows the second derivative of the
Gaussian function as depicted in Figure 4. This second derivative of the Gaussian function is a ridge detector, but other functions may be used. Along the depth direction, lowpass filtering is applied, as depicted in Figure 5. The second order derivative of the anisotropic Gaussian function:
G(x, y, z) = exp(−x²/(2σx²) − y²/(2σy²) − z²/(2σz²))

in the ping plane orientation x, and the ship orientation z is represented as:

f(x, y, z) = C · [(x²/σx⁴ − 1/σx²) + (z²/σz⁴ − 1/σz²)] · G(x, y, z)

where C is a normalization constant, and σx, σy, σz are, respectively, the standard deviations in the x, y, and z directions. Other Gaussian or non-Gaussian filter kernels may be used.
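A tube-shaped kernel of the kind just described — a second-derivative (ridge-detecting) response across the ping plane (x) and ship track (z), with Gaussian low-pass behavior along depth (y) — can be sampled as follows. The grid size, standard deviations, the sign flip (so bright vertical tubes give a positive response), and the zero-mean normalization are illustrative assumptions rather than values from the disclosure.

```python
import numpy as np

def anisotropic_flare_kernel(size=(15, 31, 15), sigmas=(2.0, 8.0, 2.0)):
    # Axes: x = ping plane, y = depth, z = ship track.
    sx, sy, sz = sigmas
    grids = [np.arange(n) - (n - 1) / 2 for n in size]
    x, y, z = np.meshgrid(*grids, indexing='ij')
    # Anisotropic Gaussian envelope: wide along depth (low pass in y).
    g = np.exp(-(x**2 / (2 * sx**2) + y**2 / (2 * sy**2) + z**2 / (2 * sz**2)))
    # Second-derivative (ridge) terms across x and z.
    ridge = (x**2 / sx**4 - 1 / sx**2) + (z**2 / sz**4 - 1 / sz**2)
    k = -ridge * g              # negate: bright tubes -> positive response
    return k - k.mean()         # zero mean: flat regions give no response
```

Rendering an iso-surface of this kernel gives the elongated tube shape of Figure 3; the x cross-section follows the second derivative of a Gaussian as in Figure 4.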
[0039] Figure 6 shows an example output of convolution of this example filter with the echosounder data represented in Figure 1. The filtering enhances the flare structure as indicated generally at 32. While the drawings are two-dimensional, this enhancement occurs over three dimensions.
[0040] In act 18 of Figure 2, a threshold is applied to the results of the filtering. The processor identifies the voxels or locations associated with possible marine flares by thresholding the filter output. Any location having an intensity above the threshold or at and above the threshold is a possible marine flare. The locations for the possible marine flares are found as having high intensities after the directional filtering. In alternative embodiments, other processes than thresholding are used, such as performing edge detection, random walker, segmentation, or other data process for finding locations or groups of locations having greater intensity than other locations after directional filtering.
[0041] With a simple thresholding, the locations or voxels that have strong filter responses, as indicated generally by the ellipse 32 in Fig. 6, are determined. The high values in the filter response map represent the higher probability of flare. The flare candidates are collected by applying
thresholding to the filter response values. The resulting locations or voxels may be low pass filtered to remove outliers or voxels not located near other possible flare voxels.
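The thresholding and outlier removal might be sketched as follows; the neighbor-count criterion in a 3×3×3 window stands in for the low pass filtering mentioned above and is an assumption.

```python
import numpy as np
from scipy.ndimage import convolve

def flare_candidates(response, threshold, min_neighbors=2):
    # Voxels with a strong directional-filter response are possible flares.
    mask = response > threshold
    # Count candidate voxels in each 3x3x3 neighborhood (self included)
    # and drop isolated voxels with too little local support.
    support = convolve(mask.astype(int), np.ones((3, 3, 3), dtype=int),
                       mode='constant')
    return mask & (support >= min_neighbors + 1)
```

Lowering `threshold` admits more true flare voxels at the cost of more false positives, which is the trade-off the later categorization step is designed to resolve.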
[0042] The threshold value is selected to pass the true positive voxels as much as possible, so may pass false positives as well. Figure 7 shows a slice or fan scan from volume ultrasound data showing several flares coming from the sea floor. The brighter horizontal stripe at the lower part of the image corresponds to the sea floor, and the bright vertical stripes are possible flares. The elongated structures with greater intensity are identified by thresholding. Three regions of the image include one or more generally vertical, elongated structures identified by thresholding. The locations or voxels of the greater responses to the filters making up the structures are identified. In this example, the higher intensity locations in the center ellipse represent actual marine flares. The higher intensity locations in the other ellipses represent possible marine flares that are not marine flares (i.e., false positives). In order to pass all the true flare candidates, the threshold of the three-dimensional anisotropic filter is set such that false positives are identified as well.
[0043] In act 20 of Figure 2, the processor categorizes the possible marine flares as either detected or actual marine flares or non-flares (i.e., false positives). The possible marine flares are assigned to binary classes. A third class may be provided, such as non-determinative.
[0044] The categorization is for each voxel individually or for groups of voxels. The processor categorizes without user input, but input may be provided for a semi-automated categorization. Any process may be used for categorization. Acts 22 and 24 are two examples of processes used for categorizing in act 20. Different processes or combinations of multiple of the different processes may be used, such as voting or weighted voting schemes.
[0045] In act 22, the possible marine flares are categorized with a machine-trained classifier. A learning-based approach may avoid having to fine-tune a parameter to avoid false positives, especially where tuning parameters for one situation may not be appropriate for another situation (e.g., different image quality or noise levels). The machine-learnt classifier distinguishes between false positives and actual marine flares, avoiding having to set the threshold to do so. A machine-learnt classifier may handle variance in quality, noise, or other situational factors while accurately categorizing.
[0046] The machine-learnt classifier is a binary classifier trained to classify each voxel into two classes: marine flare voxel or non-flare voxel. Any machine-learning algorithm may be used; for example, a deep learning algorithm, support vector machine, neural network, sparse auto-encoder, Bayes network, decision stump, or other classifier may be trained and applied. The machine training is supervised or semi-supervised in learning a relationship between input features and the output binary classification. Unsupervised learning of the features to use may be provided. Hierarchical or other approaches may be used.
[0047] The overall training procedure is illustrated in Figure 9. Training data (marine ultrasound volumes) is collected. For the ground truth, the actual marine flares represented in the volumes are manually annotated. The volumes are filtered to identify the possible marine flare voxels. The true flare candidates, which correspond to the ground truth, are used as positive training data. The possible flares that are false positives in the training volumes are used as negative training data.
[0048] After determining the positive training voxels and negative training voxels, features are extracted to train the binary voxel classifier to separate each possible flare voxel into true flare or non-flare. The features are extracted from the original training volume data (e.g., echosounder data prior to anisotropic filtering), the filter response map (e.g., output of the directional filtering), identified locations of possible flares, and/or other information. In one embodiment, both the source echosounder data for the volume and the thresholded information after identification are used.
[0049] Any type or types of features may be used. For example, a three-dimensional patch or kernel around a voxel is used to extract the features for the voxel. Any three-dimensional features, such as Haar-like features, steerable features, and/or scale-invariant feature transform (SIFT) features, are extracted for training. In one embodiment, a high-dimensional feature space is defined as the pattern or voxels in the patch or kernel. The voxels output by the filter response are used to create the high-dimensional feature space. The filter response map is treated as a representation of a volume for extracting features. This high-dimensional feature space is used to train the voxel classifier.
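One simple way to build such a high-dimensional feature space is to flatten patches from both the original volume and the filter response map around each candidate voxel; the patch radius and the lack of boundary handling are assumptions for illustration.

```python
import numpy as np

def patch_features(volume, response, voxel, half=1):
    # Concatenate the raw-echo patch and the filter-response patch
    # around a candidate voxel into one feature vector.
    # Assumes the voxel lies at least `half` voxels from every border.
    sl = tuple(slice(c - half, c + half + 1) for c in voxel)
    return np.concatenate([volume[sl].ravel(), response[sl].ravel()])
```

With `half=1`, each voxel yields a 54-dimensional vector (two 3×3×3 patches); Haar-like or steerable features could be computed from the same patches instead.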
[0050] A processor trains the classifier from the extracted features. The machine-learnt classifier uses training data with ground truth to learn to classify based on an input feature vector. The resulting machine-trained classifier is a matrix for inputs, weighting, and combination to output a classification and/or probability of class membership.

[0051] The same or different processor applies the machine-learnt classifier. The same or selected features are extracted for application. The training may identify features to use and features not to use. For the features to use, the features are extracted from the echosounder data and/or other data to be classified. In one embodiment, the features are from both the echosounder data and a filter response of directional information in the echosounder data. In another example, the features are additionally or alternatively extracted from results of the thresholding and/or from the echosounder data without filtering. One or more (e.g., hundreds) features are extracted.
[0052] Using the matrix or matrices, the processor inputs values for extracted features derived from an echosounder volume and outputs the classification. The machine-learnt classifier categorizes voxel by voxel. The voxels being categorized are the voxels identified as possible marine flares. In response to the input, the machine-learnt classifier outputs the binary classification as either actual marine flare or false positive for each voxel.
[0053] Figure 10 represents the application of the machine-learnt classifier. An input volume or test volume of echosounder data is obtained. The three-dimensional anisotropic filter is applied to the input volume, generating the filter response map of the input volume. After applying the threshold, the flare voxel candidates are provided to the voxel classifier.
Features are extracted from the input volume, the response maps, and/or the threshold identified candidates. The classifier determines whether the possible flare voxels are true flares.
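The training-and-application flow of Figures 9 and 10 might be sketched with scikit-learn as below. The choice of gradient boosting is only one of the many learners the disclosure permits, not the prescribed one, and the two-function split is an illustrative assumption.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def train_voxel_classifier(features, labels):
    # features: (n_candidates, n_features) vectors around possible-flare
    # voxels; labels: 1 = annotated true flare, 0 = false positive.
    clf = GradientBoostingClassifier(random_state=0)
    clf.fit(features, labels)
    return clf

def classify_candidates(clf, features):
    # Binary decision per candidate voxel: True = detected marine flare.
    return clf.predict(features).astype(bool)
```

Because only thresholded candidates reach the classifier, the learner's job is the narrower one of separating true flares from the false positives admitted by the permissive threshold.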
[0054] Referring again to Figure 2, categorization is treated as a
connected component analysis in act 24. The processor performs connected component analysis on the locations of the possible marine flares. The voxels identified as possible marine flares are grouped. Any possible flare voxels directly adjacent along the three dimensions and/or other directions are connected. These connections cluster or group the voxels labeled as possible marine flares. Other distances than directly adjacent may be used in clustering.

[0055] Any number of groups may be created. Two or more groups may be part of a same flare, such as where there is a gap in the flare. Low pass filtering or further directional filtering may be applied to the grouping or identified possible locations to remove such gaps. Alternatively, further clustering connects the gaps across any separation up to a maximum distance. In other embodiments, the groups are left separate.
[0056] To categorize the voxels, the processor analyzes the groups. The member voxels or locations for each group or cluster are classified together. The connected possible flare voxels are categorized as being detected marine flares or false positives.
[0057] After grouping and labeling the neighboring voxels by group, the processor calculates cluster properties, such as area, orientation, bounding box, and/or eccentricity. Any characteristic of the cluster or group may be used. For example, the expected flare shape, size, and/or orientation are used. Additional, different, or fewer characteristics may be used.
[0058] The processor calculates the value or values of the property or properties of the group. The processor determines the value for the size, orientation (e.g., angle of the longest distance in the cluster), or other property.
[0059] The processor detects whether each group is a non-flare or actual marine flare. For example, interesting or actual flares should be elongated in the y direction (depth) with size (e.g., number of voxels or volume of cluster) greater than a certain value. Those clusters that do not satisfy these criteria are indicated as false positives. Figure 8 illustrates an initial false detection that has a pixel cluster 34 elongated horizontally. The voxels of this group are categorized as non-flare. The eccentricity property is used to categorize. The processor detects the clusters as non-flares, leaving other clusters as the actual flares. Alternatively or additionally, groups with the expected size and/or orientation are categorized as actual marine flares. The processor detects the clusters as actual marine flares, leaving the other clusters as non-flares.
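A sketch of this clustering and property-based categorization with SciPy; the 26-connectivity, minimum cluster size, and elongation ratio are illustrative thresholds, and the bounding-box extent stands in for the eccentricity measure.

```python
import numpy as np
from scipy.ndimage import label

def detect_flares(candidate_mask, min_voxels=20, min_elongation=2.0):
    # Group adjacent candidate voxels (26-connectivity), then keep only
    # clusters that are large enough and elongated along depth (axis 1).
    labels, n = label(candidate_mask, structure=np.ones((3, 3, 3)))
    flares = np.zeros_like(candidate_mask, dtype=bool)
    for lab in range(1, n + 1):
        member = labels == lab
        idx = np.argwhere(member)
        if len(idx) < min_voxels:
            continue                     # too small: reject as noise
        extent = idx.max(axis=0) - idx.min(axis=0) + 1
        depth, lateral = extent[1], max(extent[0], extent[2])
        if depth >= min_elongation * lateral:
            flares[member] = True        # vertical and elongated: flare
    return flares
```

A horizontally elongated cluster such as cluster 34 in Figure 8 fails the elongation test and is rejected, while a vertical column of voxels passes.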
[0060] The categorization, whether through machine learning or connected component analysis, uses three-dimensional information. The filtering, extracted features, and/or clustering are performed in three dimensions rather than just two. In other embodiments, the filtering operations and
categorization are performed in just two dimensions. For example, the terms associated with the vessel movement direction are eliminated. The processor automatically analyzes a given fan or sector scan representing a plane to detect the marine flare or flares. The two-dimensional processing may be useful when the processor is limited in memory and/or computational power. Real-time, on-vessel flare detection and ping image analysis may be more rapidly and/or easily provided with two-dimensional analysis. When the detection is performed in real-time during an acquisition cruise, an unmanned vehicle (UMV) or remotely operated vehicle (ROV) may be dispatched to explore the details of the marine area.
[0061] Referring again to Figure 2, the detection is output in act 26. The location or locations of the detected marine flare or flares are output. Where clustering is used, the locations are output as a flare. Where voxel-by-voxel categorization is used, the locations may be output by voxel. Since the voxels are likely adjacent each other, then the result is output as a flare.
[0062] The output is an image. For example, a three-dimensional rendering of the intensities of the flare voxels is generated from any viewing angle. A surface or volume rendering may be used. The non-flare voxels, whether from possible flares or not, may or may not be included in the rendering. For example, the detected flare voxels are further emphasized by increasing intensity, color, lighting, or other effect to stand out in the rendering of the volume. As another example, the image is rendered just from the flare voxels. In yet other embodiments, one or more planar cross-section images of the volume through the detected flare are displayed.
[0063] Figure 11 shows one embodiment for imaging the detected marine flares. In Figure 11, many detected flares exist within the volume. The flares from the water column data are rendered without data from non-flares. The seabed is also included in the rendering, such as by rendering the bathymetry data. Lines are fit to the detected flares. Overlapping or substantially (e.g., within an average distance of five or fewer voxels) collocated lines may be combined or used to define a flare to which the line is fit. Any line fitting or extrapolation of the flares to the seabed may be used. The intersection of the lines with the seabed is highlighted, such as with a dot in the rendering.
Location information for the seeps (i.e., where the line intersects the seabed) may be output as well.
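The line fitting and seabed extrapolation can be sketched with a principal-axis (least-squares) fit; the voxel-coordinate convention with depth on axis 1 is an assumption carried over from the examples above.

```python
import numpy as np

def seep_location(flare_voxels, seabed_depth):
    # Fit a 3-D line through the flare voxels (principal axis through the
    # centroid), then extrapolate it to the seabed depth to estimate
    # where the seep meets the sea floor.
    pts = np.asarray(flare_voxels, dtype=float)   # columns: x, depth, z
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]                             # dominant direction
    t = (seabed_depth - centroid[1]) / direction[1]
    return centroid + t * direction               # point at seabed depth
```

The returned point can then be converted to global positioning coordinates for output, or marked with a dot where the line meets the rendered bathymetry.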
[0064] Other outputs in addition to or as an alternative to an image or images on a display may be provided. The output may be of a geographic location (i.e., global positioning coordinates) of the seep on the seafloor or location of the detected marine flare.
[0065] Figure 12 shows a system for marine flare detection from
echosounder data. The system includes an imaging system 80, a memory 84, a processor 82, and a display 86 in a boat 88. Additional, different, or fewer components may be provided. For example, a network or network connection is provided, such as for networking from the imaging system 80 to a remote computer or server. In another example, a user interface is provided. As another example, the boat 88 and/or imaging system 80 are not provided. Instead, a computer or server detects the marine flares from received or loaded echosounder data.
[0066] The processor 82, memory 84, and display 86 are part of the imaging system 80. Alternatively, the processor 82, memory 84, and display 86 are part of a server, computer, and/or image processing system separate from the imaging system 80. In other embodiments, the processor 82, memory 84, and display 86 are a personal computer, such as desktop or laptop, a workstation, a server, a network, or combinations thereof. The processor 82, display 86, and memory 84 may be provided without other components for acquiring data by scanning the water column.
[0067] The imaging system 80 is an echosounder. A multi-beam and/or digital echosounder may be used. The echosounder transmits sonar pings into the water column. Using an array of sound transducers, echoes from a fan shaped or other planar region are received in response to a given ping. Beamforming or other processing is applied to sample the echoes from different locations (e.g., sample along the depth of each of various azimuthally spaced beams). The resulting echosounder data represents the fan or sector shaped planar region.

[0068] As the boat moves or at different boat positions, additional planar regions are sampled. As a result, echosounder data representing a volume is acquired. Alternatively or additionally, the echosounder scans in three dimensions rather than a planar region to acquire data representing a volume.
[0069] The imaging system 80 may include a global positioning system antenna and receiver. The location of the boat 88 is determined so that the locations being scanned by the echosounder are known. Other sensors may be included, such as an altimeter, gyroscope, compass, or gyrocompass, in order to adjust for differences in pitch, yaw, or roll caused by waves during volume scanning.
[0070] The memory 84 may be a graphics processing memory, a video random access memory, a random access memory, system memory, cache memory, hard drive, optical media, magnetic media, flash drive, buffer, database, combinations thereof, or other now known or later developed memory device for storing data. The memory 84 is part of the imaging system 80, part of a computer associated with the processor 82, part of a database, part of another system, or a standalone device.
[0071] The memory 84 stores echosounder data, such as water column data and/or bathymetry data. The echosounder data represents a plurality of planes and/or a volume. The memory 84 may alternatively or additionally store data during processing, such as storing filtered data, possible marine flare locations, extracted feature values, actual marine flare locations, cluster labels, imaging data, and/or other information discussed herein.
[0072] The memory 84 or other memory is alternatively or additionally a non-transitory computer readable storage medium storing data representing instructions executable by the programmed processor 82 for detecting marine flares from echosounder data. The instructions for implementing the processes, methods and/or techniques discussed herein are provided on non- transitory computer-readable storage media or memories, such as a cache, buffer, RAM, removable media, hard drive or other computer readable storage media. Non-transitory computer readable storage media include various types of volatile and nonvolatile storage media. The functions, acts or tasks illustrated in the figures or described herein are executed in response to one or more sets of instructions stored in or on computer readable storage media. The functions, acts or tasks are independent of the particular type of instructions set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro code and the like, operating alone, or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing, and the like.
[0073] In one embodiment, the instructions are stored on a removable media device for reading by local or remote systems. In other embodiments, the instructions are stored in a remote location for transfer through a computer network or over telephone lines. In yet other embodiments, the instructions are stored within a given computer, CPU, GPU, or system.
[0074] The processor 82 is a general processor, central processing unit, control processor, graphics processor, digital signal processor, three-dimensional rendering processor, image processor, application specific integrated circuit, field programmable gate array, digital circuit, analog circuit, combinations thereof, or other now known or later developed device for detecting marine flares. The processor 82 is a single device or multiple devices operating in serial, parallel, or separately. The processor 82 may be a main processor of a computer, such as a laptop or desktop computer, or may be a processor for handling some tasks in a larger system, such as in an imaging system. The processor 82 is configured by instructions, design, hardware, and/or software to perform the acts discussed herein.
[0075] The processor 82 is configured to perform the acts discussed above. In one embodiment, the processor 82 is configured to interpolate echosounder data to a regular (e.g., isotropic Cartesian coordinate) three-dimensional grid, apply one or more directional filters to the echosounder data, threshold the filtering results, apply a machine-learnt matrix and/or connected component analysis to distinguish false positives from actual marine flares (e.g., distinguish whether each possible flare voxel is part of a false positive or actual marine flare), and output the detected actual marine flares. The output is of a location of a seep, a line fit to the voxels of the flare, the locations/coordinates of the flares, and/or an image of the flare.

[0076] The processor 82 may be configured to generate a user interface for receiving corrections or verification of classification results. The processor 82 may be configured to surface or volume render an image representing the detected marine flares in a volume from one or more viewing directions. The processor 82 may be configured to generate one or more graphics, such as location text or highlighting of a flare or seep.
[0077] The display 86 is a monitor, LCD, projector, plasma display, CRT, printer, or other now known or later developed device for outputting visual information. The display 86 receives images, graphics, text, quantities, or other information from the processor 82, memory 84, or imaging system 80. One or more images are displayed. The images are of the water column, the seabed, text, or other representation of actual marine flares or associated seeps. As text, the image includes a location based on the classification, such as a seep location extrapolated by the processor 82 from the detected marine flare. The quantity may be displayed as the image without the image representation of the marine flare or seep.
[0078] While the invention has been described above by reference to various embodiments, it should be understood that many changes and modifications can be made without departing from the scope of the invention. It is therefore intended that the foregoing detailed description be regarded as illustrative rather than limiting, and that it be understood that it is the following claims, including all equivalents, that are intended to define the spirit and scope of this invention.

Claims

I (WE) CLAIM:
1. A method for marine flare or seep detection from echosounder data, the method comprising:
obtaining (12) the echosounder data as a representation of a volume;

identifying (14), by a processor with three-dimensional imaging processing, locations for a plurality of possible marine flares or seeps represented in the volume;
categorizing (20), by the processor, the possible marine flares or seeps into a first group of detected marine flares or seeps and a second group of not flares or not seeps; and
outputting (26) the locations of the detected marine flares or seeps.
2. The method of claim 1 wherein identifying (14) comprises directionally filtering (16) to identify structures elongated along a depth dimension.
3. The method of claim 2 wherein directionally filtering (16) comprises filtering (16) with an elliptical, cylindrical, or elliptical cylindrical kernel shape with a major axis along the depth dimension.
4. The method of claim 2 wherein directionally filtering (16) comprises low pass filtering (16) along the depth dimension and filtering (16) as a function of a Gaussian kernel along dimensions orthogonal to the depth dimension.
5. The method of claim 4 wherein filtering (16) as a function of a Gaussian kernel comprises filtering (16) as a function of a second derivative of an anisotropic Gaussian function.
6. The method of claim 2 wherein identifying (14) further comprises thresholding (18) results of the directional filtering (16), an output of the thresholding (18) indicating the locations of the possible marine flares.
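Claims 2-6 describe directional filtering to enhance structures elongated along the depth dimension, followed by thresholding the filter response. A minimal sketch of such a filter, assuming a NumPy volume with depth on axis 0 and using illustrative kernel widths and an illustrative threshold (not values taken from the specification), might look like:

```python
import numpy as np
from scipy import ndimage

def detect_flare_candidates(volume, depth_sigma=8.0, lateral_sigma=1.5,
                            threshold=0.05):
    """Enhance structures elongated along the depth axis (axis 0) of a 3D
    echosounder volume, then threshold the response.

    Low-pass Gaussian smoothing is applied along depth; the negated second
    derivative of an anisotropic Gaussian is applied along the two lateral
    axes, so narrow vertical columns such as bubble flares respond strongly.
    All kernel sizes and the threshold here are illustrative assumptions.
    """
    v = volume.astype(np.float64)
    sigma = (depth_sigma, lateral_sigma, lateral_sigma)
    # Negative lateral Laplacian of an anisotropic Gaussian: positive at
    # the centre of bright ridges that run along the depth dimension.
    response = -(ndimage.gaussian_filter(v, sigma, order=(0, 2, 0))
                 + ndimage.gaussian_filter(v, sigma, order=(0, 0, 2)))
    mask = response > threshold  # binary map of possible flare voxels
    return mask, response
```

The negated lateral second derivatives act as a matched filter for bright, narrow, depth-elongated columns, while the long smoothing kernel along depth suppresses isolated echoes such as fish.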
7. The method of claim 1 wherein categorizing (20) comprises categorizing (20) with a machine-trained classifier.
8. The method of claim 7 wherein categorizing (20) with the machine-trained classifier comprises categorizing (20) as a function of features from both the echosounder data and a filter response of directional information in the echosounder data.
9. The method of claim 7 wherein the echosounder data comprises voxels representing a three-dimensional volume, and wherein categorizing (20) with the machine-trained classifier comprises categorizing (20) voxel by voxel for the voxels identified as members of the possible marine flares.
10. The method of claim 1 wherein obtaining (12) comprises combining a plurality of two-dimensional scans for different planes into the volume.
11. The method of claim 1 wherein categorizing (20) comprises performing (24) connected component analysis.
12. The method of claim 11 wherein performing (24) the connected component analysis comprises grouping connected voxels labeled as possible marine flares, the grouping providing a plurality of groups as the possible marine flares.
13. The method of claim 12 wherein performing (24) the connected component analysis further comprises assigning the groups to the first group or the second group based on size and direction.
14. The method of claim 1 wherein outputting (26) comprises outputting (26) the locations as intersections of the detected marine flares with a sea bottom for the volume.
15. A method for marine flare detection from echosounder data, the method comprising:
identifying (14), by a processor, locations indicative of a marine flare from the echosounder data;
performing (24), by the processor, connected component analysis on the locations, the connected component analysis providing a value of a property of each of a plurality of clusters of the locations;
detecting (20), by the processor, at least one of the clusters as the marine flare, the detecting (20) being a function of the value of the property for the at least one of the clusters; and
outputting (26), by the processor, the detection of the at least one of the clusters as the marine flare.
16. The method of claim 15 wherein identifying (14) comprises:
directionally filtering (16) the echosounder data; and
applying (18) a threshold to results of the filtering (16).
17. The method of claim 15 wherein performing (24) the connected component analysis comprises clustering the locations and calculating a size and direction for each of the clusters, the property comprising the size or direction.
18. The method of claim 15 wherein detecting (20) comprises detecting (20) the marine flare or flares for the clusters based on size and direction and detecting (20) one or more other clusters as non-flares based on the size and direction.
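Claims 11-13 and 15-18 describe grouping candidate voxels into connected components and accepting or rejecting each cluster based on size and direction. A sketch of this step with scipy.ndimage, in which the minimum size and the verticality threshold are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def classify_clusters(mask, min_size=50, min_verticality=2.0):
    """Group connected candidate voxels and keep clusters that are both
    large enough and elongated along the depth axis (axis 0).

    Verticality is measured as depth extent divided by the largest lateral
    extent; a rising bubble flare spans many depth samples but few lateral
    ones.  Both thresholds are illustrative assumptions.
    """
    labels, n_clusters = ndimage.label(mask)
    flares, non_flares = [], []
    for lab, sl in enumerate(ndimage.find_objects(labels), start=1):
        size = int(np.sum(labels[sl] == lab))       # voxels in the cluster
        depth_extent = sl[0].stop - sl[0].start
        lateral_extent = max(sl[1].stop - sl[1].start,
                             sl[2].stop - sl[2].start)
        if size >= min_size and depth_extent / lateral_extent >= min_verticality:
            flares.append(lab)       # detected marine flare
        else:
            non_flares.append(lab)   # e.g. fish school or noise blob
    return labels, flares, non_flares
```

Each kept label corresponds to one detected flare; its lowest voxels can then be intersected with the sea bottom to report a seep location, as in claim 14.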
19. A method for marine flare detection from echosounder data, the method comprising:
extracting (14) one or more features from the echosounder data;
applying (22) the features to a machine-trained binary classifier; and
outputting (26), by the machine-trained binary classifier, locations represented by the echosounder data corresponding to one or more marine flares upon positive detection of the one or more marine flares resulting from the applying.
20. The method of claim 19 wherein extracting the one or more features comprises:
directional filtering (16) of the echosounder data;
thresholding (18) the filtered echosounder data;
extracting some of the features from results of the thresholding (18); and
extracting other of the features from the echosounder data without filtering (16).
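Claims 19-20 apply a machine-trained binary classifier to per-voxel features drawn from both the thresholded filter response and the unfiltered echosounder data. The sketch below stands in for any trained model with a fixed logistic scorer; the two features and their weights are illustrative assumptions, not the trained classifier of the specification:

```python
import numpy as np

def extract_features(raw, response, mask):
    """Per-voxel features for the candidate voxels only: the raw echo
    intensity and the directional filter response at each voxel."""
    coords = np.argwhere(mask)                           # (n, 3) voxel indices
    feats = np.column_stack([raw[mask], response[mask]])  # (n, 2) features
    return coords, feats

def classify_voxels(feats, weights=(0.5, 4.0), bias=-1.0):
    """Stand-in for a machine-trained binary classifier: a logistic score
    over the features.  The weights here are illustrative; a real system
    would learn them from annotated flare / non-flare training examples."""
    score = 1.0 / (1.0 + np.exp(-(feats @ np.asarray(weights) + bias)))
    return score > 0.5  # True => voxel labelled as part of a marine flare
```

Running the classifier only on voxels already flagged by the directional filter keeps the per-voxel classification (claim 9) tractable even for large water-column volumes.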
PCT/US2015/025068 2015-04-09 2015-04-09 Automatic marine flare or seep detection from echosounder data WO2016164021A1 (en)


Publications (1)

Publication Number Publication Date
WO2016164021A1 true WO2016164021A1 (en) 2016-10-13


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115346162A (en) * 2022-10-19 2022-11-15 南京优佳建筑设计有限公司 Indoor monitoring-based real-time monitoring method for water seepage of underground building wall


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013071185A1 (en) * 2011-11-11 2013-05-16 Exxonmobil Upstream Research Company Exploration method and system for detection of hydrocarbons

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
J. SCHNEIDER VON DEIMLING: "Flare imaging with multibeam systems: data processing for bubble detection at seeps", GEOCHEMISTRY GEOPHYSICS GEOSYSTEMS, vol. 8, no. 6, 6 June 2007 (2007-06-06), pages 1-7, XP002751540 *
S. ANAND ET AL.: "Edge detection using directional filter bank", INTERNATIONAL JOURNAL OF APPLIED INFORMATION SYSTEMS, vol. 1, no. 4, February 2012 (2012-02-01), pages 21-27, XP002751541 *



Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 15718704; country of ref document: EP; kind code: A1)
NENP Non-entry into the national phase (ref country code: DE)
122 EP: PCT application non-entry in European phase (ref document number: 15718704; country of ref document: EP; kind code: A1)