WO2010009498A1 - Method and system for autonomous habitat analysis - Google Patents


Info

Publication number: WO2010009498A1
Authority: WIPO (PCT)
Prior art keywords: region, pixel, image, location, sub
Application number: PCT/AU2009/000926
Other languages: French (fr)
Inventor: Andrew Davie
Original Assignee: Commonwealth Scientific and Industrial Research Organisation
Priority claimed from: AU2008903721A0
Application filed by: Commonwealth Scientific and Industrial Research Organisation
Publication: WO2010009498A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25: Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31: Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • G06V20/13: Satellite images
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N2021/1765: Method using an image detector and processing of image signal
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N2021/1793: Remote sensing
    • G01N2021/1797: Remote sensing in landscape, e.g. crops

Definitions

  • the present invention relates to the automated identification of materials within a region, such as by analysis of captured images of the region.
  • Preferred embodiments of the invention, as described herein, are particularly directed to autonomous mapping of marine, or benthic, habitats.
  • the invention may be applicable to other forms of automated mapping and analysis.
  • coastal and estuarine systems are coming under increasing pressure from multiple uses. These include shipping, port and industrial activities, urban development, recreational use, resource (eg oil and gas) exploration, and aquaculture. It is therefore crucial that these ecosystems are regularly monitored so that the impacts of these activities can be evaluated and sustainably managed.
  • the present invention seeks to address a number of these issues by providing methods, apparatus and systems for autonomous identification, analysis and mapping taking into account the various inaccuracies and uncertainties that may arise in the real environments in which such systems are employed.
  • the present invention provides a method of identifying the presence of materials within a predetermined region, including the steps of: capturing at least one image of a region, the image including a plurality of pixels; providing a plurality of material templates, each said template including spectral information which is characteristic of a corresponding material; comparing spectral data of each pixel in the image with each one of the plurality of material templates, to generate a corresponding plurality of match data values, wherein each said match data value is indicative of a probability that the pixel is representative of a material matching the corresponding template; and using the match data values to identify at least one material present within the region.
  • the terms "material" and "materials" are to be broadly construed within this specification, in view of the general objective of providing autonomous analysis and/or mapping in a variety of environments.
  • materials may include animal, vegetable or mineral matter, encompassing both natural and artificial substances and objects.
  • embodiments of the present invention do not seek to make an early match between captured image content and a specific corresponding material or object. Instead, information present in individual pixels of a captured image is compared with corresponding information of a wide range of possible materials, and information regarding potential matches with all of the materials is generated. This enables the data gathered from a potentially very large number of pixels, included within a potentially large number of individual images, all to be taken into account in order to identify one or more materials that may be present within the region under analysis. Accordingly, embodiments of the invention are able to mitigate the effect of reduced image quality and uncertainty by making more effective use of captured data, thereby producing useful analysis and mapping results using relatively low cost equipment.
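The per-pixel template matching described above can be sketched as follows. All template names and numeric values here are hypothetical illustrations, not data from the patent; each template holds a per-channel mean and standard deviation of sensor readings recorded under known illumination, and every template receives a match value rather than only the single best candidate:

```python
import math

# Hypothetical material templates (illustrative values only): per-channel
# mean and standard deviation of sensor readings recorded from samples
# under known illumination conditions.
TEMPLATES = {
    "sand":     {"mean": (0.80, 0.75, 0.60), "std": (0.05, 0.05, 0.05)},
    "seagrass": {"mean": (0.20, 0.45, 0.15), "std": (0.06, 0.08, 0.05)},
    "mud":      {"mean": (0.35, 0.30, 0.20), "std": (0.05, 0.05, 0.05)},
}

def gaussian_pdf(x, mean, std):
    """Probability density of x under a normal distribution."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def match_values(pixel_rgb):
    """For every template, compute a value indicative of the probability
    that the pixel was drawn from that template's distributions.
    All templates are retained; there is no early commitment to one match."""
    scores = {}
    for name, template in TEMPLATES.items():
        p = 1.0
        for value, m, s in zip(pixel_rgb, template["mean"], template["std"]):
            p *= gaussian_pdf(value, m, s)
        scores[name] = p
    return scores

scores = match_values((0.22, 0.44, 0.16))   # a greenish pixel
best = max(scores, key=scores.get)
```

Keeping the full set of match values is what later allows evidence from many pixels and many images to be accumulated statistically.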
  • each material template includes information representative of a distribution of spectral content of energy received from a sample of the corresponding material under known conditions.
  • material templates may include information relating to the spectral content of light reflected from a sample of the material under known illumination conditions.
  • the spectral content need not be limited to visible light (eg within the range of 350nm to 950nm wavelength), and some embodiments of the invention may utilise infrared and/or ultraviolet portions of the spectrum as an alternative to, or in addition to, visible light.
  • embodiments of the invention are not restricted to the use of light, and may alternatively employ high energy radiation, such as x-rays, or lower frequency radiation or other energy, such as radio waves or acoustic waves.
  • the particular spectral information utilised will depend primarily upon the characteristics of the imaging equipment used for image capture. Preferred embodiments described herein utilise commercially available trichromatic (consumer colour) digital cameras, and accordingly the main spectral range of interest is that of visible light.
  • the information in each material template may include information representative of distributions of a plurality of discrete spectral components.
  • the information may include distributions of intensity values recorded by the red, green and blue sensors of a trichromatic digital camera.
  • the recorded values may be transformed, if required, to produce derived parameters and corresponding distributions.
  • Transformed variables, for example ratios between red, green and blue intensity values, may be more practical since they may be less susceptible to variation resulting from overall changes in light intensity.
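A minimal sketch of such a transformation (the specific ratio chosen here is an assumption for illustration; the patent does not prescribe a particular transform):

```python
def intensity_ratios(r, g, b, eps=1e-12):
    """Transform raw red/green/blue sensor values into ratios that are
    largely invariant to overall changes in illumination intensity."""
    total = r + g + b + eps   # eps guards against division by zero
    return (r / total, g / total, b / total)

# Halving the overall light level scales every channel equally,
# so the transformed values are essentially unchanged.
bright = intensity_ratios(0.8, 0.4, 0.2)
dim = intensity_ratios(0.4, 0.2, 0.1)
```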
  • the step of comparing spectral data of each pixel in the image preferably includes comparing captured spectral content values of each pixel with the corresponding information representative of one or more distributions of spectral content of energy received from a sample of the corresponding material under known conditions.
  • in the case of spectral information relating to visible light utilised in the benthic habitat mapping embodiments, absorption and scattering of ambient light by water result in a substantial reduction in overall intensity (ie attenuation), as well as changes in spectral content.
  • Preferred embodiments of the invention therefore apply a correction to the captured spectral content values to account for differences between the environmental conditions and the known conditions under which the material template information is recorded.
  • a correction is based upon an image, captured within the mapping environment, of a test pattern having known spectral properties.
  • the method may include providing illumination when capturing images, in order to approximate the known conditions under which the template information was recorded.
  • additional cost and complexity may be incurred in providing suitable illumination means within the mapping environment.
  • each match data value represents a probability that the captured spectral content values are characteristic of the material corresponding with the template.
  • a preferred match data value is a probability that the captured spectral values were drawn from the corresponding distributions of the material template.
  • the step of using the match data values to identify at least one material present within the region may include identifying the most probable match, following analysis of all pixels of a plurality of images of the region. Additionally, the method may output a confidence level and/or a list of possible alternatives, along with their associated probabilities. That is, embodiments of the present invention are advantageously not limited to providing a single identification of a material at a location within the predetermined region, but are additionally able to provide an associated probability, along with indications of other possible materials that may be present.
  • the method also includes identifying a location of materials within the predetermined region, by implementing the steps of: dividing the region into a plurality of sub-regions; associating each pixel of each captured image with at least one sub-region; and using the match data values to identify at least one material present within each sub-region.
  • a corresponding weighted match data value may be accumulated for each pixel of each image captured.
  • the accumulated match data values may be used to identify at least one material present within each sub-region.
  • the location of each captured image within the predetermined region, and hence relative to each sub-region may be known only to a predetermined precision. This may be the case, for example, in benthic habitat mapping, wherein the precise location of a submersible vehicle may be difficult to maintain, due to the lack of availability of GPS signals and data when submerged.
  • weighted match data values are accumulated for each sub-region in accordance with the probability that the corresponding pixel represents a point within the sub-region. Such a probability generally declines, in accordance with the predetermined precision, as the distance between the estimated location corresponding with the pixel and any location of the sub-region increases.
  • the weighting values are the probabilities that the pixel corresponds with a location within each sub-region such that the match data values are effectively distributed over the sub-regions in accordance with the precision with which the pixel location is known.
  • the probabilities associated with particular material templates will accumulate within each sub-region in accordance with the most likely materials present within the sub-regions, enabling effective use of all of the captured data, notwithstanding the uncertainties and inaccuracies associated with pixel information and/or positional information.
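The accumulation of weighted match data over sub-regions might be sketched as follows; the grid layout, the Gaussian weighting function and all numeric values are assumptions for illustration, not taken from the patent:

```python
import math
from collections import defaultdict

SIGMA = 2.0   # assumed positional precision (metres)

def position_weight(px, py, cx, cy, sigma=SIGMA):
    """Weight reflecting the probability that a pixel whose estimated
    position is (px, py) actually lies in the sub-region centred at
    (cx, cy); it declines with distance according to sigma."""
    d2 = (px - cx) ** 2 + (py - cy) ** 2
    return math.exp(-0.5 * d2 / sigma ** 2)

def accumulate(observations, grid):
    """observations: iterable of (x, y, {material: match_value}) per pixel.
    grid: mapping of sub-region centre -> accumulated {material: weight}.
    Each pixel's match values are distributed over ALL sub-regions,
    weighted by positional probability."""
    for x, y, matches in observations:
        for centre in grid:
            w = position_weight(x, y, *centre)
            for material, match in matches.items():
                grid[centre][material] += w * match
    return grid

grid = {(0.5, 0.5): defaultdict(float), (5.5, 0.5): defaultdict(float)}
observations = [
    (0.6, 0.4, {"sand": 0.9, "seagrass": 0.1}),
    (0.7, 0.6, {"sand": 0.8, "seagrass": 0.2}),
]
grid = accumulate(observations, grid)
most_likely = {centre: max(m, key=m.get) for centre, m in grid.items()}
```

Evidence concentrates in the sub-region nearest the estimated pixel positions, while distant sub-regions receive only a small share, commensurate with the positional uncertainty.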
  • the present invention provides a method of identifying the location of materials within a predetermined region, including the steps of: capturing at least one image covering at least a part of the predetermined region, the image including a plurality of pixels, and wherein the location of the image within the region is known to within a predetermined precision; for each pixel of the image, identifying at least one material corresponding with properties of the pixel and generating, for each one of a plurality of sub-regions of the predetermined region, a location data value indicative of a probability that said at least one material is present within the sub-region, wherein said location data value is based upon an estimated distance between the pixel and the sub-region, and the predetermined precision within which the location of the image is known; and using the location data values to identify the presence of said at least one material at a location corresponding with at least one of the plurality of sub-regions of the predetermined region.
  • embodiments of the invention are able to mitigate uncertainty in the location of materials corresponding with captured images by effectively "distributing" corresponding information over a number of sub-regions of the predetermined region, commensurate with the level of uncertainty.
  • material identification information combined with location data values within each sub-region may accumulate to enable identification of the most likely materials present within the sub-regions.
  • the location data values are weighting values corresponding with a probability that the corresponding pixel represents a point within each relevant sub-region.
  • the step of identifying at least one material corresponding with properties of the pixel includes comparing spectral data of the pixel with each one of a plurality of material templates, each said template including spectral information which is characteristic of a corresponding material, in order to generate a corresponding plurality of match data values, wherein each said match data value is indicative of a probability that the pixel is representative of a material matching the corresponding template.
  • the foregoing aspects of the present invention correspond with two issues that may need to be addressed in performing autonomous mapping and analysis.
  • the first of these issues is the need to provide improved image recognition, which is addressed in accordance with an aspect of the present invention by matching spectral information from an image against a series of material templates obtained from samples of materials anticipated to be present within the relevant environment.
  • the second issue is uncertainty of location, including time-varying positional uncertainty, which is addressed in accordance with another aspect of the invention by utilising a probabilistic method based upon statistical matching of data over sub-regions of the relevant environment.
  • the invention provides a method of identifying the presence and location of materials within a predetermined region, including the steps of: capturing at least one image covering at least part of the predetermined region, the image including a plurality of pixels, and wherein the location of the image within the region is known to within a predetermined precision; providing a plurality of material templates, each said template including spectral information which is characteristic of a corresponding material; for each pixel of the image, comparing spectral data of the pixel with each one of the plurality of material templates, to generate a corresponding plurality of match data values, wherein each said match data value is indicative of a probability that the pixel is representative of a material matching the corresponding template, and generating, for each one of a plurality of sub-regions of the predetermined region, a location data value indicative of the probability that a material corresponding with the pixel is present within the sub-region, wherein said location data value is based upon an estimated distance between the pixel and the sub-region, and the predetermined precision within which the location of the image is known; and using the match data values and the location data values to identify the presence of at least one material at a location corresponding with at least one of the plurality of sub-regions.
  • the invention provides systems and apparatus implementing the above described methods.
  • computer (ie microprocessor) implemented apparatus including means, embodied in software and/or hardware, for implementing steps embodying the inventive methods.
  • the invention provides computer-implemented apparatus for identifying the presence and/or location of materials within a predetermined region, the apparatus including: at least one processor; a digital imaging device, operatively coupled to the processor, configured to capture digital images of portions of the predetermined region, each such captured image including a plurality of pixels; and at least one memory device operatively coupled to the processor for storage and processing of captured images, the at least one memory device further including computer executable instructions which, when executed by the processor, cause the processor to implement a method in accordance with an embodiment of the invention, as broadly described above.
  • the processor, imaging device and memory are incorporated within a submersible vehicle which may thereby be employed for the autonomous mapping and analysis of benthic habitats.
  • Figure 1 shows a microprocessor-based apparatus for identifying the presence and location of materials within a predetermined region, in accordance with a preferred embodiment of the invention
  • Figure 2 illustrates schematically a submersible vehicle for performing autonomous benthic habitat mapping, according to an embodiment of the invention
  • Figure 3 shows graphs illustrating attenuation of light, and corresponding intensity of available light at various depths of operation of the submersible vehicle of Figure 2;
  • Figure 4 shows graphs illustrating spectral reflectance of exemplary materials, and spectral response of a typical trichromatic digital camera, according to embodiments of the invention
  • Figure 5 illustrates schematically a system for performing autonomous benthic habitat mapping utilising the submersible vehicle of Figure 2;
  • Figure 6 is a schematic diagram illustrating classification and mapping of benthic habitat in accordance with an embodiment of the invention.
  • Figure 7 is a flow chart illustrating a method of identifying the presence and location of materials within a predetermined region, according to a preferred embodiment of the invention.
  • Figure 1 illustrates schematically a microprocessor-based apparatus for identifying the presence and/or location of materials within a defined region. More particularly, the apparatus 100 includes a microprocessor-based system 102 and an associated imaging device 110.
  • the imaging device 110 is a digital optical camera, and more particularly a digital camera of the type including a sensor consisting of a two dimensional array of trichromatic sensing elements. Images captured by such a camera consist of a corresponding two dimensional array of pixels, each of which can generally be described in terms of the red, green and blue components detected by the corresponding sensor elements.
  • the invention is not limited to this type of imaging device, and in alternative embodiments (not shown in the drawings) it is possible to utilise any imaging input that can be digitised into a suitable array of image elements (eg pixels) having associated spectral properties.
  • the microprocessor-based system 102 includes at least one processor 104, which is interfaced to, or otherwise associated with, a non-volatile memory/storage device 106.
  • the non-volatile storage 106 may be a hard disk drive, and/or may include solid-state non-volatile memory, such as read only memory (ROM), flash memory, or the like.
  • the microprocessor system 102 also includes volatile storage 108, such as random access memory (RAM) for containing program instructions and transient data relating to the operation of the system 102.
  • the storage device 106 maintains known program and data content relevant to the normal operation of the computer system 102.
  • the storage device 106 may contain operating system programs and data, as well as other executable application software necessary to the intended functions of the system 102.
  • the storage device 106 also contains program instructions which, when executed by the processor 104, enable the apparatus 100 to perform operations relating to the identification and/or location of materials within a specified region.
  • the memory 108 contains a body of program instructions 116, which may include instructions implementing operating system functions, peripheral interface functions, various conventional software application functions, as well as additional functions and operations embodying aspects of the present invention, as described below.
  • a peripheral interface 112 such as a USB or Firewire interface, enables the imaging device 110 to communicate with the computer system 102, such that the processor 104 is able to control operation of the imaging device 110, and image data is able to be returned and, for example, stored within memory 108.
  • the imaging device 110 is a digital camera sensitive to visible light, which may be of the type designed to capture still images, or may alternatively be a video camera, configured to capture sequences of images making up a moving picture. In either case, each individual digital image captured by the camera 110 may be processed, in the manner described hereafter, for the purposes of identifying the presence and/or location of materials within a specified region imaged by the camera 110.
  • the computer system 102 also includes at least one further input/output interface 114, providing access to other peripheral systems and devices. For example, an interface may be provided to user input and output devices, such as a keyboard, keypad, pointing and/or control device (eg a mouse or joystick) and/or one or more display devices. Additionally, or alternatively, a wireless or wired network interface may be provided, facilitating remote operation of the computer system 102.
  • the system 102 may also be interfaced to associated vehicle systems, such as drive systems, directional systems, and location monitoring systems (eg a GPS receiver).
  • Referring to Figure 2, there is shown schematically a submersible vehicle 200 configured to perform autonomous benthic habitat mapping.
  • the vehicle 200 is positioned above the benthic surface 202, within visual range of the camera 110, which is mounted to the vehicle 200.
  • a portion of the benthic surface, represented within the image 204, is within the field of view of the camera 110.
  • the vehicle 200 is generally operated to move over a predetermined region of the benthic surface 202, capturing images 204, which are used in subsequent processing for identifying the presence and/or location of specific materials within the predetermined benthic region.
  • the "materials" potentially of interest within the present embodiment of the invention include various animal, vegetable or mineral matter, such as sand, rocks, vegetation (eg seagrass and/or other forms of plant life), coral, crustaceans (eg crayfish, crabs etc), shellfish, starfish and so forth. Indeed, any matter having sufficiently well defined spectral properties, detectable via the camera 110, may potentially be the subject of identification.
  • a particular captured image 204 covers a portion of the benthic region 202 including two different types of material, indicated by the darker and lighter shaded regions 206, 208. In practice these may be, for example, areas of seagrass and sand respectively.
  • the submersible vehicle 200 also has mounted thereon a test pattern or colour key 210, and a mirror 212, the purpose of which will now be described with additional reference to Figure 3.
  • a graph 302 illustrating attenuation of light by water. Specifically, as light passes through a body of water, different wavelengths are absorbed and scattered at different rates, a process generally referred to as attenuation.
  • the graph 302 illustrates the proportion of light transmitted through one meter of water (Y-axis) as a function of wavelength (X-axis) over a range from 400nm (the blue end of the spectrum) to 700nm (the red end of the spectrum).
  • the amount of attenuation is dependent upon the quality and type of water, due to the presence of additional impurities which cause varying degrees of "cloudiness".
  • the curve 304 represents extremely pure oceanic water, in which attenuation is primarily caused by fundamental scattering processes, which preferentially scatter light at the red end of the visible spectrum.
  • the further curves 306, 308 respectively represent mid-latitude ocean, and medium turbidity coastal waters, with additional absorption and scattering, particularly at the blue end of the spectrum, occurring due to impurities.
  • the attenuation characteristics depicted in the graph 302 are applied to light incident on the water surface, which is subsequently transmitted to the depths at which submersible vehicle 200 may be operating.
  • the further graph 310 shows a typical daylight spectrum 312, and corresponding transmitted light intensity curves 314, 316, 318, corresponding to depths of 5 meters, 10 meters and 20 meters respectively.
  • the daylight curve at 312 is exemplary only, and the actual daylight spectrum (ambient light) depends upon a variety of factors, including cloud cover, and time of day.
  • the level of attenuation, and accordingly the exact form of the transmitted spectra 314, 316, 318 depends upon the water conditions. Accordingly, a real-time measurement of the incident light spectrum is desirable, in order to compensate for the wide range of variation that may occur when performing imaging of the benthic habitat, and which may consequently result in the misidentification of materials within the environment.
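Assuming that the per-metre transmission values compound multiplicatively with depth (Beer-Lambert behaviour), the depth dependence shown in graph 310 can be illustrated as follows; the transmission figures are invented for illustration, not taken from the patent:

```python
def transmitted_fraction(per_metre_transmission, depth_m):
    """Fraction of surface light remaining after travelling depth_m metres,
    assuming the per-metre transmission compounds multiplicatively."""
    return per_metre_transmission ** depth_m

# Illustrative per-metre transmission values (assumed, not from the patent):
# clear water passes blue light well but attenuates red strongly.
blue_5m = transmitted_fraction(0.98, 5)
red_5m = transmitted_fraction(0.70, 5)
red_20m = transmitted_fraction(0.70, 20)
```

Even modest per-metre losses compound quickly, which is why the red end of the spectrum is largely absent at typical operating depths.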
  • the test pattern 210 and mirror 212 are accordingly provided for image colour compensation.
  • the test pattern 210 includes, for example, one or more bands of predetermined colour, whose spectral properties under a specified illumination are known.
  • the mirror 212 reflects an image of the test pattern 210, illuminated by the incident light, into a corner 214 of the captured image 204.
  • the distance between the test pattern 210 and the mirror 212 is approximately double the height of the camera 110 above the benthic surface 202, such that the distance over which the incident light propagates prior to imaging is approximately equal in both cases.
  • Since the spectral properties of the test pattern 210 under specified (eg standard) illumination conditions are known, it is possible to determine a difference between the imaged spectral properties of the test pattern and the known spectral properties, and to apply this difference to the remainder of the image 204, in order to produce an image corresponding with the benthic surface 202 under hypothetical "standard" illumination conditions.
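A simple per-channel version of such a correction might look like this; the grey test pattern and the observed values are assumptions for illustration, and the patent does not specify a particular correction model:

```python
def channel_gains(observed_pattern, known_pattern):
    """Per-channel gains that map the imaged test pattern back to its
    known appearance under standard illumination."""
    return tuple(k / o for k, o in zip(known_pattern, observed_pattern))

def correct_pixel(pixel, gains):
    """Apply the same per-channel correction to any pixel of the image."""
    return tuple(p * g for p, g in zip(pixel, gains))

# Assume the test pattern is a neutral grey, (0.5, 0.5, 0.5) under standard
# light, but appears blue-shifted after attenuation by the water column.
gains = channel_gains(observed_pattern=(0.20, 0.35, 0.48),
                      known_pattern=(0.5, 0.5, 0.5))
restored = correct_pixel((0.20, 0.35, 0.48), gains)
```

Because red light is attenuated most strongly, the red channel receives the largest gain.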
  • An alternative approach which may eliminate the need for colour correction, would be to equip the submersible vehicle 200 with a suitable source of illumination (eg a spot-light) in order to illuminate the benthic surface 202.
  • Disadvantages of this approach are the additional weight, cost and power consumption of the illumination source.
  • different materials anticipated to be present within the benthic habitat are characterised by their spectral reflectance properties.
  • the spectral reflectance of a surface is the proportion of incident light reflected from the surface, as a function of wavelength.
  • the upper graph 400 depicted in Figure 4 illustrates the spectral reflectance of a number of exemplary materials.
  • the Y-axis of the graph 400 is the proportion of light reflected, while the X-axis is wavelength between 400nm and 700nm.
  • the curve 402 represents the reflectance of white sand, which has been scaled down by a factor of 10 so as to be visible on the same axis as the other curves (ie the actual reflectance of white sand is on the order of 0.4).
  • the curve 404 is the spectral reflectance of brown mud, while the curve 406 is the spectral reflectance of the seagrass species Halophila ovalis.
  • the three curves 402, 404, 406 have been determined under standard "white light" illumination.
  • Corresponding curves 408, 410, 412 are shown in dashed lines, depicting the expected spectrum at a depth of 5 meters in mid-latitude oceanic water.
  • the lower graph 414 in Figure 4 depicts the spectral response of the red (416), green (418) and blue (420) sensor elements of a typical trichromatic digital camera. Each sensor element outputs a voltage that is proportional to the total intensity of light integrated over (ie effectively filtered by) the corresponding response curve 416, 418, 420. Accordingly, under known illumination conditions, the detected red, green and blue voltages will generally be characteristic of a material having a corresponding known spectral reflectance. This observation underlies a particular feature of embodiments of the present invention, whereby material "templates" are used to characterise materials anticipated to be present within a region under mapping and analysis.
  • a corresponding material template is generated, which includes characteristic spectral information of the material.
  • Spectral data of pixels within the captured images may then be compared with the various material templates, in order to generate corresponding match data values, each of which is indicative of a probability that the pixel in question is representative of the material of the template.
  • This information may then be used to identify the presence of materials within the region under study.
  • a particular characteristic of this approach is that the comparison with an individual template is not performed with the intention of establishing an immediate one-to-one mapping between the pixel and a specific material. Rather, a probability of match is established between each pixel and all of the available material templates.
  • the aggregate effects of such matching accumulated over a number of pixels, and indeed over a number of separate images, enables statistical improvement of material identification.
  • This is preferably performed utilising a grid of cells overlaid on the benthic surface 202, as described later in more detail with reference to Figure 6.
  • an expected set of sensor readings s re d > s gr een, s b ⁇ U e, can be determined for a given material by multiplying the corresponding spectral response curves (eg 416, 418, 420) by a corresponding material reflectance ⁇ eg 402, 404, 406).
  • a spectral response curve eg 416, 418, 420
  • a corresponding material reflectance ⁇ eg 402, 404, 406
  • k is a camera-dependent constant
  • T is an exposure time
  • ⁇ red ⁇ ) is the spectral response of the red sensor
  • ⁇ a ( ⁇ ) is the spectral reflectance of a material ⁇
  • ⁇ ) is a relevant spectral attenuation function
  • Analogous equations apply for the green and blue sensors.
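The expected sensor reading described above may be evaluated numerically. The following sketch uses illustrative placeholder curves for the sensor response, material reflectance and water attenuation, not measured data:

```python
import math

# Numerical evaluation of the expected red-sensor reading, ie the integral
# over wavelength of (sensor response) x (material reflectance) x (attenuation),
# scaled by the camera constant k and exposure time T, using a simple
# rectangle rule over the visible band. All curves are placeholders.

def rho_red(lam):
    """Assumed red sensor response: Gaussian centred near 600 nm."""
    return math.exp(-((lam - 600.0) / 40.0) ** 2)

def rho_material(lam):
    """Assumed spectral reflectance of the material (eg a reddish sand)."""
    return 0.2 + 0.6 / (1.0 + math.exp(-(lam - 570.0) / 20.0))

def alpha(lam):
    """Assumed spectral attenuation of the water column."""
    return math.exp(-(lam - 400.0) / 500.0)

def expected_reading(k, T, step=1.0, lo=400.0, hi=700.0):
    """Rectangle-rule integration of the product of the three curves."""
    total = 0.0
    lam = lo
    while lam < hi:
        total += rho_red(lam) * rho_material(lam) * alpha(lam) * step
        lam += step
    return k * T * total

s_red = expected_reading(k=0.01, T=1.0 / 60.0)
```

The same evaluation, with the green and blue response curves substituted, yields the remaining two expected readings.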
  • Determining the detailed functional forms of the probability density functions for the template parameters is a practical issue that will not be addressed here in detail.
  • it is necessary to associate a measure of uncertainty with all of the measured quantities. This would ideally be done by performing representative measurements of known materials under a wide range of different conditions, taking into account all possible sources of variation.
  • a suitable probability density function, such as a normal (ie Gaussian) distribution, may be fitted to measured data, or otherwise estimated, in order to complete the material templates.
  • the probability that the observed material is a is the product of the probabilities that each of these values was obtained from that material:

p(a | v₁, v₂, v₃) ∝ p(v₁ | a) p(v₂ | a) p(v₃ | a) p(a)

where p(a) represents prior knowledge regarding the identity of the material. For example, if an image is captured at a depth of 10m, the probability of finding species that occur at depths less than 5m would be small. Alternatively, the relative frequency of occurrence of species within a particular habitat may be known, and can be incorporated into the analysis via p(a). In the absence of prior information, p(a) may be set equal to 1/n, where n is the total number of materials (ie an a priori assumption that all materials are equally likely to be observed). The foregoing result is readily derived from Bayes' theorem, under the assumption that the distributions of the transformed values are independent. This assumption considerably simplifies the analysis, and thus reduces the computational load.
  • An alternative embodiment of the invention may therefore utilise a single joint probability distribution for all of the variables, rather than three independent distributions, in constructing the material templates. In this case, since all correlations are taken into account, there would be no need to utilise transformed parameters, and the sensor readings themselves could be used as the template parameters.
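The effect of the prior knowledge described above can be illustrated with hypothetical numbers; both the likelihoods and the priors below are invented for the example:

```python
# Combining per-value likelihoods with a prior, per Bayes' theorem under the
# independence assumption. All numbers are hypothetical.

likelihoods = {
    # Product of the per-value template likelihoods for each candidate
    # material, as would be computed from the fitted distributions.
    "shallow_coral": 0.030,
    "seagrass": 0.025,
    "sand": 0.004,
}

# Prior knowledge: the image was captured at 10 m depth, so a species that
# occurs mainly above 5 m is assigned a small prior probability.
prior = {"shallow_coral": 0.05, "seagrass": 0.50, "sand": 0.45}

# Posterior (normalised): product of likelihood and prior.
unnormalised = {a: likelihoods[a] * prior[a] for a in likelihoods}
total = sum(unnormalised.values())
posterior = {a: p / total for a, p in unnormalised.items()}

# Without the prior the coral would win on likelihood alone; with the
# depth-based prior the seagrass becomes the most probable material.
best = max(posterior, key=posterior.get)
```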
  • Referring to Figure 5, there is shown schematically a system for performing autonomous benthic habitat mapping utilising the submersible vehicle 200.
  • the vehicle 200 is shown moving over the benthic surface 202. Also illustrated is the sun 504, providing a source of illumination, as well as clouds 506, representing one possible source of lighting variability. Attenuation and scattering of red, green and blue components of the ambient light is indicated by the arrows 508.
  • the vehicle 200 is depicted moving in a nominally straight path, or transect, 510.
  • signals from the civilian global positioning system are strongly attenuated in water, and it is therefore not possible to use GPS location except when the vehicle 200 is surfaced. While surfacing may therefore be used to obtain periodic accurate positional information (eg to within approximately two meters of absolute position), other methods must be used to estimate location of the vehicle 200 when submerged. For example, dead reckoning based upon inertial measurements (performed, for example, using accelerometers, gyroscopes and/or compasses) may be used to track motion, in particular velocity and heading. Integration of these quantities over time enables a position estimate to be calculated. However, inaccuracy in the measurements results in increasing positional uncertainties.
  • a further approach is visual odometry, based upon processing of successive images captured at known time intervals by downward-looking cameras.
  • the tracking of objects between images provides additional velocity and heading information.
  • this form of visual odometry is computationally intensive, and therefore consumes the processing capacity of the onboard computers, and is also largely ineffective in featureless areas.
  • dead reckoning and visual odometry can provide a positional accuracy of about 5% of the distance travelled.
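Dead reckoning of the kind described may be sketched as a simple integration of speed and heading samples over time; the 5% growth figure is taken from the passage above, and the sample values are hypothetical:

```python
import math

def dead_reckon(start, samples):
    """Integrate (speed, heading, dt) samples into an estimated position.

    start: (x, y) in metres; heading in radians from the x axis; dt in
    seconds. A crude 1-sigma positional uncertainty is also returned,
    modelled as 5% of the total distance travelled (per the accuracy
    figure quoted for dead reckoning and visual odometry)."""
    x, y = start
    distance = 0.0
    for speed, heading, dt in samples:
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        distance += speed * dt
    sigma = 0.05 * distance  # uncertainty grows with distance travelled
    return (x, y), sigma

# One minute of samples at 1 m/s on a constant heading of 0 radians:
# 60 m travelled, so roughly 3 m of accumulated 1-sigma uncertainty.
pos, sigma = dead_reckon((0.0, 0.0), [(1.0, 0.0, 1.0)] * 60)
```

This illustrates why the positional error functions broaden with time: sigma grows monotonically with distance until the next GPS fix resets it.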
  • This increasing positional uncertainty, resulting in a loss of precision in the estimated location of the vehicle 200, is represented by the two dimensional Gaussian error functions 512, 514, 516 shown in Figure 5.
  • the error function 512 is tall and narrow.
  • the error functions 514, 516 become shorter and broader.
  • the error functions 512, 514, 516 are probability density functions in respect of the location of the submersible vehicle 200.
  • integrated volumes under the error functions 512, 514, 516 may be interpreted as representing the probability that the actual location of the vehicle 200 lies within a corresponding area under the error function surfaces.
  • the actual location on the benthic surface 202 of a given pixel of a captured image will vary from the expected location determined from the current estimate of position maintained by the vehicle 200.
  • the error functions 512, 514, 516 represent the probability distributions of the actual pixel location, and preferred embodiments of the present invention apply this information in order to make more effective use of the captured image data, in the presence of positional uncertainty.
  • the manner in which classification is performed and utilised in the presence of positional uncertainty is illustrated in Figure 6.
  • the benthic surface 202 is overlaid with a virtual grid 602, which divides the total region into a number of sub-regions, each of which is (in the present example) a square or rectangular cell, eg 604.
  • Every pixel of every image captured by the camera 110 corresponds with a particular point on the benthic surface 202, and therefore falls within a specific cell 604.
  • the precise cell, eg 604, containing the location corresponding with a pixel is not accurately known.
  • a finite probability exists, for example, that the actual location is in one of the adjacent cells, or even further afield.
  • the associated probability is quantified by the error functions 512, 514, 516.
  • For each pixel, a match data value, ie an associated probability value, is computed for each material template, as indicated by the boxes 606a, 606b, 606c and so forth in Figure 6. If the location of the vehicle 200 were known precisely, these values could be uniquely assigned to a single cell, corresponding with the pixel location. Data for all pixels, within all images captured by the camera 110, could then be accumulated within appropriate registers associated with each individual cell, eg 604. Specifically, in the preferred embodiments there is maintained, for each cell, a count value and a sum value corresponding with every template. For each pixel falling within a cell, the match probabilities are accumulated within the sum value, and each time this occurs the count value is incremented. Over a large number of images and pixels, it may be anticipated that the accumulated probability (ie sum value) corresponding with the predominant material in the cell will increase relative to the sum values associated with other materials, thereby leading to improved material identification.
  • the accumulated data may alternatively be viewed as a series of layers, eg 608a, 608b, 608c and so forth, each of which represents the probabilities in all cells associated with a particular material.
  • These "classification layers" are, in effect, virtual "maps", indicating the presence of the corresponding material within corresponding cells. If represented visually, as in Figure 6, they may be interpreted as indicating the probability that the corresponding area on the benthic surface 202 contains the material or object represented by the layer.
  • the way in which the count and sum values associated with each layer are maintained is modified appropriately.
  • rather than being assigned uniquely to a single cell, the associated match data values (ie template probabilities) are distributed over all of the cells covered by the error functions (eg 512, 514, 516), in proportion to the probability that the pixel lies within each particular cell.
  • this probability is readily calculated from the positional error distribution functions. Specifically, both the probability values, and the amount by which the counter value is incremented, are reduced in proportion to the probability that the pixel is actually located within each cell.
  • the classification data is distributed over the entire area of uncertainty in position, such that the total contribution of each individual pixel is independent of uncertainty in position. That is, consistently with the fact that the total probability that the vehicle 200 is within the entire region is 100%, the total contribution of classification of each individual pixel to the complete cell map is also 100%.
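The weighted accumulation described in the preceding paragraphs may be sketched as follows. The cell size, the isotropic Gaussian positional model, and the material names are illustrative assumptions:

```python
import math
from collections import defaultdict

CELL = 1.0  # cell size in metres (illustrative)

def cell_weights(est_xy, sigma, radius=3):
    """Probability that the true pixel location lies in each nearby cell.

    The positional error is modelled as an isotropic 2D Gaussian about the
    estimated location; the density is sampled at cell centres and then
    normalised so the weights sum to 1, ensuring the pixel's total
    contribution is always 100% regardless of positional uncertainty."""
    cx, cy = int(est_xy[0] // CELL), int(est_xy[1] // CELL)
    weights = {}
    for i in range(cx - radius, cx + radius + 1):
        for j in range(cy - radius, cy + radius + 1):
            dx = (i + 0.5) * CELL - est_xy[0]
            dy = (j + 0.5) * CELL - est_xy[1]
            weights[(i, j)] = math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma))
    total = sum(weights.values())
    return {c: w / total for c, w in weights.items()}

# Per-cell registers: a sum value and a count value for every template.
sums = defaultdict(float)    # keyed by (cell, material)
counts = defaultdict(float)  # keyed by (cell, material)

def accumulate(est_xy, sigma, match_values):
    """Distribute one pixel's match data values over the uncertain cells."""
    for cell, w in cell_weights(est_xy, sigma).items():
        for material, p in match_values.items():
            sums[(cell, material)] += w * p
            counts[(cell, material)] += w

accumulate((10.3, 4.7), sigma=1.5, match_values={"seagrass": 0.8, "sand": 0.2})
```

Because the weights are normalised, the counts accumulated for each material across all cells total exactly one per pixel, as described above.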
  • At step 702 an image is captured. Correction of colours in the image, as described previously with reference to Figures 2 and 3, is performed (if required) at step 704.
  • the pixel data is compared with the various material templates in order to produce match data values (ie probabilities, or classification data), in step 706.
  • This value must then be distributed over all of the affected cells, in accordance with the positional uncertainty.
  • an appropriate weighting is computed at step 708, utilising the positional error function. This weighting is then applied to the calculated match data values for each template, and accumulated within the corresponding template registers for the cell in question, at steps 708 and 710. If not all cells have been considered (decision step 712), then steps 708 and 710 are repeated for the next cell in the array.
  • the comparison step 706, and accumulation steps 708, 710, 712 are repeated for all materials (decision step 714). These processes are also repeated for every further pixel within the image (decision step 716).
  • mapping may be performed by capturing another image, and recommencing the entire process (decision step 718).
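The overall loop of steps 702 to 718 may be sketched as follows. Every step function here is a deliberately trivial stub standing in for the operations described above (single-channel "pixels", a toy matching rule, and a fixed two-cell weighting), so that the control flow, rather than the arithmetic, is what the sketch shows:

```python
# End-to-end sketch of the processing loop, with stubbed-out steps.

def colour_correct(image):
    return image  # step 704 (stub: assume no correction is required)

def match(pixel, template):
    # step 706 (stub): the closer the pixel value to the template value,
    # the higher the match data value.
    return max(0.0, 1.0 - abs(pixel - template))

def weights_for(position):
    # step 708 (stub): split a pixel's contribution over two adjacent cells.
    return {position: 0.75, position + 1: 0.25}

TEMPLATES = {"sand": 0.8, "seagrass": 0.3}
registers = {}  # (cell, material) -> [sum value, count value]

def process(images):
    for position, image in enumerate(images):       # step 718: next image
        image = colour_correct(image)               # step 704
        for pixel in image:                         # step 716: next pixel
            for material, t in TEMPLATES.items():   # step 714: next material
                p = match(pixel, t)                 # step 706
                for cell, w in weights_for(position).items():  # steps 708-712
                    s = registers.setdefault((cell, material), [0.0, 0.0])
                    s[0] += w * p                   # step 710: accumulate sum
                    s[1] += w                       # step 710: accumulate count

process([[0.78, 0.82], [0.31]])  # two toy single-channel "images"
```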
  • mapping of an area may be conducted autonomously, such that a very large amount of data may be efficiently collected in a manner which is not labour intensive, and which may therefore be repeated at more regular and frequent intervals than has previously been practical. Due to the statistical methods utilised by embodiments of the invention, all of the captured data may effectively be utilised in the construction of a map comprising an array of template matching data for a range of anticipated material contents, within an array of cells, or sub-regions, within the mapped region. It is possible, in principle, for mapping to be performed at a resolution that is greater than the available positional accuracy of the vehicle.
  • cells having dimensions of as little as 0.5 or one meter may be utilised, even when the best available positional accuracy (eg via GPS) is two meters. If the uncertainty in positioning is genuinely random, and does not exhibit strong correlations over time, multiple passes over a region capturing additional image data will cause the uncertainties to average out, such that the classification data associated with the predominant material within each cell will increase relative to that of other materials that are not present in the cell, or that are present in significantly smaller quantities.
  • For each cell there will be available an array of accumulated match data values, ie probabilities of classifications, corresponding with the various anticipated materials for which material templates exist. These values may be interpreted variously as the probability that the corresponding area on the benthic surface 202 includes the corresponding material, or the proportion of the surface within the associated cell that is covered by the corresponding material. (The extent to which these interpretations of the statistical data are valid may depend upon the quantity of data gathered, as well as the nature of the various uncertainties.)
  • the accumulated data may be utilised to generate maps showing the location of specific materials within the overall surveyed area. It is also possible, for each cell, to identify more than one material that may be present. For example, if the accumulated match data values associated with two different material templates in a particular cell are approximately equal, then those two materials may be present with approximately equal probability, or in approximately equal quantity. A list of all possible materials present in a cell, along with associated probabilities or proportions, may also be generated. In addition, data from groups of cells may be utilised in order to improve classification and mapping.
  • if the data for a particular cell indicates comparable probabilities of the presence, for example, of a variety of seagrass and of sand, and the surrounding cells all indicate a high probability of presence of the seagrass, then it may be reasonable to conclude that the initial cell should be classified as containing seagrass.
  • transects may be used as an alternative means of addressing a lack of availability of accurate underwater positional information.
  • a transect involves the submersible vehicle 200 making a substantially straight-line underwater pass, capturing images at regular intervals, along with corresponding time stamps.
  • the vehicle 200 may surface in order to establish a GPS fix. This enables a straight-line interpolation to be performed between the starting and finishing GPS positions, and each of the time stamped images can be assigned an interpolated position along the transect. All of the images may then be post processed, in the manner previously described, in order to develop the classification map.
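The straight-line interpolation between the two surfacing fixes may be sketched as follows; the coordinate values and timestamps are hypothetical:

```python
def interpolate_positions(fix_start, fix_end, timestamps):
    """Assign each time-stamped image a position along the transect.

    fix_start, fix_end: ((x, y), t) GPS fixes at the start and end
    surfacings, in metres and seconds. timestamps: capture times of the
    images, lying between the two fixes. Positions are linearly
    interpolated in time along the straight line joining the fixes."""
    (x0, y0), t0 = fix_start
    (x1, y1), t1 = fix_end
    positions = []
    for t in timestamps:
        f = (t - t0) / (t1 - t0)  # fraction of the transect completed
        positions.append((x0 + f * (x1 - x0), y0 + f * (y1 - y0)))
    return positions

# Fixes 100 s apart; images captured every 25 s along the transect.
pts = interpolate_positions(((0.0, 0.0), 0.0), ((100.0, 50.0), 100.0),
                            [25.0, 50.0, 75.0])
```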
  • this method requires the provision of sufficient storage capacity within the computer system 102 of the submersible vehicle 200 to store all of the captured images for processing.
  • images may be captured and stored, for example along a transect as described above, and saved for later processing. That is, all of the image analysis may be performed at a subsequent time, and need not necessarily be performed in real-time, on board a submersible vehicle.
  • the vehicle may also be provided with a wireless transmitter, such that when surfacing for a positional fix it is also able to transmit stored images and other data to a remote location, such as a computer on board a supervising vessel.
  • Yet another alternative for such communications is a permanent wired connection between a submersible vehicle 200 and a surface vessel, such as a ship, for example via a long, lightweight cable.

Abstract

A method and apparatus for identifying the presence of materials within a predetermined region including capturing at least one digital image of a region and providing a number of material templates, each of which includes spectral information characteristic of a corresponding material, and identifying at least one material present within the region by comparing spectral data of each pixel in the image with each of the material templates to generate corresponding match data values indicative of the probability that the pixel is representative of a material matching the corresponding template. A method is also disclosed for locating materials within an image including, for each pixel, identifying at least one material corresponding with properties of the pixel and generating, for each one of a number of sub-regions of the predetermined region, a location data value indicative of a probability that the material is present within the sub-region. The invention is particularly applicable for automated mapping using an autonomous submersible vehicle equipped with a digital camera.

Description

METHOD AND SYSTEM FOR AUTONOMOUS HABITAT ANALYSIS

FIELD OF THE INVENTION
The present invention relates to the automated identification of materials within a region, such as by analysis of captured images of the region. Preferred embodiments of the invention, as described herein, are particularly directed to autonomous mapping of marine, or benthic, habitats. However, the invention may be applicable to other forms of automated mapping and analysis.

BACKGROUND OF THE INVENTION
There are many applications for automated image analysis in general, and for the autonomous analysis of images of regions of the environment in particular. For example, many natural environments are subject to ongoing change as a result of various processes and activities, including human activities such as industrialisation, urban development, recreational uses, and so forth. Monitoring environmental changes, particularly when these may be affected by human activities, is particularly important for the understanding and/or control of human impact upon the environment. Unfortunately, such monitoring can require intensive manual intervention, and is therefore expensive and time consuming.
As a particular example, coastal and estuarine systems are coming under increasing pressure from multiple uses. These include shipping, port and industrial activities, urban development, recreational use, resource (eg oil and gas) exploration, and aquaculture. It is therefore crucial that these ecosystems are regularly monitored so that the impacts of these activities can be evaluated and sustainably managed.
Traditional monitoring approaches are labour intensive, relying upon divers conducting surveys, or manual classification of video and still images of the monitored environments. Due to the expense and time required, such activities are generally undertaken infrequently and irregularly. Furthermore, manual surveys and classification are subject to errors and/or inconsistencies. The information obtained from such surveys is accordingly insufficient for management authorities to assess and respond to impacts, and to monitor developments in the benthic environment on seasonal or annual time scales.
There is accordingly a need for improved automated approaches to environmental monitoring which are sufficiently cost-effective, accurate and efficient to enable useful monitoring to be undertaken with significantly increased frequency and regularity.
The case of autonomous analysis and/or mapping of marine environments, such as benthic habitats, is exemplary of many difficult issues that may need to be addressed in providing effective automated monitoring. This environment presents significant challenges. For example, it is inevitable that submersible vehicles are required for the gathering of raw data. The cost of purchasing, managing and maintaining such vehicles is therefore a concern, and any autonomous analysis system should preferably be able to operate effectively using relatively low cost equipment.
Further particular difficulties encountered within the marine environment include issues relating to the quality and quantity of available light for imaging, due to attenuation (eg scattering) of ambient light by water, difficulties of maintaining accurate location due to currents and waves, along with the unavailability of GPS signals under water, and additional causes of poor visibility, such as the presence of sand and silt.
The present invention seeks to address a number of these issues by providing methods, apparatus and systems for autonomous identification, analysis and mapping taking into account the various inaccuracies and uncertainties that may arise in the real environments in which such systems are employed.

SUMMARY OF THE INVENTION
In one aspect, the present invention provides a method of identifying the presence of materials within a predetermined region, including the steps of: capturing at least one image of a region, the image including a plurality of pixels; providing a plurality of material templates, each said template including spectral information which is characteristic of a corresponding material; comparing spectral data of each pixel in the image with each one of the plurality of material templates, to generate a corresponding plurality of match data values, wherein each said match data value is indicative of a probability that the pixel is representative of a material matching the corresponding template; and using the match data values to identify at least one material present within the region.
It should be understood that the words "material" and "materials" are to be broadly construed within this specification, in view of the general objective of providing autonomous analysis and/or mapping in a variety of environments. In particular, "materials" may include animal, vegetable or mineral matter, encompassing both natural and artificial substances and objects.
Advantageously, and in contrast to prior art approaches, embodiments of the present invention do not seek to make an early match between captured image content and a specific corresponding material or object. Instead, information present in individual pixels of a captured image is compared with corresponding information of a wide range of possible materials, and information regarding potential matches with all of the materials is generated. This enables the data gathered from a potentially very large number of pixels, included within a potentially large number of individual images, all to be taken into account in order to identify one or more materials that may be present within the region under analysis. Accordingly, embodiments of the invention are able to mitigate the effect of reduced image quality and uncertainty by making more effective use of captured data, thereby producing useful analysis and mapping results using relatively low cost equipment.
Particularly preferred embodiments of the invention are implemented via the capture of images using a camera fixed to a submersible vehicle, in order to perform mapping of benthic habitats. However, it will be appreciated that the invention has other applications, such as (but not limited to) aerial mapping. In a preferred embodiment, each material template includes information representative of a distribution of spectral content of energy received from a sample of the corresponding material under known conditions. For example, material templates may include information relating to the spectral content of light reflected from a sample of the material under known illumination conditions. However, the spectral content need not be limited to visible light (eg within the range of 350nm to 950nm wavelength), and some embodiments of the invention may utilise infrared and/or ultraviolet portions of the spectrum as an alternative to, or in addition to, visible light. Furthermore, embodiments of the invention are not restricted to the use of light, and may alternatively employ high energy radiation, such as x-rays, or lower frequency radiation or other energy, such as radio waves or acoustic waves. The particular spectral information utilised will depend primarily upon the characteristics of the imaging equipment used for image capture. Preferred embodiments described herein utilise commercially available trichromatic (consumer colour) digital cameras, and accordingly the main spectral range of interest is that of visible light.
The information in each material template may include information representative of distributions of a plurality of discrete spectral components. For example, the information may include distributions of intensity values recorded by the red, green and blue sensors of a trichromatic digital camera. The recorded values may be transformed, if required, to produce derived parameters and corresponding distributions. Transformed variables, for example ratios between red, green and blue intensity values, may be more practical since they may be less susceptible to variation resulting from overall changes in light intensity.
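Intensity-invariant transformed parameters of the kind mentioned, for example chromaticity-style ratios, may be computed as in the following minimal sketch:

```python
def chromaticity(r, g, b):
    """Transform raw sensor readings into intensity-normalised ratios.

    Dividing each channel by the total intensity makes the transformed
    parameters invariant to overall changes in light level (only two of
    the three ratios are independent, since they sum to 1)."""
    total = r + g + b
    return (r / total, g / total, b / total)

bright = chromaticity(0.60, 0.30, 0.10)
dim = chromaticity(0.06, 0.03, 0.01)  # same material, 10x less light
```

The bright and dim readings map to the same transformed parameters, illustrating why such ratios may be less susceptible to overall intensity variation than the raw sensor values.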
The step of comparing spectral data of each pixel in the image preferably includes comparing captured spectral content values of each pixel with the corresponding information representative of one or more distributions of spectral content of energy received from a sample of the corresponding material under known conditions.
One practical difficulty that may arise is that the conditions under which images are captured may differ significantly from the known conditions under which the material template data is recorded. For example, in the case of spectral information relating to visible light utilised in benthic habitat mapping, absorption and scattering of ambient light by water results in substantial reduction in overall intensity (ie attenuation), as well as changes in spectral content. Preferred embodiments of the invention therefore apply a correction to the captured spectral content values to account for differences between the environmental conditions and the known conditions under which the material template information is recorded. In one embodiment, a correction is based upon an image, captured within the mapping environment, of a test pattern having known spectral properties. Alternatively, the method may include providing illumination when capturing images, in order to approximate the known conditions under which the template information was recorded. Of course, additional cost and complexity may be incurred in providing suitable illumination means within the mapping environment.
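A test-pattern based correction may be sketched as a per-channel gain. This is a minimal model assuming attenuation acts multiplicatively on each channel, and it ignores the dependence of attenuation on distance and depth; the patch and pixel values are hypothetical:

```python
def correction_factors(known_rgb, observed_rgb):
    """Per-channel gains mapping observed test-pattern values back to the
    known reference values recorded under the template conditions."""
    return tuple(k / o for k, o in zip(known_rgb, observed_rgb))

def correct(pixel, gains):
    """Apply the gains to a captured pixel."""
    return tuple(p * g for p, g in zip(pixel, gains))

# A white patch known to be (1.0, 1.0, 1.0) is observed underwater with the
# red channel strongly attenuated; every captured pixel is then corrected
# with the resulting gains before template matching.
gains = correction_factors((1.0, 1.0, 1.0), (0.25, 0.60, 0.80))
pixel = correct((0.10, 0.30, 0.40), gains)
```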
In preferred embodiments, each match data value represents a probability that the captured spectral content values are characteristic of the material corresponding with the template. Expressed in statistical terms, a preferred match data value is a probability that the captured spectral values were drawn from the corresponding distributions of the material template.
The step of using the match data values to identify at least one material present within the region may include identifying the most probable match, following analysis of all pixels of a plurality of images of the region. Additionally, the method may output a confidence level and/or a list of possible alternatives, along with their associated probabilities. That is, embodiments of the present invention are advantageously not limited to providing a single identification of a material at a location within the predetermined region, but are additionally able to provide an associated probability, along with indications of other possible materials that may be present. In particularly preferred embodiments, the method also includes identifying a location of materials within the predetermined region, by implementing the steps of: dividing the region into a plurality of sub-regions; associating each pixel of each captured image with at least one sub-region; and using the match data values to identify at least one material present within each sub-region.
For example, for each sub-region and each material template, a corresponding weighted match data value may be accumulated for each pixel of each image captured. Advantageously, at any time, and in particular at the conclusion of imaging and processing, the accumulated match data values may be used to identify at least one material present within each sub-region. In some embodiments, the location of each captured image within the predetermined region, and hence relative to each sub-region, may be known only to a predetermined precision. This may be the case, for example, in benthic habitat mapping, wherein the precise location of a submersible vehicle may be difficult to maintain, due to the lack of availability of GPS signals and data when submerged. While other positioning methods may be used, such as dead reckoning and/or visual odometry, location errors will tend to accumulate, and accordingly the precision within which the location of the vehicle is known will tend to decrease with time and/or distance. In order to mitigate location uncertainty, in preferred embodiments weighted match data values are accumulated for each sub-region in accordance with the probability that the corresponding pixel represents a point within the sub-region. Such probability generally declines as a distance between the estimated location corresponding with a pixel, and any location of the sub-region increases, in accordance with the predetermined precision.
In particularly preferred embodiments, the weighting values are the probabilities that the pixel corresponds with a location within each sub-region such that the match data values are effectively distributed over the sub-regions in accordance with the precision with which the pixel location is known. Advantageously, over a large number of pixels and a large number of captured images, the probabilities associated with particular material templates will accumulate within each sub-region in accordance with the most likely materials present within the sub-regions, enabling effective use of all of the captured data, notwithstanding the uncertainties and inaccuracies associated with pixel information and/or positional information.
In another aspect, the present invention provides a method of identifying the location of materials within a predetermined region, including the steps of: capturing at least one image covering at least a part of the predetermined region, the image including a plurality of pixels, and wherein the location of the image within the region is known to within a predetermined precision; for each pixel of the image, identifying at least one material corresponding with properties of the pixel and generating, for each one of a plurality of sub-regions of the predetermined region, a location data value indicative of a probability that said at least one material is present within the sub-region, wherein said location data value is based upon an estimated distance between the pixel and the sub-region, and the predetermined precision within which the location of the image is known; and using the location data values to identify the presence of said at least one material at a location corresponding with at least one of the plurality of sub-regions of the predetermined region.
Advantageously, in this aspect, embodiments of the invention are able to mitigate uncertainty in the location of materials corresponding with captured images by effectively "distributing" corresponding information over a number of sub-regions of the predetermined region, commensurate with the level of uncertainty. Over a large number of pixels and a large number of captured images, material identification information combined with location data values within each sub-region may accumulate to enable identification of the most likely materials present within the sub-regions.
In a particularly preferred embodiment, the location data values are weighting values corresponding with a probability that the corresponding pixel represents a point within each relevant sub-region.
Preferably, the step of identifying at least one material corresponding with properties of the pixel includes comparing spectral data of the pixel with each one of a plurality of material templates, each said template including spectral information which is characteristic of a corresponding material, in order to generate a corresponding plurality of match data values, wherein each said match data value is indicative of a probability that the pixel is representative of a material matching the corresponding template.
It will accordingly be appreciated that the foregoing aspects of the present invention correspond with two issues that may need to be addressed in performing autonomous mapping and analysis. The first of these issues is the need to provide improved image recognition, which is addressed in accordance with an aspect of the present invention by matching spectral information from an image against a series of material templates obtained from samples of materials anticipated to be present within the relevant environment. The second issue is uncertainty of location, including time-varying positional uncertainty, which is addressed in accordance with another aspect of the invention by utilising a probabilistic method based upon statistical matching of data over sub-regions of the relevant environment.
It is particularly advantageous that both of the aforementioned aspects be combined in a single embodiment, in order to mitigate the two major sources of uncertainty in the raw data, namely uncertainty regarding the correct identification of materials, and uncertainty regarding the location of the captured image.
Accordingly, in yet another aspect the invention provides a method of identifying the presence and location of materials within a predetermined region, including the steps of: capturing at least one image covering at least part of the predetermined region, the image including a plurality of pixels, and wherein the location of the image within the region is known to within a predetermined precision; providing a plurality of material templates, each said template including spectral information which is characteristic of a corresponding material; for each pixel of the image: comparing spectral data of the pixel with each one of the plurality of material templates, to generate a corresponding plurality of match data values, wherein each said match data value is indicative of a probability that the pixel is representative of a material matching the corresponding template, and generating, for each one of a plurality of sub-regions of the predetermined region, a location data value indicative of the probability that a material corresponding with the pixel is present within the sub-region, wherein said location data value is based upon an estimated distance between the pixel and the sub-region, and the predetermined precision within which the location of the image is known; and combining the location data values and the match data values to identify at least one material present within the predetermined region, at a location corresponding with at least one of the plurality of sub-regions.
In further aspects, the invention provides systems and apparatus implementing the above described methods. In particular, computer (ie microprocessor) implemented apparatus is provided, including means, embodied in software and/or hardware, for implementing steps embodying the inventive methods.
In a particular further aspect, the invention provides computer-implemented apparatus for identifying the presence and/or location of materials within a predetermined region, the apparatus including: at least one processor; a digital imaging device, operatively coupled to the processor, configured to capture digital images of portions of the predetermined region, each such captured image including a plurality of pixels; and at least one memory device operatively coupled to the processor for storage and processing of captured images, the at least one memory device further including computer executable instructions which, when executed by the processor, cause the processor to implement a method in accordance with an embodiment of the invention, as broadly described above.
In a particularly preferred embodiment, the processor, imaging device and memory are incorporated within a submersible vehicle which may thereby be employed for the autonomous mapping and analysis of benthic habitats. Further preferred features and advantages of the present invention will be apparent to those skilled in the art from the following description of preferred embodiments of the invention, which should not be considered to be limiting of the scope of the invention as defined in any of the preceding statements, or in the claims appended hereto.

BRIEF DESCRIPTION OF THE DRAWINGS
Preferred embodiments of the invention are described with reference to the accompanying drawings, in which like reference numerals refer to like features, and wherein:
Figure 1 shows a microprocessor-based apparatus for identifying the presence and location of materials within a predetermined region, in accordance with a preferred embodiment of the invention;

Figure 2 illustrates schematically a submersible vehicle for performing autonomous benthic habitat mapping, according to an embodiment of the invention;
Figure 3 shows graphs illustrating attenuation of light, and corresponding intensity of available light at various depths of operation of the submersible vehicle of Figure 2;
Figure 4 shows graphs illustrating spectral reflectance of exemplary materials, and spectral response of a typical trichromatic digital camera, according to embodiments of the invention;

Figure 5 illustrates schematically a system for performing autonomous benthic habitat mapping utilising the submersible vehicle of Figure 2;
Figure 6 is a schematic diagram illustrating classification and mapping of benthic habitat in accordance with an embodiment of the invention; and
Figure 7 is a flow chart illustrating a method of identifying the presence and location of materials within a predetermined region, according to a preferred embodiment of the invention.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Figure 1 illustrates schematically a microprocessor-based apparatus for identifying the presence and/or location of materials within a defined region. More particularly, the apparatus 100 includes a microprocessor-based system 102 which is interfaced to an imaging device 110. In the apparatus 100 shown in Figure 1, the imaging device 110 is a digital optical camera, and more particularly a digital camera of the type including a sensor consisting of a two dimensional array of trichromatic sensing elements. Images captured by such a camera consist of a corresponding two dimensional array of pixels, each of which can generally be described in terms of the red, green and blue components detected by the corresponding sensor elements. However, the invention is not limited to this type of imaging device, and in alternative embodiments (not shown in the drawings) it is possible to utilise any imaging input that can be digitised into a suitable array of image elements (eg pixels) having associated spectral properties.
The microprocessor-based system 102 includes at least one processor 104, which is interfaced to, or otherwise associated with, a non-volatile memory/storage device 106. The non-volatile storage 106 may be a hard disk drive, and/or may include solid-state non-volatile memory, such as read only memory (ROM), flash memory, or the like. The microprocessor system 102 also includes volatile storage 108, such as random access memory (RAM) for containing program instructions and transient data relating to the operation of the system 102. In a conventional configuration, the storage device 106 maintains known program and data content relevant to the normal operation of the computer system 102. For example, the storage device 106 may contain operating system programs and data, as well as other executable application software necessary to the intended functions of the system 102. In the embodiment shown, the storage device 106 also contains program instructions which, when executed by the processor 104, enable the apparatus 100 to perform operations relating to the identification and/or location of materials within a specified region. In use, accordingly, the memory 108 contains a body of program instructions 116, which may include instructions implementing operating system functions, peripheral interface functions, various conventional software application functions, as well as additional functions and operations embodying aspects of the present invention, as described below. A peripheral interface 112, such as a USB or Firewire interface, enables the imaging device 110 to communicate with the computer system 102, such that the processor 104 is able to control operation of the imaging device 110, and image data is able to be returned and, for example, stored within memory 108. 
As noted above, and described herein, the imaging device 110 is a digital camera sensitive to visible light, which may be of the type designed to capture still images, or may alternatively be a video camera, configured to capture sequences of images making up a moving picture. In either case, each individual digital image captured by the camera 110 may be processed, in the manner described hereafter, for the purposes of identifying the presence and/or location of materials within a specified region imaged by the camera 110.
The computer system 102 also includes at least one further input/output interface 114, providing access to other peripheral systems and devices. For example, an interface may be provided to user input and output devices, such as a keyboard, keypad, pointing and/or control device (eg a mouse or joystick) and/or one or more display devices. Additionally, or alternatively, a wireless or wired network interface may be provided, facilitating remote operation of the computer system 102. When the apparatus 100 is incorporated within an autonomous device, such as will be described below with reference to Figure 2, the system 102 may also be interfaced to associated vehicle systems, such as drive systems, directional systems, and location monitoring systems (eg a GPS receiver). Various options for implementation of apparatus 100 falling within the scope of the present invention will be readily apparent to persons skilled in the art of digital and computer systems engineering.
Turning now to Figure 2, there is shown schematically a submersible vehicle 200 configured to perform autonomous benthic habitat mapping. The vehicle 200 is positioned above the benthic surface 202, within visual range of the camera 110, which is mounted to the vehicle 200. A portion of the benthic surface, represented within the image 204, is within the field of view of the camera 110. The vehicle 200 is generally operated to move over a predetermined region of the benthic surface 202, capturing images 204, which are used in subsequent processing for identifying the presence and/or location of specific materials within the predetermined benthic region. The "materials" potentially of interest within the present embodiment of the invention include various animal, vegetable or mineral matter, such as sand, rocks, vegetation (eg seagrass and/or other forms of plant life), coral, crustaceans (eg crayfish, crabs etc), shellfish, starfish and so forth. Indeed, any matter having sufficiently well defined spectral properties, detectable via the camera 110, may potentially be the subject of identification.
As shown in Figure 2, a particular captured image 204 covers a portion of the benthic region 202 including two different types of material, indicated by the darker and lighter shaded regions 206, 208. In practice these may be, for example, areas of seagrass and sand respectively. The submersible vehicle 200 also has mounted thereon a test pattern or colour key 210, and a mirror 212, the purpose of which will now be described with additional reference to Figure 3. In Figure 3 there is shown a graph 302 illustrating attenuation of light by water. Specifically, as light passes through a body of water, different wavelengths are absorbed and scattered at different rates, a process generally referred to as attenuation. The graph 302 illustrates the proportion of light transmitted through one meter of water (Y-axis) as a function of wavelength (X-axis) over a range from 400nm (the blue end of the spectrum) to 700nm (the red end of the spectrum). The amount of attenuation is dependent upon the quality and type of water, due to the presence of additional impurities which cause varying degrees of "cloudiness". The curve 304 represents extremely pure ocean water, in which attenuation is primarily caused by fundamental absorption and scattering processes, which preferentially attenuate light at the red end of the visible spectrum. The further curves 306, 308 respectively represent mid-latitude ocean, and medium turbidity coastal waters, with additional absorption and scattering, particularly at the blue end of the spectrum, occurring due to impurities. The attenuation characteristics depicted in the graph 302 are applied to light incident on the water surface, which is subsequently transmitted to the depths at which submersible vehicle 200 may be operating. The further graph 310 shows a typical daylight spectrum 312, and corresponding transmitted light intensity curves 314, 316, 318, corresponding to depths of 5 meters, 10 meters and 20 meters respectively.
It should be understood, however, that the daylight curve at 312 is exemplary only, and the actual daylight spectrum (ambient light) depends upon a variety of factors, including cloud cover, and time of day. Similarly, the level of attenuation, and accordingly the exact form of the transmitted spectra 314, 316, 318, depends upon the water conditions. Accordingly, a real-time measurement of the incident light spectrum is desirable, in order to compensate for the wide range of variation that may occur when performing imaging of the benthic habitat, and which may consequently result in the misidentification of materials within the environment.
The test pattern 210 and mirror 212 are accordingly provided for image colour compensation. The test pattern 210 includes, for example, one or more bands of predetermined colour, whose spectral properties under a specified illumination are known. The mirror 212 reflects an image of the test pattern 210, illuminated by the incident light, into a corner 214 of the captured image 204. Preferably, the distance between the test pattern 210 and the mirror 212 is approximately double the height of the camera 110 above the benthic surface 202, such that the distance over which the incident light propagates prior to imaging is approximately equal in both cases. Since the spectral properties of the test pattern 210 under specified (eg standard) illumination conditions are known, it is accordingly possible to determine a difference between the imaged spectral properties of the test pattern, and the known spectral properties, and apply this difference to the remainder of the image 204, in order to produce an image corresponding with the benthic surface 202 under hypothetical "standard" illumination conditions.
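The compensation step described above can be sketched as a per-channel gain correction. This is a minimal illustration under assumed details: it uses a single white reference patch and simple linear scaling per channel, whereas a practical implementation might use several coloured bands and a richer colour model; all numeric values are invented.

```python
# Minimal sketch (assumed details): colour compensation via per-channel
# gains derived from the imaged test pattern. All numbers are invented.

def colour_correct(image, imaged_patch_rgb, reference_patch_rgb):
    """Scale each channel so the imaged test pattern matches its known
    appearance under standard illumination, applying the same gains to
    every pixel of the image."""
    # Per-channel gain: known reference value / observed value.
    gains = [ref / obs for ref, obs in zip(reference_patch_rgb, imaged_patch_rgb)]
    corrected = []
    for pixel in image:
        corrected.append(tuple(min(255, int(round(v * g)))
                               for v, g in zip(pixel, gains)))
    return corrected

# At depth, red is strongly attenuated, so a white patch images as a dim
# blue-green; the derived gains therefore boost the red channel the most.
observed_white = (40, 160, 200)    # white patch as imaged at depth (assumed)
reference_white = (200, 200, 200)  # same patch under standard light (assumed)
print(colour_correct([(20, 80, 100)], observed_white, reference_white))
# -> [(100, 100, 100)]
```

The same gains are then applied to the remainder of the image, yielding an approximation of its appearance under "standard" illumination.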
An alternative approach, which may eliminate the need for colour correction, would be to equip the submersible vehicle 200 with a suitable source of illumination (eg a spot-light) in order to illuminate the benthic surface 202. Disadvantages of this approach are the additional weight, cost and power consumption of the illumination source.
In accordance with preferred embodiments of the present invention, different materials anticipated to be present within the benthic habitat are characterised by their spectral reflectance properties. In particular, the spectral reflectance of a surface is the proportion of incident light reflected from the surface, as a function of wavelength. The upper graph 400 depicted in Figure 4 illustrates the spectral reflectance of a number of exemplary materials. The Y-axis of the graph 400 is the proportion of light reflected, while the X-axis is wavelength between 400nm and 700nm. The curve 402 represents the reflectance of white sand, which has been scaled down by a factor of 10 so as to be visible on the same axis as the other curves (ie the actual reflectance of white sand is on the order of 0.4). The curve 404 is the spectral reflectance of brown mud, while the curve 406 is the spectral reflectance of the seagrass species Halophila ovalis. The three curves 402, 404, 406 have been determined under standard "white light" illumination. Corresponding curves 408, 410, 412 are shown in dashed lines, depicting the expected spectrum at a depth of 5 meters in mid-latitude oceanic water.
The lower graph 414 in Figure 4 depicts the spectral response of the red (416), green (418) and blue (420) sensor elements of a typical trichromatic digital camera. Each sensor element outputs a voltage that is proportional to the total intensity of light integrated over (ie effectively filtered by) the corresponding response curve 416, 418, 420. Accordingly, under known illumination conditions, the detected red, green and blue voltages will generally be characteristic of a material having a corresponding known spectral reflectance. This observation underlies a particular feature of embodiments of the present invention, whereby material "templates" are used to characterise materials anticipated to be present within a region under mapping and analysis.
In particular, for every material of interest (eg plants, sand, mud, rocks and so forth) a corresponding material template is generated, which includes characteristic spectral information of the material. Spectral data of pixels within the captured images may then be compared with the various material templates, in order to generate corresponding match data values, each of which is indicative of a probability that the pixel in question is representative of the material of the template. This information may then be used to identify the presence of materials within the region under study. A particular characteristic of this approach is that the comparison with an individual template is not performed with the intention of establishing a one-to-one mapping between the pixel and a specific material. Rather, a probability of match is established between each pixel and all of the available material templates. Advantageously, the aggregate effects of such matching accumulated over a number of pixels, and indeed over a number of separate images, enables statistical improvement of material identification. The manner in which this is preferably performed, utilising a grid of cells overlaid on the benthic surface 202, is described later in more detail with reference to Figure 6.
Turning now to the construction of suitable material templates, it should be appreciated that under realistic conditions natural variations will occur in the detected spectral response of individual materials. That is, a material with an "ideal" expected characteristic spectral response will, in practice, exhibit variability from this response due to environmental conditions, including lighting, water motion, cloudiness, noise, and so forth. Appropriate consideration of this variability can result in improved outcomes, by enabling the statistical approach to matching employed by embodiments of the present invention. In particular, an expected set of sensor readings, s_red, s_green, s_blue, can be determined for a given material by multiplying the corresponding spectral response curves (eg 416, 418, 420) by a corresponding material reflectance (eg 402, 404, 406). This may be stated formally, using the red sensor as an example, as follows:
s_red = kT ∫ σ_red(λ) φ̂_α(λ) dλ

where k is a camera-dependent constant, T is an exposure time, σ_red(λ) is the spectral response of the red sensor, φ_α(λ) is the spectral reflectance of a material α, φ(λ) is a relevant spectral attenuation function, and φ̂_α(λ) = φ_α(λ)φ(λ) is the resulting reflected spectrum received at the camera. Analogous equations apply for the green and blue sensors.
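The expected-reading integral described above can be evaluated numerically, for instance by trapezoidal integration of the product of the sensor response, material reflectance and attenuation curves. In the sketch below, all spectral curves are crude invented stand-ins for illustration, not measured data.

```python
# Illustrative numerical evaluation of s = kT * integral of
# sensor_response(λ) * reflectance(λ) * attenuation(λ) dλ, by the
# trapezoidal rule. The spectral curves below are invented.

def expected_reading(sensor_response, reflectance, attenuation,
                     wavelengths, k=1.0, exposure_T=1.0):
    """Trapezoidal approximation of the expected sensor reading."""
    integrand = [s * r * a for s, r, a in
                 zip(sensor_response, reflectance, attenuation)]
    total = 0.0
    for i in range(len(wavelengths) - 1):
        dlam = wavelengths[i + 1] - wavelengths[i]
        total += 0.5 * (integrand[i] + integrand[i + 1]) * dlam
    return k * exposure_T * total

# Coarse 400-700 nm grid (50 nm steps); all curves assumed for illustration.
lam = [400, 450, 500, 550, 600, 650, 700]
sigma_red = [0.00, 0.00, 0.05, 0.30, 0.80, 0.90, 0.40]  # red sensor response
phi_grass = [0.05, 0.06, 0.15, 0.30, 0.10, 0.08, 0.05]  # seagrass reflectance
phi_water = [0.50, 0.60, 0.60, 0.50, 0.30, 0.20, 0.10]  # attenuation at depth
print(expected_reading(sigma_red, phi_grass, phi_water, lam))
```

In practice the spectral curves would be sampled much more finely, and the constant k and exposure time T calibrated for the particular camera.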
In practice, however, actual sensor readings for the material will vary around these expected values, in accordance with a corresponding probability density function (or "distribution"). Thus, for a particular material denoted by the symbol α the red sensor reading s_red will be distributed according to a function f_α(s_red). This distribution, along with corresponding distributions for the green and blue sensor readings, constitutes one possible choice of a material template.
It has been found to be preferable in practice not to use the direct sensor readings as template parameters. As a practical matter, due to factors such as the variable angle and distance of the benthic surface from the camera, image brightness may vary significantly over even a single image. It is therefore preferable to use parameters derived from the sensor readings s_red, s_green, s_blue, that are less sensitive to such variations. One particular alternative template that has been employed, and found to provide improved performance, is that based upon the transformed values:
θ_1 = s_red / (s_red + s_green + s_blue)

θ_2 = s_green / (s_red + s_green + s_blue)

θ_3 = s_blue / (s_red + s_green + s_blue)

having corresponding probability density functions (ie the template) for material α of g_α(θ_1), g_α(θ_2) and g_α(θ_3).
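One way to picture the brightness-insensitive transformation, assuming the normalised-ratio form of the transformed values, is that each θ is one channel's share of the total reading, so a uniform scaling of all three sensor readings cancels out:

```python
# Sketch of the brightness-insensitive transform, assuming the normalised
# ratio form: each θ is one channel's share of the total reading.

def chromaticity(s_red, s_green, s_blue):
    total = s_red + s_green + s_blue
    return (s_red / total, s_green / total, s_blue / total)

# Doubling the overall brightness leaves the transformed values unchanged.
print(chromaticity(30, 90, 60))
print(chromaticity(60, 180, 120))
```

The three values always sum to one, so only two are independent; in practice this redundancy does no harm to the matching procedure.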
Determining the detailed functional forms of the probability density functions for the template parameters is a practical issue that will not be addressed here in detail. In general, it is necessary to associate a measure of uncertainty with all of the measured quantities, which would ideally be done by performing representative measurements of known materials under a wide range of different conditions, and taking into account all possible sources of variation. A suitable probability density function, such as a normal (ie Gaussian) distribution, may be fitted to measured data, or otherwise estimated, in order to complete the material templates. Without suggesting that the detailed determination of the necessary parameters is by any means an insubstantial task, it is clearly achievable by expenditure of the necessary time and resources.
The process of template matching, ie comparing pixel data with material templates in order to produce match data values corresponding with the probability of a match, will now be described. In particular, assuming that an image has been captured, and colour correction performed as required, a specific pixel may be described by its characteristic measured parameters s_red, s_green, s_blue.
Denoting the transformed measured values for the pixel as θ̂_1, θ̂_2, θ̂_3, the probability that the observed material is α is the product of the probability that each of these values was obtained from that material:

p(α | θ̂_1, θ̂_2, θ̂_3) ∝ p(α) g_α(θ̂_1) g_α(θ̂_2) g_α(θ̂_3)
where p(α) represents prior knowledge regarding the identity of the material. For example, if an image is captured at a depth of 10m, the probability of finding species that occur at depths less than 5m would be small. Alternatively, the relative frequency of occurrence of species within a particular habitat may be known, and can be incorporated into the analysis via p(α). In the absence of prior information, p(α) may be set equal to 1/n, where n is the total number of materials (ie an a priori assumption that all materials are equally likely to be observed). The foregoing result is readily derived from Bayes' theorem, under the assumption that the distributions of the transformed values are independent. This assumption considerably simplifies the analysis, and thus reduces the computational load.
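The per-pixel match computation might be sketched as follows, taking the template densities g_α to be Gaussian (the text names the normal distribution as one suitable choice). The template means and standard deviations, and the observed values, are invented for illustration.

```python
import math

# Hedged sketch of template matching: each template holds an invented
# (mean, std) pair per transformed value, and the match score is the
# prior times the product of Gaussian densities, normalised over templates.

def gaussian_pdf(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def match_values(theta, templates, priors=None):
    """Return normalised p(α)·g_α(θ1)·g_α(θ2)·g_α(θ3) for every template."""
    if priors is None:
        # No prior information: assume all materials equally likely.
        priors = {name: 1.0 / len(templates) for name in templates}
    scores = {}
    for name, params in templates.items():
        p = priors[name]
        for x, (mean, std) in zip(theta, params):
            p *= gaussian_pdf(x, mean, std)
        scores[name] = p
    total = sum(scores.values())
    return {name: s / total for name, s in scores.items()}

# Invented (mean, std) per transformed value for two materials.
templates = {
    "sand":     [(0.35, 0.05), (0.35, 0.05), (0.30, 0.05)],
    "seagrass": [(0.20, 0.05), (0.55, 0.05), (0.25, 0.05)],
}
probs = match_values((0.21, 0.54, 0.25), templates)
print(probs)  # the observation lies close to the seagrass template
```

Note that all templates receive a nonzero match value; as the text emphasises, the aim is a probability of match against every template, not a hard one-to-one classification.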
It will, however, be reasonably apparent to the skilled person that some correlation exists between the measured sensor readings, since certain sources of variation affect all three sensor outputs in common. An alternative embodiment of the invention may therefore utilise a single joint probability distribution for all of the variables, rather than three independent distributions, in constructing the material templates. In this case, since all correlations are taken into account, there would be no need to utilise transformed parameters, and the sensor readings themselves could be used as the template parameters.
Turning now to Figure 5, there is shown schematically a system for performing autonomous benthic habitat mapping utilising the submersible vehicle 200. The vehicle 200 is shown moving over the benthic surface 202. Also illustrated is the sun 504, providing a source of illumination, as well as clouds 506, representing one possible source of lighting variability. Attenuation and scattering of red, green and blue components of the ambient light is indicated by the arrows 508. The vehicle 200 is depicted moving in a nominally straight path, or transect, 510. However, as a practical matter, difficulties may arise in maintaining accurate positional information of the submersible vehicle 200. In particular, signals from the civilian global positioning system (GPS) are strongly attenuated in water, and it is therefore not possible to use GPS location except when the vehicle 200 is surfaced. While surfacing may therefore be used to obtain periodic accurate positional information (eg to within approximately two meters of absolute position), other methods must be used to estimate location of the vehicle 200 when submerged. For example, dead reckoning based upon inertial measurements (performed, for example, using accelerometers, gyroscopes and/or compasses) may be used to track motion, in particular velocity and heading. Integration of these quantities over time enables a position estimate to be calculated. However, inaccuracy in the measurements results in increasing positional uncertainties. A further approach is visual odometry, based upon processing of successive images captured at known time intervals by downward-looking cameras. The tracking of objects between images provides additional velocity and heading information. However, this form of visual odometry is computationally intensive, and therefore consumes the processing capacity of the onboard computers, and is also largely ineffective in featureless areas.
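The dead-reckoning estimate described above can be sketched as a simple integration of speed and heading samples over time. The drift model used here (uncertainty growing as a fixed fraction of distance travelled) is an illustrative simplification, and all sample values are invented.

```python
import math

# Hedged sketch of dead reckoning: speed and heading samples are integrated
# over fixed time steps to update the position estimate, while positional
# uncertainty is modelled (simplistically) as a fixed fraction of distance
# travelled. The drift fraction and samples are illustrative.

def dead_reckon(start, samples, drift_fraction=0.05):
    """samples: list of (speed m/s, heading rad, dt s). Returns the final
    (x, y) estimate and a rough 1-sigma positional uncertainty in metres."""
    x, y = start
    distance = 0.0
    for speed, heading, dt in samples:
        step = speed * dt
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        distance += step
    return (x, y), drift_fraction * distance

# Ten one-second samples at 1 m/s on a constant heading of 0 rad.
pos, sigma = dead_reckon((0.0, 0.0), [(1.0, 0.0, 1.0)] * 10)
print(pos, sigma)
```

A real vehicle would fuse such estimates with visual odometry and periodic GPS fixes, and the uncertainty model would be correspondingly richer.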
Nonetheless, in one exemplary submersible vehicle 200, it has been found that dead reckoning and visual odometry can provide a positional accuracy of about 5% of the distance travelled. This increasing positional uncertainty, resulting in a loss of precision in the estimated location of the vehicle 200, is represented by the two dimensional Gaussian error functions 512, 514, 516 shown in Figure 5. Initially, for example following surfacing in order to obtain a GPS fix, the uncertainty in position is small, and the error function 512 is tall and narrow. Over time and distance, the positional uncertainty increases, and the error functions 514, 516 become shorter and broader. In particular, the error functions 512, 514, 516 are probability density functions in respect of the location of the submersible vehicle 200. With appropriate normalisation, integrated volumes under the error functions 512, 514, 516 may be interpreted as representing the probability that the actual location of the vehicle 200 lies within a corresponding area under the error function surfaces.
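If the positional error is modelled as an isotropic two-dimensional Gaussian, the probability mass falling within an axis-aligned grid cell separates into a product of one-dimensional normal CDF differences. A sketch, with illustrative cell size and sigma values:

```python
import math

# Sketch: probability mass of an isotropic 2D Gaussian positional error
# falling inside an axis-aligned grid cell. Because the distribution is
# separable, the mass is a product of two 1D normal CDF differences.
# Cell size and sigma values are illustrative.

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def cell_probability(x0, x1, y0, y1, mean_x, mean_y, sigma):
    px = normal_cdf((x1 - mean_x) / sigma) - normal_cdf((x0 - mean_x) / sigma)
    py = normal_cdf((y1 - mean_y) / sigma) - normal_cdf((y0 - mean_y) / sigma)
    return px * py

# Just after a GPS fix the error is tight and the estimated cell holds most
# of the mass; as sigma grows, the mass spreads over neighbouring cells.
print(cell_probability(-0.5, 0.5, -0.5, 0.5, 0.0, 0.0, sigma=0.3))
print(cell_probability(-0.5, 0.5, -0.5, 0.5, 0.0, 0.0, sigma=2.0))
```

This directly mirrors the tall-and-narrow versus short-and-broad error surfaces 512, 514, 516: the integrated mass over any fixed cell shrinks as the distribution broadens, while the total over all cells remains one.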
It will therefore be understood that the actual location on the benthic surface 202 of a given pixel of a captured image will vary from the expected location determined from the current estimate of position maintained by the vehicle 200. The error functions 512, 514, 516 represent the probability distributions of the actual pixel location, and preferred embodiments of the present invention apply this information in order to make more effective use of the captured image data, in the presence of positional uncertainty. The manner in which classification is performed and utilised in the presence of positional uncertainty is illustrated in Figure 6. In particular, the benthic surface 202 is overlaid with a virtual grid 602, which divides the total region into a number of sub-regions, each of which is (in the present example) a square or rectangular cell, eg 604. Every pixel of every image captured by the camera 110 corresponds with a particular point on the benthic surface 202, and therefore falls within a specific cell 604. However, due to positional uncertainty, the precise cell, eg 604, containing the location corresponding with a pixel cannot be accurately determined. A finite probability exists, for example, that the actual location is in one of the adjacent cells, or even further afield. The associated probability is quantified by the error functions 512, 514, 516.
For each pixel, a match data value, ie an associated probability value, is computed for each material template, as indicated by the boxes 606a, 606b, 606c and so forth in Figure 6. If the location of the vehicle 200 were known precisely, these values could be uniquely assigned to a single cell, corresponding with the pixel location. Data for all pixels, within all images captured by the camera 110, could then be accumulated within appropriate registers associated with each individual cell, eg 604. Specifically, in the preferred embodiments there is maintained, for each cell, a count value and a sum value corresponding with every template. For each pixel falling within a cell, the match probabilities are accumulated within the sum value, and each time this occurs the count value is incremented. Over a large number of images and pixels, it may be anticipated that the accumulated probability (ie sum value) corresponding with the predominant material in the cell will increase relative to the sum values associated with other materials, thereby leading to improved material identification.
The accumulated data may alternatively be viewed as a series of layers, eg 608a, 608b, 608c and so forth, each of which represents the probabilities in all cells associated with a particular material. These "classification layers" are, in effect, virtual "maps", indicating the presence of the corresponding material within corresponding cells. If represented visually, as in Figure 6, they may be interpreted as indicating the probability that the corresponding area on the benthic surface 202 contains the material or object represented by the layer.
In order to account for positional uncertainty, the way in which the count and sum values associated with each layer are maintained is modified appropriately. In particular, since the precise location corresponding with a specific pixel is not accurately known, the associated match data values (ie template probabilities) cannot reliably be assigned to a single cell. Rather, the values are distributed over all of the cells covered by the error functions eg 512, 514, 516, in proportion to the probability that the pixel lies within each particular cell. As has previously been noted, this probability is readily calculated from the positional error distribution functions. Specifically, both the probability values, and the amount by which the count value is incremented, are reduced in proportion to the probability that the pixel is actually located within each cell. In this way, the classification data is distributed over the entire area of uncertainty in position, such that the total contribution of each individual pixel is independent of uncertainty in position. That is, consistently with the fact that the total probability that the vehicle 200 is within the entire region is 100%, the total contribution of classification of each individual pixel to the complete cell map is also 100%.
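The per-cell count and sum registers, and the weighted distribution of a pixel's contribution, might be sketched as follows. The cell keys, weights and match values are invented for illustration; the essential invariant is that each pixel contributes exactly 1.0 in total, however the positional weight is spread.

```python
# Hedged sketch of the per-cell count/sum registers: a pixel's template
# match values are spread over the cells it might occupy, in proportion to
# the positional weight, so each pixel contributes exactly 1.0 in total.
# Cell keys, weights and match values are invented.

def accumulate_pixel(cell_registers, match_values, cell_weights):
    """cell_registers: {cell: {material: [prob_sum, count]}};
    cell_weights: {cell: probability the pixel lies in that cell}."""
    for cell, w in cell_weights.items():
        regs = cell_registers.setdefault(cell, {})
        for material, p in match_values.items():
            entry = regs.setdefault(material, [0.0, 0.0])
            entry[0] += w * p  # weighted match-probability sum
            entry[1] += w      # fractional count increment
    return cell_registers

registers = {}
match = {"sand": 0.2, "seagrass": 0.8}
weights = {(3, 4): 0.7, (3, 5): 0.2, (4, 4): 0.1}  # positional distribution
accumulate_pixel(registers, match, weights)
print(registers[(3, 4)]["seagrass"])
```

Dividing each accumulated sum by its count then yields, for every cell and template, an average match probability that the classification layers of Figure 6 can display.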
The overall process is illustrated by the flow chart 700 shown in Figure 7. At step 702, an image is captured. Correction of colours in the image, as described previously with reference to Figures 2 and 3, is performed (if required) at step 704.
Then, starting with a first pixel of the image, the pixel data is compared with the various material templates in order to produce match data values (ie probabilities, or classification data), in step 706. These values must then be distributed over all of the affected cells, in accordance with the positional uncertainty. Specifically, for a given cell an appropriate weighting is computed at step 708, utilising the positional error function. This weighting is then applied to the calculated match data values for each template, and accumulated within the corresponding template registers for the cell in question, at steps 708 and 710. If not all cells have yet been considered (decision step 712), steps 708 and 710 are repeated for the next cell in the array. The comparison step 706, and accumulation steps 708, 710, 712, are repeated for all materials (decision step 714). These processes are also repeated for every further pixel within the image (decision step 716).
Once processing of the image is complete, further mapping may be performed by capturing another image, and recommencing the entire process (decision step 718).
It will therefore be appreciated that mapping of an area may be conducted autonomously, such that a very large amount of data may be efficiently collected in a manner which is not labour intensive, and which may therefore be repeated at more regular and frequent intervals than has previously been practical. Due to the statistical methods utilised by embodiments of the invention, all of the captured data may effectively be utilised in the construction of a map comprising an array of template matching data for a range of anticipated material contents, within an array of cells, or sub-regions, within the mapped region. It is possible, in principle, for mapping to be performed at a resolution that is greater than the available positional accuracy of the vehicle. For example, cells having dimensions of as little as 0.5 or one meter may be utilised, even when the best available positional accuracy (eg via GPS) is two meters. If the uncertainty in positioning is genuinely random, and does not exhibit strong correlations over time, multiple passes over a region capturing additional image data will cause the uncertainties to average out, such that the classification data associated with the predominant material within each cell will increase relative to that of other materials that are not present in the cell, or that are present in significantly smaller quantities.
Ultimately, there will be available for each cell an array of accumulated match data values, ie probabilities of classifications, corresponding with the various anticipated materials for which material templates exist. These values may be interpreted variously as the probability that the corresponding area on the benthic surface 202 includes the corresponding material, or the proportion of the surface within the associated cell that is covered by the corresponding material. (The extent to which these interpretations of the statistical data are valid may depend upon the quantity of data gathered, as well as the nature of the various uncertainties.)
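The second interpretation above — accumulated values read as coverage proportions — amounts to a simple normalisation of each cell's registers. A minimal illustrative sketch (the function name and data layout are assumptions):

```python
def cell_proportions(registers):
    """Normalise a cell's accumulated match data values so they may be
    read as approximate proportions of the cell's surface covered by
    each material.  As noted in the text, the validity of this reading
    depends on the quantity of data gathered."""
    total = sum(registers.values())
    if total == 0.0:
        return {m: 0.0 for m in registers}
    return {m: v / total for m, v in registers.items()}
```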
The accumulated data may be utilised to generate maps showing the location of specific materials within the overall surveyed area. It is also possible, for each cell, to identify more than one material that may be present. For example, if the accumulated match data values associated with two different material templates in a particular cell are approximately equal, then those two materials may be present with approximately equal probability, or in approximately equal quantity. A list of all possible materials present in a cell, along with associated probabilities or proportions, may also be generated. In addition, data from groups of cells may be utilised in order to improve classification and mapping. For example, if the data for a particular cell indicates comparable probabilities of the presence of a variety of seagrass and of sand, and the surrounding cells all indicate a high probability of presence of the seagrass, then it may be reasonable to conclude that the initial cell should be classified as containing seagrass.
In yet another variation, transects may be used as an alternative means of addressing a lack of availability of accurate underwater positional information. A transect involves the submersible vehicle 200 making a substantially straight-line underwater pass, capturing images at regular intervals, along with corresponding time stamps. At the start and end of the transect, the vehicle 200 may surface in order to establish a GPS fix. This enables a straight-line interpolation to be performed between the starting and finishing GPS positions, and each of the time-stamped images can be assigned an interpolated position along the transect. All of the images may then be post-processed, in the manner previously described, in order to develop the classification map. As will be appreciated, however, this method requires the provision of sufficient storage capacity within the computer system 102 of the submersible vehicle 200 to store all of the captured images for processing.
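The straight-line interpolation of image positions along a transect may be sketched as follows (an illustrative fragment only; the coordinate and timestamp representations are assumptions):

```python
def transect_positions(start_fix, end_fix, t_start, t_end, timestamps):
    """Assign each time-stamped image a position by straight-line
    interpolation between the GPS fixes established at the surface at
    the start and end of the transect.  Positions and fixes are (x, y)
    pairs in a local planar frame; timestamps are seconds."""
    (x0, y0), (x1, y1) = start_fix, end_fix
    positions = []
    for t in timestamps:
        f = (t - t_start) / (t_end - t_start)  # fraction of transect completed
        positions.append((x0 + f * (x1 - x0), y0 + f * (y1 - y0)))
    return positions
```

An image captured halfway through the transect, for example, is assigned the midpoint of the two surface fixes.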
Other variations will also be apparent to persons skilled in the relevant art, falling within the scope of the present invention. For example, in yet another variation, images may be captured and stored, for example along a transect as described above, and saved for later processing. That is, all of the image analysis may be performed at a subsequent time, and need not necessarily be performed in real-time, on board a submersible vehicle. The vehicle may also be provided with a wireless transmitter, such that when surfacing for a positional fix it is also able to transmit stored images and other data to a remote location, such as a computer on board a supervising vessel. Yet another alternative for such communications is a permanent wired connection between a submersible vehicle 200 and a surface vessel, such as a ship, for example via a long, lightweight cable. In this case, it may also be possible for images to be transmitted from the submersible vehicle to a remote computer via the cable in real time, such that the processing load is removed from the computer system of the submersible vehicle. All such variations in locations of data capture, transmission and processing should be understood to fall within the scope of the present invention. In particular, while preferred embodiments of the invention have been described herein, they should not be considered to be limiting of the scope of the invention, which is defined in the claims appended hereto.

Claims

CLAIMS:
1. A method of identifying the presence of materials within a predetermined region, including the steps of: capturing at least one image of a region, the image including a plurality of pixels; providing a plurality of material templates, each said template including spectral information which is characteristic of a corresponding material; comparing spectral data of each pixel in the image with each one of the plurality of material templates, to generate a corresponding plurality of match data values, wherein each said match data value is indicative of a probability that the pixel is representative of a material matching the corresponding template; and using the match data values to identify at least one material present within the region.
2. The method of claim 1 wherein each material template includes information representative of a distribution of spectral content of energy received from a sample of a corresponding material under specified conditions.
3. The method of claim 2 wherein the information relates to the spectral content of light reflected from a sample of the material under specified illumination conditions.
4. The method of claim 2 or claim 3 wherein the information is representative of distributions of a plurality of discrete spectral components of energy received from a sample of the material under specified conditions.
5. The method of claim 4 wherein the information includes distributions of intensity values recorded by respective red, green and blue sensors of a trichromatic digital camera directed at a sample of the material under specified visible illumination conditions.
6. The method of claim 4 or claim 5 wherein the information representative of distributions of a plurality of discrete spectral components includes transformed distributions corresponding with parameters derived from said spectral components.
7. The method of any one of claims 2 to 6 wherein the step of comparing spectral data of each pixel in the image includes comparing captured spectral content values of each pixel with the corresponding information representative of one or more distributions of spectral content of energy received from a sample of the corresponding material under specified conditions.
8. The method of any one of the preceding claims including the further step of applying a correction to the captured spectral content values to account for differences between the environmental conditions during the step of capturing, and the specified conditions under which the material template information was obtained.
9. The method of claim 8 wherein the step of applying a correction includes capturing an image of a test pattern having known spectral properties, and correcting the captured image of the region to compensate for differences between the spectral properties of the captured test pattern image and the corresponding known spectral properties.
10. The method of claim 7 wherein the match data value is a probability that the captured spectral content values are drawn from the corresponding distributions of the material template.
11. The method of any one of the preceding claims wherein the step of using the match data values to identify at least one material present within the region includes identifying a most probable match, following analysis of all pixels of a plurality of images of the region.
12. The method of claim 11 further including outputting the most probable match, an associated confidence level, and/or a list of possible alternatives, along with their associated probabilities.
13. The method of any one of the preceding claims further including identifying a location of materials within the predetermined region, by the steps of: dividing the region into a plurality of sub-regions; associating each pixel of each captured image with at least one sub-region; and using the match data values to identify at least one material present within each sub-region.
14. The method of claim 13 wherein, for each sub-region and each material template, a corresponding weighted match data value is accumulated for each pixel for each image captured.
15. The method of claim 14 wherein weighted match data values are accumulated for each sub-region in accordance with a probability that a corresponding pixel represents a point within the sub-region.
16. The method of claim 15 wherein the distribution of the probability that the corresponding pixel represents a point within the sub-region is determined in accordance with a precision within which the pixel location is known.
17. The method of any one of the preceding claims further including the step of generating one or more maps showing a graphical representation of the presence and/or location of material within the region.
18. A method of identifying the location of materials within a predetermined region, including the steps of: capturing at least one image covering at least a part of the predetermined region, the image including a plurality of pixels, and wherein the location of the image within the region is known to within a predetermined precision; for each pixel of the image, identifying at least one material corresponding with properties of the pixel and generating, for each one of a plurality of sub-regions of the predetermined region, a location data value indicative of a probability that said at least one material is present within the sub-region, wherein said location data value is based upon an estimated distance between the pixel and the sub-region, and the predetermined precision within which the location of the image is known; and using the location data values to identify the presence of said at least one material at a location corresponding with at least one of the plurality of sub-regions of the predetermined region.
19. The method of claim 18 wherein the location data values are weighting values corresponding with a probability that the corresponding pixel represents a point within each relevant sub-region.
20. The method of claim 18 or claim 19 wherein the step of identifying at least one material corresponding with properties of the pixel includes comparing spectral data of the pixel with each one of a plurality of material templates, each said template including spectral information which is characteristic of a corresponding material, in order to generate a corresponding plurality of match data values, wherein each said data value is indicative of a probability that the pixel is representative of a material matching the corresponding template.
21. A method of identifying the presence and location of materials within a predetermined region, including the steps of: capturing at least one image covering at least part of the predetermined region, the image including a plurality of pixels, and wherein the location of the image within the region is known to within a predetermined precision; providing a plurality of material templates, each said template including spectral information which is characteristic of a corresponding material; for each pixel of the image: comparing spectral data of the pixel with each one of the plurality of material templates, to generate a corresponding plurality of match data values, wherein each said match data value is indicative of a probability that the pixel is representative of a material matching the corresponding template, and generating, for each one of a plurality of sub-regions of the predetermined region, a location data value indicative of the probability that a material corresponding with the pixel is present within the sub-region, wherein said location data value is based upon an estimated distance between the pixel and the sub-region, and the predetermined precision within which the location of the image is known; and combining the location data values and the match data values to identify at least one material present within the predetermined region, at a location corresponding with at least one of the plurality of sub-regions.
22. A computer implemented apparatus for identifying the presence and/or location of materials within a predetermined region, the apparatus including: at least one processor; a digital imaging device, operatively coupled to the processor, configured to capture digital images of portions of the predetermined region, each such captured image including a plurality of pixels; and at least one memory device operatively coupled to the processor for storage and processing of captured images, the at least one memory device further including computer executable instructions which, when executed by the processor, cause the processor to implement a method in accordance with any one of claims 1 to 20.
23. The apparatus of claim 22 which is incorporated within a submersible vehicle for autonomous mapping and analysis of benthic habitat.
PCT/AU2009/000926 2008-07-21 2009-07-21 Method and system for autonomous habitat analysis WO2010009498A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2008903721A AU2008903721A0 (en) 2008-07-21 Method and System for Autonomous Habitat Analysis
AU2008903721 2008-07-21

Publications (1)

Publication Number Publication Date
WO2010009498A1 true WO2010009498A1 (en) 2010-01-28

Family

ID=41569922

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2009/000926 WO2010009498A1 (en) 2008-07-21 2009-07-21 Method and system for autonomous habitat analysis

Country Status (1)

Country Link
WO (1) WO2010009498A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111414790A (en) * 2019-01-08 2020-07-14 丰田自动车株式会社 Information processing apparatus, information processing system, program, and information processing method
EP3580547B1 (en) * 2017-02-10 2023-04-05 VoxelGrid GmbH Device and method for analyzing objects

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5329595A (en) * 1992-06-05 1994-07-12 Trw Inc. System and method for analyzing optical spectral data of a terrain image
JP2000162150A (en) * 1998-11-26 2000-06-16 Dainippon Printing Co Ltd Defect inspecting method and device for metal sample surface
US20030053686A1 (en) * 2001-09-13 2003-03-20 Eastman Kodak Company Method for detecting subject matter regions in images
US20030072470A1 (en) * 2001-10-15 2003-04-17 Lee Henry C. Two dimensional autonomous isotropic detection technique
WO2006133473A1 (en) * 2005-06-15 2006-12-21 Tissue Gnostics Gmbh Method for the segmentation of leukocytes




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09799856

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09799856

Country of ref document: EP

Kind code of ref document: A1