CN113156420A - Oil spill detection system and method - Google Patents

Oil spill detection system and method Download PDF

Info

Publication number
CN113156420A
CN113156420A (application CN202110268794.XA)
Authority
CN
China
Prior art keywords
image
sar
hyperspectral
feature
infrared
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110268794.XA
Other languages
Chinese (zh)
Inventor
李忠伟
罗偲
马毅
任鹏
任广波
何乐
郭防铭
辛紫麒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Petroleum East China
Original Assignee
China University of Petroleum East China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Petroleum East China filed Critical China University of Petroleum East China
Priority to CN202110268794.XA priority Critical patent/CN113156420A/en
Publication of CN113156420A publication Critical patent/CN113156420A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90 Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques

Abstract

The application provides an oil spill detection system and method. The system comprises an unmanned aerial vehicle and a ground control terminal. The unmanned aerial vehicle carries an infrared device, a synthetic aperture radar (SAR) device, a hyperspectral device and a processing unit. In a single flight, the unmanned aerial vehicle acquires three channels of data with different characteristics for the same scene, fuses them into a single image, and transmits the fused image over the communication link between the unmanned aerial vehicle and the ground control terminal. The ground control terminal analyzes the fused images received in real time to determine the oil spill type, the oil spill amount and the oil film thickness, thereby improving oil spill detection accuracy, practical effectiveness and application value.

Description

Oil spill detection system and method
Technical Field
The application relates to the field of ocean oil spill detection, in particular to an oil spill detection system and method.
Background
In today's society, petroleum remains a vitally important resource. With land resources becoming increasingly scarce and human demand for energy growing rapidly, the offshore oil industry and offshore oil transportation are developing vigorously. Offshore oil spills are losses of oil, of varying degrees, during offshore exploitation or transportation. They mainly include crude oil leakage from wells during offshore oil exploration and development, leakage during loading and unloading of offshore oil pipelines or oil tankers, crude oil leakage caused by ship collisions, capsizing, grounding and similar accidents, and even spills caused by natural disasters. All of these accidents pollute the marine ecological environment to varying degrees and cause substantial economic losses. To reduce the impact of oil spill accidents, the monitoring and detection of offshore oil spills must be strengthened.
Disclosure of Invention
The present application is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, a first objective of the present application is to provide an oil spill detection system that realizes monitoring and detection of oil spills at sea, improving oil spill detection accuracy, practical effectiveness and application value.
A second object of the present application is to provide an oil spill detection method.
To achieve the above object, an embodiment of a first aspect of the present application provides an oil spill detection system, including: an unmanned aerial vehicle and a ground control terminal, wherein,
the unmanned aerial vehicle comprises an infrared device, a synthetic aperture radar (SAR) device, a hyperspectral device and a processing unit, wherein:
the infrared device is used for acquiring infrared spectrum information of oil spilling to be detected and first GPS information of the oil spilling to be detected, fusing the infrared spectrum information and the first GPS information to obtain an infrared spectrum fused image, and sending the infrared spectrum fused image to the processing unit;
the SAR device is used for acquiring an SAR image aiming at the oil spill to be detected and corresponding second GPS information, fusing the SAR image and the corresponding second GPS information to obtain an SAR fusion image, and sending the SAR fusion image to the processing unit;
the hyperspectral device is used for acquiring hyperspectral data and GPS information of the sea surface, imaging the acquired hyperspectral data to obtain an image of a sea surface monitoring area, fusing the image of the sea surface monitoring area and the acquired GPS information to obtain a fused image of the sea surface monitoring area, and sending the fused image of the sea surface monitoring area to the processing unit;
the processing unit is used for carrying out data fusion on three paths of data with different characteristics in the same scene based on the infrared spectrum fusion image sent by the infrared device, the SAR fusion image sent by the SAR device and the sea surface monitoring area fusion image sent by the hyperspectral device so as to obtain a fusion image, and sending the fusion image to the ground control end;
and the ground control terminal is used for analyzing the fused images received in real time to obtain the oil spill type, the oil spill amount and the oil film thickness.
In order to achieve the above object, a second embodiment of the present application provides an oil spill detection method, including:
acquiring an infrared spectrum fusion image sent by an infrared device; the infrared spectrum fusion image comprises infrared spectrum information of the oil spill to be detected and corresponding first GPS information;
acquiring an SAR fusion image sent by an SAR device; the SAR fusion image comprises the SAR image of the oil spill to be detected and corresponding second GPS information;
acquiring a sea surface monitoring area fusion image sent by a hyperspectral device; the sea surface monitoring area fusion image comprises a sea surface monitoring area image and GPS information;
performing data fusion on three paths of data with different characteristics in the same scene based on the infrared spectrum fusion image sent by the infrared device, the SAR fusion image sent by the SAR device and the sea surface monitoring area fusion image sent by the hyperspectral device to obtain a fusion image;
and sending the fused image to a ground control end so that the ground control end analyzes the fused image received in real time to obtain the type of oil spill, the amount of oil spill and the thickness of an oil film.
According to the oil spill detection system and method, the infrared device, the SAR device and the hyperspectral device are carried on the unmanned aerial vehicle simultaneously and together form an information acquisition system. In a single flight, the unmanned aerial vehicle obtains three channels of data with different characteristics for the same scene and transmits them to its processing unit. The processing unit fuses the acquired data and transmits the resulting detection image over the communication link between the unmanned aerial vehicle and the ground control terminal. The ground control terminal analyzes the fused images received in real time to determine the oil spill type, the oil spill amount and the oil film thickness. Collecting oil spill image data with multiple sensors and combining image data with different characteristics lets each data source compensate for the weaknesses of the others, reflects the target characteristics more comprehensively, and provides stronger information interpretation capability and more reliable analysis results. This broadens the applicable range of each image data source and improves oil spill detection accuracy, practical effectiveness and application value.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a block diagram of an oil spill detection system according to an embodiment of the present disclosure;
FIG. 2 is a block diagram of a hyperspectral device according to an embodiment of the application;
FIG. 3 is an exemplary diagram of an oil spill detection system according to an embodiment of the present application;
FIG. 4 is a flow chart of a method for detecting oil spill according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a data fusion process of the oil spill detection system according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of obtaining a fusion feature by calculating a channel attention weight and a spatial attention weight corresponding to different features according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application, and should not be construed as limiting the present application.
The oil spill detection system and method of embodiments of the present application are described below with reference to the drawings.
Fig. 1 is a block diagram of an oil spill detection system according to an embodiment of the present application. As shown in fig. 1, the oil spill detection system 10 may include: a drone 100 and a ground control terminal 200. Wherein, as shown in fig. 1, the drone 100 may include: infrared device 110, synthetic aperture radar SAR device 120, high spectrum device 130 and processing unit 140.
The infrared device 110 is configured to collect infrared spectrum information of the oil spill to be detected and first GPS information of the oil spill to be detected, fuse the infrared spectrum information and the first GPS information to obtain an infrared spectrum fusion image, and send the infrared spectrum fusion image to the processing unit 140.
In some embodiments of the present application, the infrared device 110 may further be configured to select a plurality of different main absorption peak positions with the strongest absorption intensity from the infrared spectrum information of the oil spill to be detected, as characteristic absorption peaks for determining the type of the oil spill to be detected, and obtain the oil type of the oil spill to be detected from a pre-established spectrum information sample library based on the plurality of different main absorption peak positions.
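The patent gives no code for this peak-matching step; the following is a minimal numpy sketch of the idea, assuming absorbance spectra sampled on a wavenumber grid. The helper names (`top_absorption_peaks`, `match_oil_type`), the mismatch tolerance and the sample-library format are illustrative assumptions, not taken from the source.

```python
import numpy as np

def top_absorption_peaks(wavenumbers, absorbance, k=7):
    """Pick the k local absorption maxima with the strongest intensity.

    A sample is treated as a peak if it exceeds both neighbours; peaks are
    ranked by absorbance and the k strongest positions are returned, sorted.
    """
    a = np.asarray(absorbance, dtype=float)
    interior = np.arange(1, len(a) - 1)
    is_peak = (a[interior] > a[interior - 1]) & (a[interior] > a[interior + 1])
    peak_idx = interior[is_peak]
    strongest = peak_idx[np.argsort(a[peak_idx])[::-1][:k]]
    return np.sort(np.asarray(wavenumbers, dtype=float)[strongest])

def match_oil_type(peaks, sample_library, tol=1.0):
    """Compare measured peak positions against the reference positions in a
    pre-established sample library; the oil type with the smallest mean
    absolute peak mismatch within `tol` wins (None if nothing matches)."""
    best, best_err = None, np.inf
    peaks = np.sort(np.asarray(peaks, dtype=float))
    for oil, ref in sample_library.items():
        ref = np.sort(np.asarray(ref, dtype=float))
        if len(ref) != len(peaks):
            continue
        err = np.mean(np.abs(peaks - ref))
        if err <= tol and err < best_err:
            best, best_err = oil, err
    return best
```

A real library would hold calibrated peak positions per oil type; here any dictionary of name to peak-position list stands in for it.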
The SAR device 120 is configured to acquire an SAR image for the oil spill to be detected and corresponding second GPS information, fuse the SAR image and the corresponding second GPS information to obtain an SAR fusion image, and send the SAR fusion image to the processing unit 140.
In the embodiment of the present application, the SAR device 120 sequentially performs average filtering processing and maximum entropy threshold processing on the acquired SAR image to segment a dark sea area and an oil spill dark spot area from the SAR image, removes the dark spot area in the SAR image by using morphology, fuses the SAR image from which the dark spot area is removed and GPS information to obtain an SAR fusion image, and sends the SAR fusion image to the processing unit 140.
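The preprocessing chain described above (mean filtering, maximum entropy thresholding, morphological cleanup of dark spots) can be sketched in pure numpy. Window sizes, bin counts and function names below are illustrative assumptions rather than parameters from the patent; the maximum entropy threshold follows the standard Kapur formulation.

```python
import numpy as np

def mean_filter3(img):
    """3x3 mean filter; border pixels use replicated edges."""
    p = np.pad(img, 1, mode="edge")
    acc = np.zeros_like(img, dtype=float)
    for dy in range(3):
        for dx in range(3):
            acc += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return acc / 9.0

def max_entropy_threshold(img, nbins=64):
    """Kapur maximum entropy threshold: choose the gray level that maximizes
    the summed entropies of the background and foreground histograms."""
    hist, edges = np.histogram(img, bins=nbins)
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 1, -np.inf
    for t in range(1, nbins):
        pb, pf = p[:t].sum(), p[t:].sum()
        if pb == 0 or pf == 0:
            continue
        b, f = p[:t] / pb, p[t:] / pf
        h = -np.sum(b[b > 0] * np.log(b[b > 0])) - np.sum(f[f > 0] * np.log(f[f > 0]))
        if h > best_h:
            best_h, best_t = h, t
    return edges[best_t]

def binary_open(mask):
    """Morphological opening (3x3 erosion then dilation): isolated dark
    pixels disappear while larger dark-spot regions survive."""
    def shift_all(m, combine, pad_val):
        p = np.pad(m, 1, constant_values=pad_val)
        out = p[1:-1, 1:-1].copy()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out = combine(out, p[1 + dy:1 + dy + m.shape[0],
                                     1 + dx:1 + dx + m.shape[1]])
        return out
    eroded = shift_all(mask, np.logical_and, True)
    return shift_all(eroded, np.logical_or, False)
```

On a bright-sea image with a dark patch, `img < max_entropy_threshold(mean_filter3(img))` yields a dark-spot mask that `binary_open` then cleans of speckle.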
The hyperspectral device 130 is configured to collect hyperspectral data and GPS information from the sea surface, image the collected hyperspectral data to obtain a sea surface monitoring area image, fuse the sea surface monitoring area image with the collected GPS information to obtain a sea surface monitoring area fusion image, and send the sea surface monitoring area fusion image to the processing unit 140.
As an example, as shown in FIG. 2, the hyperspectral device 130 may include: the hyperspectral imager 131, the GPS assisted inertial navigation 132 and the data processing module 133. The hyperspectral imager 131 is used for acquiring hyperspectral data of the sea surface; the GPS-assisted inertial navigation system 132 is configured to record the flight attitude and the flight latitude and longitude position information of the unmanned aerial vehicle, and synchronously transmit the recorded flight attitude and position information and hyperspectral data acquired by the hyperspectral imager to the data processing module 133; the data processing module 133 is configured to perform spectrum calibration and radiation correction on data transmitted by the hyperspectral imager to obtain a radiation-corrected sea surface monitoring image, perform geometric correction on the sea surface monitoring image according to the acquired synchronous flight longitude and latitude position information and flight attitude parameters to obtain a geometrically-corrected sea surface monitoring image, and splice the geometrically-corrected continuous multiframe hyperspectral image data to obtain a large-area preprocessed sea surface monitoring area image.
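Two steps of the data processing module's pipeline can be sketched simply: a per-band linear radiometric correction and the stitching of consecutive along-track frames. The linear gain/offset model and the plain concatenation are simplifying assumptions; the geometric correction driven by the synchronized GPS/INS pose, which a real system performs before stitching, is omitted here.

```python
import numpy as np

def radiometric_correct(dn, gain, offset):
    """Linear radiometric correction of a (bands, H, W) digital-number cube,
    with one hypothetical gain/offset calibration coefficient per band."""
    return gain[:, None, None] * dn.astype(float) + offset[:, None, None]

def stitch_frames(frames):
    """Concatenate consecutive along-track frames (bands, h, W) into one
    strip image covering a larger sea surface monitoring area."""
    return np.concatenate(frames, axis=1)
```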
In the embodiment of the present application, the hyperspectral imager 131 may employ a fourier infrared spectrometer.
The processing unit 140 is configured to perform data fusion on three paths of data with different characteristics in the same scene based on the infrared spectrum fusion image sent by the infrared device 110, the SAR fusion image sent by the SAR device 120, and the sea surface monitoring area fusion image sent by the hyperspectral device 130 to obtain a fusion image, and send the obtained fusion image to the ground control terminal 200.
Optionally, in this embodiment of the present application, the processing unit 140 may adopt an attention fusion method to perform data fusion on three paths of data with different characteristics in the same scene to obtain a fused image. Among them, the attention fusion method uses a symmetric encoder-decoder. As shown in fig. 5, the encoder of the processing unit 140 extracts the intermediate features and the compensation features from the three images of the infrared spectrum fused image, the SAR fused image, and the sea surface monitoring region fused image, and these features are generated from the input image by the three convolutional layers, respectively. With the introduction of an attention mechanism, two attention maps derived from the intermediate feature are multiplied by the intermediate feature for fusion. The significant compensation features obtained by the element-by-element selection will be passed to the corresponding deconvolution layer for processing. Finally, the fused intermediate features and the compensation features are decoded to reconstruct the fused image.
As an example, the specific implementation process of the processing unit 140 performing data fusion on three paths of data with different characteristics in the same scene based on the infrared spectrum fusion image sent by the infrared device 110, the SAR fusion image sent by the SAR device 120, and the sea surface monitoring area fusion image sent by the hyperspectral device 130 to obtain the fusion image may be as follows:
1) extracting infrared features and first compensation features in the infrared spectrum fusion image;
2) extracting SAR features and second compensation features in the SAR fusion image;
3) extracting hyperspectral features and third compensation features in the sea surface monitoring area fusion image;
4) determining a first channel attention weight corresponding to the infrared features, a second channel attention weight corresponding to the SAR features and a third channel attention weight corresponding to the hyperspectral features;
in the embodiment of the present application, the processing unit 140 may obtain the channel attention weight by the following formula:
α_i^n = exp(P(X_i^n)) / Σ_{j=1..3} exp(P(X_j^n))    (1)

wherein X_i represents the intermediate features, i ∈ {1, 2, 3}; X_1 denotes the infrared feature, X_2 the SAR feature and X_3 the hyperspectral feature; n denotes the channel index in the intermediate feature X_i; P(·) denotes the global pooling operator; and α_1, α_2 and α_3 denote the first, second and third channel attention weights, respectively.
For example, as shown in FIG. 6, an initial weighting vector v_i^n = P(X_i^n) is first calculated by the global pooling operator, and the softmax operator (as in equation (1) above) is then applied to obtain the final channel attention weight α_i.
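The patent contains no code for this step; the following numpy sketch shows the channel attention computation on three hypothetical intermediate feature maps, with global average pooling standing in for P() and a softmax across the three sources. Shapes and function names are illustrative assumptions.

```python
import numpy as np

def channel_attention_weights(features):
    """Channel attention weights for multi-source fusion.

    `features` is a list of three intermediate-feature arrays shaped
    (C, H, W): infrared X_1, SAR X_2 and hyperspectral X_3. Global average
    pooling gives an initial weight per source and channel; a softmax
    across the three sources yields alpha_i per channel.
    """
    v = np.stack([f.mean(axis=(1, 2)) for f in features])  # (3, C) pooled
    e = np.exp(v - v.max(axis=0, keepdims=True))           # stable softmax
    return e / e.sum(axis=0, keepdims=True)                # (3, C)

def channel_attention_feature(features):
    """Weighted sum of the features, weights broadcast over H and W."""
    alpha = channel_attention_weights(features)
    return sum(a[:, None, None] * f for a, f in zip(alpha, features))
```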
5) Generating a channel attention feature according to the infrared feature, the first channel attention weight, the SAR feature, the second channel attention weight, the hyperspectral feature and the third channel attention weight;
in the embodiment of the application, the infrared feature, the first channel attention weight, the SAR feature, the second channel attention weight, the hyperspectral feature and the third channel attention weight may be weighted and summed to obtain the channel attention feature y1
6) Determining a first spatial attention weight corresponding to the infrared features, a second spatial attention weight corresponding to the SAR features, and a third spatial attention weight corresponding to the hyperspectral features;
in the embodiment of the present application, the processing unit 140 may obtain the spatial attention weight by the following formula:
β_i(x, y) = exp(||X_i(x, y)||_1) / Σ_{j=1..3} exp(||X_j(x, y)||_1)    (2)

wherein ||·||_1 denotes the l1 norm, (x, y) denotes a spatial position in the intermediate feature X_i, and X_i(x, y) denotes the C-dimensional vector at that position. That is, the spatial attention weight β_i can be calculated from the intermediate feature X_i by the l1 norm and the softmax operator.
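Analogously to the channel weights, the spatial attention weights can be sketched in numpy: an l1 norm over the C channel values at each spatial position, followed by a softmax across the sources. Function names and shapes are illustrative assumptions, not from the patent.

```python
import numpy as np

def spatial_attention_weights(features):
    """Spatial attention weights beta_i for the (C, H, W) feature maps from
    each source: l1 norm over channels at each position, then a softmax
    across the sources at that position."""
    n = np.stack([np.abs(f).sum(axis=0) for f in features])  # (k, H, W)
    e = np.exp(n - n.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def spatial_attention_feature(features):
    """Weighted sum of the features, weights broadcast over channels."""
    beta = spatial_attention_weights(features)
    return sum(b[None, :, :] * f for b, f in zip(beta, features))
```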
7) Generating a spatial attention feature according to the infrared feature, the first spatial attention weight, the SAR feature, the second spatial attention weight, the hyperspectral feature and the third spatial attention weight;
in this embodiment of the present application, the infrared feature, the first spatial attention weight, the SAR feature, the second spatial attention weight, the hyperspectral feature, and the third spatial attention weight may be weighted and summed to obtain a spatial attention feature y2
8) The channel attention feature, the spatial attention feature, the first compensation feature, the second compensation feature, and the third compensation feature are decoded to reconstruct a fused image.
In the embodiment of the present application, the channel attention feature y_1 and the spatial attention feature y_2 may be summed to obtain a fusion feature y; the fusion feature y is then summed with the first compensation feature, the second compensation feature and the third compensation feature, and the resulting features are decoded to reconstruct the fused image.
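Putting the two attention branches together, the combination step can be sketched as one self-contained numpy function (global average pooling stands in for the pooling operator, and the learned deconvolutional decoder that reconstructs the final image from this tensor is not modeled). All names and shapes are illustrative assumptions.

```python
import numpy as np

def _softmax_over_sources(v):
    """Softmax along axis 0 (the source axis), numerically stabilized."""
    e = np.exp(v - v.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def fuse_features(features, compensations):
    """Combine attention features with the compensation features.

    features / compensations: lists of three (C, H, W) arrays (infrared,
    SAR, hyperspectral). Returns y_1 + y_2 summed with the compensation
    features, i.e. the tensor handed to the decoder.
    """
    X = np.stack(features)                               # (3, C, H, W)
    alpha = _softmax_over_sources(X.mean(axis=(2, 3)))   # channel weights (3, C)
    beta = _softmax_over_sources(np.abs(X).sum(axis=1))  # spatial weights (3, H, W)
    y1 = (alpha[:, :, None, None] * X).sum(axis=0)       # channel attention feature
    y2 = (beta[:, None, :, :] * X).sum(axis=0)           # spatial attention feature
    y = y1 + y2
    for comp in compensations:
        y = y + comp
    return y
```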
The ground control terminal 200 is used for analyzing the fused images received in real time to obtain the oil spill type, the oil spill amount and the oil film thickness.
It should be noted that, in order to enable the unmanned aerial vehicle to perform data fusion on three paths of data with different characteristics in the same scene, a GPS antenna and an auxiliary inertial navigation unit may be provided on the unmanned aerial vehicle. Therefore, when the infrared device, the SAR device and the hyperspectral device send the images acquired by the infrared device, the SAR device and the hyperspectral device to the processing unit, the GPS information is sent to the processing unit, and therefore the processing unit carries out data fusion on acquired data sent by three acquisition devices with different characteristics in the same scene based on the GPS information sent by each acquisition device. In order to facilitate a clearer understanding of the present application by those skilled in the art, a detailed description will be given below with reference to fig. 3.
As shown in fig. 3, the drone may include: an infrared device, an SAR device, a hyperspectral device and a data storage and processing unit; the drone may also include a controller. The infrared device comprises a data acquisition module, a data storage and preliminary processing module and a data output module. The data acquisition module collects infrared spectrum information of different types of spilled oil; the acquisition system calibrates the spectral information of the different oil types; because the main characteristic infrared absorption peaks of different oils lie at different positions, spectral information sample libraries of the different oil types are established. Infrared spectrum information of the oil spill to be detected is then collected, compared against the established sample library, and its oil type is identified and output; for the oil spill to be detected, the 7 different main absorption peak positions with the strongest absorption intensity are selected as the characteristic absorption peaks for determining the oil type. Meanwhile, the data output module is connected to the processing unit of the unmanned aerial vehicle, and the acquired infrared data, after preliminary processing, is transmitted to the processing unit together with the GPS information for later use. The main technical specifications of the thermal infrared imager are: working wavelength: 3-5 μm; sensor resolution: 640 × 512; field of view: dual field of view.
In this example, the hyperspectral device includes: a hyperspectral imager, GPS-assisted inertial navigation, a data storage and processing module and a data output module. The hyperspectral imager adopts a Fourier infrared spectrometer to collect and image hyperspectral data. The GPS-assisted inertial navigation is connected with the data acquisition control equipment in the module through an RS232 serial port; it records the flight attitude and position information of the aircraft and synchronously transmits them, together with the spectral data, to the data processing module for subsequent spectral image processing. The data processing module first performs spectral calibration and radiometric correction to obtain a radiometrically corrected sea surface monitoring image, then performs geometric correction on the image according to the synchronized flight longitude/latitude position information and flight attitude parameters, and finally stitches the geometrically corrected consecutive hyperspectral frames to obtain a large-area preprocessed sea surface monitoring area image. The main technical specifications of the hyperspectral imager are: wavelength range: 470-900 nm; channel width: 5-10 nm; number of channels: 160; detector: Si CMOS; digital resolution: 10 bit/8 bit; hyperspectral imaging speed: 30 cubes/s; cube resolution: 2048 × 1088 pixels; ground resolution: 9 cm at 180 m.
The SAR device includes: an SAR data acquisition module, a data processing module, a GPS-assisted inertial navigation module and an image output module. The SAR image is acquired by the SAR data acquisition module and input to the data processing module, which first applies one pass of mean filtering and then one pass of maximum entropy thresholding to segment the dark sea area and part of the oil spill dark spots, and removes the dark spots by morphology; the processed SAR image, carrying the GPS information, is then transmitted to the processing unit of the unmanned aerial vehicle. The main technical specifications of the SAR imaging radar are: imaging distance: ≥ 15 km; signal bandwidth: continuously adjustable from 10 MHz to 300 MHz; imaging width: 4 km (swath), 10 km (scan), 2 km × 2 km (spotlight); imaging resolution: 2 m × 2 m (swath), 5 m × 5 m (scan), 1 m × 1 m (spotlight).
The data storage and processing unit mainly performs the image fusion step. Before fusion, the images from the three modules are registered; the present application uses image registration based on SIFT feature points. Feature-level image fusion is adopted, and because the information is compressed, onboard processing is feasible. The present application fuses images with the symmetric encoder-decoder of the attention fusion method: the encoder extracts intermediate features and compensation features from the images transmitted by the three devices, each generated from the input image by three convolutional layers. With the attention mechanism, two attention maps derived from the intermediate features are multiplied with the intermediate features for fusion. The significant compensation features obtained by element-wise selection are passed to the corresponding deconvolution layers for processing. Finally, the fused intermediate features and the selected compensation features are decoded to reconstruct the fused image.
Optionally, in this embodiment of the application, when the ground control terminal receives the fused image sent by the controller of the unmanned aerial vehicle, it may display the image received in real time on a display, so that ground personnel can intuitively observe the sea surface oil spill.
In conclusion, to address the problems that SAR oil spill detection has a high false alarm rate and that a single sensor cannot acquire oil film thickness information over the full range, the present application carries out oil spill observation experiments with an integrated unmanned-aerial-vehicle-borne hyperspectral, SAR and thermal infrared system. The characteristic differences of oil films, oil-like films and the clean sea surface in the hyperspectral, thermal infrared and SAR data are acquired and analyzed, a combination of sensitive features for distinguishing the three is constructed, and a low-false-alarm-rate joint oil film detection model based on the unmanned aerial vehicle's hyperspectral, thermal infrared and SAR data is developed. The differences of oil films of different thicknesses in visible/near-infrared hyperspectral reflectance, thermal infrared radiance brightness temperature, image texture and other aspects are acquired and analyzed, characteristic wavebands and spectral features related to oil film thickness are extracted, and a thin-oil-film thickness inversion method based on hyperspectral data and a thick-oil-film thickness inversion method based on thermal infrared data are constructed, respectively, realizing oil film thickness extraction.
In order to implement the above embodiment, the present application further provides an oil spilling detection method.
Fig. 4 is a flowchart of an oil spill detection method according to an embodiment of the present application. As shown in fig. 4, the oil spill detection method includes the following steps.
In step 401, an infrared spectrum fused image transmitted by an infrared device is acquired.
The infrared spectrum fusion image comprises infrared spectrum information of the oil spill to be detected and corresponding first GPS information. In this embodiment, the infrared device may further select a plurality of different main absorption peak positions with the strongest absorption intensity from the infrared spectrum information of the oil spill to be detected, as characteristic absorption peaks for determining the type of the oil spill to be detected, and obtain the oil type of the oil spill to be detected from a pre-established spectrum information sample library based on the plurality of different main absorption peak positions.
In step 402, a SAR fusion image transmitted by a SAR device is acquired.
In the embodiment of the application, the SAR fusion image comprises an SAR image of the oil spill to be detected and corresponding second GPS information.
In step 403, a sea surface monitoring area fusion image sent by the hyperspectral device is obtained.
In the embodiment of the present application, the sea surface monitoring region fused image includes a sea surface monitoring region image and GPS information.
In step 404, data fusion is performed on three paths of data with different characteristics in the same scene based on the infrared spectrum fusion image sent by the infrared device, the SAR fusion image sent by the SAR device, and the sea surface monitoring area fusion image sent by the hyperspectral device, so as to obtain a fusion image.
In the embodiment of the application, the fusion may proceed as follows: extract the infrared feature and a first compensation feature from the infrared spectrum fusion image; extract the SAR feature and a second compensation feature from the SAR fusion image; and extract the hyperspectral feature and a third compensation feature from the sea surface monitoring area fusion image. Then determine a first channel attention weight corresponding to the infrared feature, a second channel attention weight corresponding to the SAR feature, and a third channel attention weight corresponding to the hyperspectral feature, and generate a channel attention feature from the infrared, SAR and hyperspectral features and their respective channel attention weights. Likewise, determine a first spatial attention weight corresponding to the infrared feature, a second spatial attention weight corresponding to the SAR feature, and a third spatial attention weight corresponding to the hyperspectral feature, and generate a spatial attention feature from the three features and their respective spatial attention weights. Finally, decode the channel attention feature, the spatial attention feature, and the first, second and third compensation features to reconstruct the fused image.
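The steps above can be sketched in code. This is a non-authoritative sketch under two assumptions not stated in the application: that the channel attention is a softmax across the three modalities of globally average-pooled activations, and that the spatial attention is a per-pixel l1-norm ratio across the modalities; the decoder is stood in for by a simple channel-wise concatenation.

```python
import numpy as np

def fuse_modalities(x_ir, x_sar, x_hsi):
    """Fuse three (C, H, W) intermediate feature maps (infrared, SAR,
    hyperspectral) with channel and spatial attention.

    Channel attention: per channel, softmax across the three modalities
    of the globally average-pooled activations (assumption).
    Spatial attention: per pixel, the l1 norm of each modality's
    C-dimensional vector, normalised across the three modalities
    (assumption).  The decoder is replaced here by concatenation.
    """
    xs = [x_ir, x_sar, x_hsi]
    # P(): global average pooling over the spatial dimensions -> (3, C)
    pooled = np.stack([x.mean(axis=(1, 2)) for x in xs])
    e = np.exp(pooled - pooled.max(axis=0, keepdims=True))  # stable softmax
    alpha = e / e.sum(axis=0, keepdims=True)                # (3, C)
    ch_feat = sum(a[:, None, None] * x for a, x in zip(alpha, xs))
    # per-pixel l1 norms over the channel dimension -> (3, H, W)
    norms = np.stack([np.abs(x).sum(axis=0) for x in xs])
    beta = norms / (norms.sum(axis=0, keepdims=True) + 1e-8)
    sp_feat = sum(b[None, :, :] * x for b, x in zip(beta, xs))
    return np.concatenate([ch_feat, sp_feat], axis=0)       # (2C, H, W)
```

A real implementation would learn the feature extractors and the decoder; the attention normalisations shown here only illustrate how three heterogeneous feature maps can be weighted and combined into one tensor.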
As an example, the channel attention weight may be obtained by the following formula:
α_i(n) = exp(P(X_i)(n)) / Σ_{j=1}^{3} exp(P(X_j)(n)), i ∈ {1, 2, 3}

wherein

P(X_i) = (1 / (H·W)) Σ_{x=1}^{H} Σ_{y=1}^{W} X_i(x, y)

X_i represents the intermediate features, i ∈ {1, 2, 3}; X_1 represents the infrared feature, X_2 the SAR feature, and X_3 the hyperspectral feature; n represents the corresponding channel index in the intermediate feature X_i; P(·) represents the global pooling operator; H and W denote the spatial height and width of the intermediate feature; α_1, α_2 and α_3 denote the first, second and third channel attention weights, respectively.
In the present embodiment, the spatial attention weight can be obtained by the following formula:
β_i(x, y) = ||X_i(x, y)||_1 / Σ_{j=1}^{3} ||X_j(x, y)||_1

wherein ||·||_1 denotes the l_1 norm, (x, y) indexes each spatial position in the intermediate feature X_i and in the spatial attention weight β_i, and X_i(x, y) denotes the C-dimensional vector of the intermediate feature at position (x, y).
In step 405, the obtained fusion image is sent to the ground control end, so that the ground control end analyzes the fusion image received in real time to obtain the type of oil spill, the amount of oil spill and the thickness of the oil film.
It should be noted that the foregoing explanation of the embodiment of the oil spill detection system also applies to the oil spill detection method of this embodiment, and is not repeated here.
According to the oil spill detection method, the infrared device, the SAR device and the hyperspectral device are carried on the unmanned aerial vehicle simultaneously as a single information acquisition system, so that one flight obtains three channels of data with different characteristics for the same scene and transmits them to the processing unit of the unmanned aerial vehicle. The processing unit fuses the acquired data and transmits the detection image over the communication link between the unmanned aerial vehicle and the ground control end. The ground control end analyzes the fused image received in real time to determine the oil spill type, the oil spill amount and the oil film thickness. In this way, oil spill image data are collected by multiple sensors, and image data with different characteristics are combined so that their respective strengths compensate for one another's weaknesses; the target characteristics are reflected more comprehensively, providing stronger information interpretation capability and more reliable analysis results. The application range of each image data source is thereby extended, and the detection accuracy, application effect and practical value of oil spill detection are improved.
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Further, in the description of the present application, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of implementing the embodiments of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Further, the computer readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. While embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations in the above embodiments may be made by those of ordinary skill in the art within the scope of the present application.

Claims (8)

1. An oil spill detection system, comprising: an unmanned aerial vehicle and a ground control terminal, wherein,
the unmanned aerial vehicle comprises an infrared device, a Synthetic Aperture Radar (SAR) device, a hyperspectral device and a processing unit; wherein:
the infrared device is used for acquiring infrared spectrum information of oil spilling to be detected and first GPS information of the oil spilling to be detected, fusing the infrared spectrum information and the first GPS information to obtain an infrared spectrum fused image, and sending the infrared spectrum fused image to the processing unit;
the SAR device is used for acquiring an SAR image aiming at the oil spill to be detected and corresponding second GPS information, fusing the SAR image and the corresponding second GPS information to obtain an SAR fusion image, and sending the SAR fusion image to the processing unit;
the hyperspectral device is used for acquiring hyperspectral data and GPS information of the sea surface, imaging the acquired hyperspectral data to obtain a sea surface monitoring area image, fusing the sea surface monitoring area image and the acquired GPS information to obtain a sea surface monitoring area fusion image, and sending the sea surface monitoring area fusion image to the processing unit;
the processing unit is used for carrying out data fusion on three paths of data with different characteristics in the same scene based on the infrared spectrum fusion image sent by the infrared device, the SAR fusion image sent by the SAR device and the sea surface monitoring area fusion image sent by the hyperspectral device so as to obtain a fusion image, and sending the fusion image to the ground control terminal;
and the ground control terminal is used for analyzing the fusion image received in real time to obtain the type of oil spill, the amount of oil spill and the thickness of an oil film.
2. The system according to claim 1, wherein the infrared device is further configured to select a plurality of different main absorption peak positions with the strongest absorption intensity from the infrared spectrum information of the oil spill to be detected, as characteristic absorption peaks for determining the type of the oil spill to be detected, and obtain the oil type of the oil spill to be detected from a pre-established spectrum information sample library based on the plurality of different main absorption peak positions.
3. The system of claim 1, wherein the hyperspectral apparatus comprises:
the hyperspectral imager is used for acquiring hyperspectral data of the sea surface;
the GPS-aided inertial navigation unit is used for recording the flight attitude and the longitude-latitude position information of the unmanned aerial vehicle in flight, and for synchronously transmitting the recorded flight attitude and position information, together with the hyperspectral data acquired by the hyperspectral imager, to the data processing module;
the data processing module is used for carrying out spectrum calibration and radiation correction on the data transmitted by the hyperspectral imager to obtain a sea surface monitoring image after radiation correction, carrying out geometric correction on the sea surface monitoring image according to the obtained synchronous flight longitude and latitude position information and flight attitude parameters to obtain a sea surface monitoring image after geometric correction, and splicing continuous multiframe hyperspectral image data after geometric correction to obtain a large-area preprocessed sea surface monitoring area image.
4. The system of claim 3, wherein the hyperspectral imager is a Fourier infrared spectrometer.
5. The system of claim 1, wherein the processing unit is specifically configured to:
extracting infrared features and first compensation features in the infrared spectrum fusion image;
extracting SAR features and second compensation features in the SAR fusion image;
extracting hyperspectral features and third compensation features in the sea surface monitoring area fusion image;
determining a first channel attention weight corresponding to the infrared feature, a second channel attention weight corresponding to the SAR feature, and a third channel attention weight corresponding to the hyperspectral feature;
generating a channel attention feature according to the infrared feature, the first channel attention weight, the SAR feature, the second channel attention weight, the hyperspectral feature and the third channel attention weight;
determining a first spatial attention weight corresponding to the infrared feature, a second spatial attention weight corresponding to the SAR feature, and a third spatial attention weight corresponding to the hyperspectral feature;
generating a spatial attention feature according to the infrared feature, the first spatial attention weight, the SAR feature, the second spatial attention weight, the hyperspectral feature and the third spatial attention weight;
decoding the channel attention feature, the spatial attention feature, the first compensation feature, the second compensation feature, and the third compensation feature to reconstruct the fused image.
6. The system of claim 5, wherein the processing unit obtains a channel attention weight by the following equation:
α_i(n) = exp(P(X_i)(n)) / Σ_{j=1}^{3} exp(P(X_j)(n)), i ∈ {1, 2, 3}

wherein

P(X_i) = (1 / (H·W)) Σ_{x=1}^{H} Σ_{y=1}^{W} X_i(x, y)

X_i represents the intermediate features, i ∈ {1, 2, 3}; X_1 represents said infrared feature, X_2 represents said SAR feature, and X_3 represents said hyperspectral feature; n represents the corresponding channel index in the intermediate feature X_i; P(·) represents a global pooling operator; H and W denote the spatial height and width of the intermediate feature; α_1 represents said first channel attention weight, α_2 represents said second channel attention weight, and α_3 represents said third channel attention weight.
7. The system of claim 5, wherein the processing unit obtains the spatial attention weight by:
β_i(x, y) = ||X_i(x, y)||_1 / Σ_{j=1}^{3} ||X_j(x, y)||_1

wherein ||·||_1 denotes the l_1 norm, (x, y) indexes each spatial position in the intermediate feature X_i and in the spatial attention weight β_i, and X_i(x, y) denotes the C-dimensional vector of the intermediate feature at position (x, y).
8. An oil spill detection method, comprising:
acquiring an infrared spectrum fusion image sent by an infrared device; the infrared spectrum fusion image comprises infrared spectrum information of the oil spill to be detected and corresponding first GPS information;
acquiring an SAR fusion image sent by an SAR device; the SAR fusion image comprises the SAR image of the oil spill to be detected and corresponding second GPS information;
acquiring a sea surface monitoring area fusion image sent by a hyperspectral device; the sea surface monitoring area fusion image comprises a sea surface monitoring area image and GPS information;
performing data fusion on three paths of data with different characteristics in the same scene based on the infrared spectrum fusion image sent by the infrared device, the SAR fusion image sent by the SAR device and the sea surface monitoring area fusion image sent by the hyperspectral device to obtain a fusion image;
and sending the fused image to a ground control end so that the ground control end analyzes the fused image received in real time to obtain the type of oil spill, the amount of oil spill and the thickness of an oil film.
CN202110268794.XA 2021-03-12 2021-03-12 Oil spill detection system and method Pending CN113156420A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110268794.XA CN113156420A (en) 2021-03-12 2021-03-12 Oil spill detection system and method


Publications (1)

Publication Number Publication Date
CN113156420A true CN113156420A (en) 2021-07-23

Family

ID=76887334

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110268794.XA Pending CN113156420A (en) 2021-03-12 2021-03-12 Oil spill detection system and method

Country Status (1)

Country Link
CN (1) CN113156420A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2621668A (en) * 2022-06-02 2024-02-21 Univ Zhejiang System and method for acquiring hyperspectral image on the basis of inertial navigation system data

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103778627A (en) * 2014-01-02 2014-05-07 北京理工大学 Sea oil spill detection method based on SAR image
CN106872369A (en) * 2017-02-20 2017-06-20 交通运输部水运科学研究所 The airborne hyperspectral imaging system and method for a kind of spilled oil monitoring
CN110907388A (en) * 2019-11-22 2020-03-24 光钙(上海)高科技有限公司 Oil spill type identification method based on infrared spectroscopic analysis
CN111667489A (en) * 2020-04-30 2020-09-15 华东师范大学 Cancer hyperspectral image segmentation method and system based on double-branch attention deep learning
CN111815639A (en) * 2020-07-03 2020-10-23 浙江大华技术股份有限公司 Target segmentation method and related device thereof
CN111832620A (en) * 2020-06-11 2020-10-27 桂林电子科技大学 Image emotion classification method based on double-attention multilayer feature fusion
CN212008943U (en) * 2020-03-26 2020-11-24 北京农业信息技术研究中心 High-flux three-dimensional scanning spectral imaging measuring device


Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
DABBIRU L, SAMIAPPAN S, NOBREGA R A A, AANSTOOS J A, YOUNAN N H: "Fusion of synthetic aperture radar and hyperspectral imagery to detect impacts of oil spill in Gulf of Mexico", IEEE International Symposium on Geoscience and Remote Sensing (IGARSS) *
M. LENNON, N. THOMAS, V. MARIETTE, S. BABICHENKO, G. MERCIER: "Oil Slick Detection and Characterization by Satellite and Airborne Sensors: Experimental Results with SAR, Hyperspectral and Lidar Data", IEEE International Symposium on Geoscience and Remote Sensing *
LIU Zhongyu: "Graph Neural Networks Explained: An Analysis of GNN Principles" (深入浅出图神经网络 GNN原理解析), 30 April 2020 *
XIA Wei: "Emergency Response to Marine Oil Spills Based on Remote Sensing Technology", China Water Transport (中国水运) *
ZHANG Jianbin et al.: "Decision Support Technologies for Marine Oil Spill Emergency Response" (海上溢油应急辅助决策技术), 31 December 2012 *
ZHANG Jie et al.: "Ocean Remote Sensing Detection Technology and Applications" (海洋遥感探测技术与应用), 31 August 2017 *
ZHANG Yuzhou, CHEN Zhili, HU Tangao, ZHANG Dengrong: "Status and Trends of Remote Sensing Monitoring of Marine Oil Spills", Journal of Hangzhou Normal University (Natural Science Edition) *
YANG Boxiong et al.: "Research on Deep Learning Theory and Practice Based on High-Performance Computing" (基于高性能计算的深度学习理论与实践研究), 3 December 2019 *


Similar Documents

Publication Publication Date Title
US11022541B2 (en) Polarimetric detection of foreign fluids on surfaces
US7916933B2 (en) Automatic target recognition system for detection and classification of objects in water
CN111781146B (en) Wave parameter inversion method using high-resolution satellite optical image
JPH1183478A (en) Global information supply system
Liu et al. A downscaled bathymetric mapping approach combining multitemporal Landsat-8 and high spatial resolution imagery: Demonstrations from clear to turbid waters
CN110703244A (en) Method and device for identifying urban water body based on remote sensing data
Wang et al. Sea ice classification with convolutional neural networks using Sentinel-L scansar images
Nuthammachot et al. Fusion of Sentinel-1A and Landsat-8 images for improving land use/land cover classification in Songkla Province, Thailand.
CN113156420A (en) Oil spill detection system and method
CN111561916B (en) Shallow sea water depth uncontrolled extraction method based on four-waveband multispectral remote sensing image
Bostater Jr et al. Hyperspectral signatures and WorldView-3 imagery of Indian River Lagoon and Banana River Estuarine water and bottom types
Xu et al. Marine radar oil spill monitoring technology based on dual-threshold and c–v level set methods
Cao et al. ICESAT-2 shallow bathymetric mapping based on a size and direction adaptive filtering algorithm
Loomis et al. Depth derivation from the WorldView-2 satellite using hyperspectral imagery
Bereta et al. Vessel traffic density maps based on vessel detection in satellite imagery
Bulgin et al. Improving the combined use of reflectance and thermal channels for ocean and coastal cloud detection for the Sea and Land Surface Temperature Radiometer (SLSTR)
Amin et al. Automated detection and removal of cloud shadows on HICO images
Imperatore et al. Contribution of super resolution to 3D reconstruction from pairs of satellite images
Vargas et al. Dense bathymetry in turbid coastal zones using airborne hyperspectral images
Good et al. Absolute airborne thermal SST measurements and satellite data analysis from the deepwater horizon oil spill
Sarala et al. Digital image processing–a remote sensing perspective
Gelautz et al. Automated matching experiments with different kinds of SAR imagery
Couchman-Crook et al. NovaSAR and SSTL S1-4: SAR and EO Data Fusion
Maier Direct multispectral photogrammetry for UAV-based snow depth measurements
Papa et al. GAMES

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210723

RJ01 Rejection of invention patent application after publication