CN117372893A - Flood disaster assessment method based on improved remote sensing image feature matching algorithm

Flood disaster assessment method based on improved remote sensing image feature matching algorithm

Info

Publication number
CN117372893A
Authority
CN
China
Prior art keywords: image, feature, disaster, images, point
Legal status: Pending
Application number
CN202310069813.5A
Other languages
Chinese (zh)
Inventor
王龙宝
杨青青
徐淑芳
储洪强
毛莺池
徐荟华
栾茵琪
张珞弦
Current Assignee
Hohai University HHU
Original Assignee
Hohai University HHU
Application filed by Hohai University HHU
Priority to CN202310069813.5A
Publication of CN117372893A

Classifications

    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V10/757 Matching configurations of points or features
    • G06V10/764 Image or video recognition or understanding using classification, e.g. of video objects
    • G06V10/82 Image or video recognition or understanding using neural networks
    • G06T2207/10024 Color image
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/20084 Artificial neural networks [ANN]
    • Y02A10/40 Controlling or monitoring, e.g. of flood or hurricane; Forecasting, e.g. risk assessment or mapping


Abstract

The invention discloses a flood disaster assessment method based on an improved remote sensing image feature matching algorithm, which comprises the following steps: images of the same area at different stages of the disaster are collected by an unmanned aerial vehicle, images shot before the flood occurs are designated as template images, and images from the middle and later stages of the disaster as transformed images; SURF feature point detection is carried out on the two collected remote sensing images; the detected feature points are described with BRIEF feature descriptors, and the robustness of the descriptors to scale and illumination is assessed after noise reflecting natural environmental influences is added to the disaster images; feature points are matched between the template image and the transformed image with an improved SURF algorithm; and the coincident feature points are compared, the extent of the flooded area is automatically extracted from the remote sensing image, changes in the flooded area are accurately identified, flooding-change information is extracted from the matching result, and the severity of the flood disaster is quantified.

Description

Flood disaster assessment method based on improved remote sensing image feature matching algorithm
Technical Field
The invention relates to the field of post-disaster monitoring and evaluation of natural disasters, and in particular to a flood disaster assessment method based on an improved remote sensing image feature matching algorithm.
Background
Flood disasters are among the most serious natural disasters in the world, accounting for 40% of all natural disasters worldwide, and they tend to strike places with dense populations, intensive agriculture, concentrated rivers and lakes, and abundant rainfall. Under the influence of the continental monsoon climate, rainfall in most areas of China is concentrated in time and high in intensity, making storm floods one of the main sources of flood disasters. Because of their wide range, high frequency, sudden onset, and heavy losses, flood disasters seriously threaten the national economy and people's lives and property. Monitoring flood disaster risk and the vulnerability of disaster-bearing bodies, strengthening disaster prediction and assessment systems, and building disaster prevention and mitigation facilities are urgent tasks in current disaster prevention work and disaster research, and satellite remote sensing imagery, with its speed, timeliness, and wide field of view, has gradually become an important means of modern flood monitoring and assessment.
Feature extraction and matching between pre- and post-disaster remote sensing images are the basis and key of disaster monitoring and assessment. In recent years, intelligent computing in the remote sensing field and the rapid development of image feature extraction methods have enabled researchers to extract the flooded extent and the distribution of affected objects quickly and accurately for different ground features in complex environments. An image feature extraction method proceeds in three steps: feature point detection, feature point description, and feature point matching. Most existing methods abstract image features such as texture and color with an eye to distinctiveness and invariance; however, when these three steps are applied to the rapid extraction of flood disaster feature information and the rapid identification of affected objects in remote sensing images, each faces its own shortcomings.
First, in the feature point detection stage, existing methods for locating feature points in remote sensing images do not consider a more optimized filter design and sampling scheme; monitoring and assessing a flood disaster requires comprehensively comparing and matching a large number of pre-disaster, mid-disaster, and post-disaster remote sensing images, and the huge amount of computation slows down matching. Second, whether a lower-dimensional remote sensing image feature descriptor can be constructed for feature description largely determines the memory footprint of the system, and hence the system overhead. Finally, in the matching stage, existing image matching methods are not accurate enough to perform well in practical disaster assessment applications. In addition, when using remote sensing image feature descriptors to handle the various transformations caused by a disaster robustly, and to analyze the differences in geometric structure and spectral characteristics between the flooding water body and the affected objects, it is also important to optimize the details while keeping the image matching invariant and robust.
Therefore, a highly real-time remote sensing image matching method with stable features that is invariant to rotation, scale, and illumination has become the method of choice for rapid flood assessment.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a flood disaster assessment method based on an improved remote sensing image feature matching algorithm. Using multi-temporal remote sensing images, the method identifies the change features of the inundated area in remote sensing images of different periods by comparing feature point changes before, during, and after the disaster in the affected area, realizing dynamic monitoring of the disaster process and assessment of post-disaster impact.
The technical scheme is as follows: in order to achieve the above purpose, the invention provides a flood disaster assessment method based on an improved remote sensing image feature matching algorithm, which comprises the following steps:
S1: UAV remote sensing can photograph the same area many times, from different angles and at different levels, and analyzing the captured photographs yields the current situation at the flood disaster site; images of the same area at different stages of the disaster are collected by an unmanned aerial vehicle, images shot before the flood occurs are designated as template images, and images from the middle and later stages of the disaster as transformed images;
S2: SURF feature point detection is carried out on the two collected remote sensing images; the SURF algorithm detects keypoints with a fast Hessian detector, and the SURF operator describes the neighborhood of each keypoint with a feature vector;
S3: the detected feature points are described with BRIEF feature descriptors, and the robustness of the descriptors to scale and illumination is assessed after noise reflecting natural environmental influences is added to the disaster images;
S4: feature points are matched between the template image and the transformed image with the improved SURF algorithm;
S5: the numbers of coincident feature points are compared, and all feature points detected in the template image and in the transformed image are fully connected, respectively; the largest area enclosed by the connections is the extent of the flood-inundated area. Since the template image represents the scene before the flood occurs, the inundated extents in the subsequent transformed images are recorded in chronological order and their areas computed; the change in this area value shows whether the flood water level is rising. If the area grows, the water level is rising and must be reported upward in time, and at the same time a comprehensive disaster risk zone is delineated and an early warning issued. A sketch of this area comparison follows below.
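As a concrete illustration of the S5 area comparison, here is a minimal sketch, assuming the inundated extent is approximated by the convex hull of the matched feature points; flooded_area and water_level_rising are hypothetical helper names, not part of the patent.

```python
import numpy as np
import cv2

def flooded_area(points: np.ndarray) -> float:
    """Area (in pixels) of the convex hull spanned by matched feature points.

    points: (N, 2) array of pixel coordinates, N >= 3.
    """
    hull = cv2.convexHull(points.astype(np.float32))
    return cv2.contourArea(hull)

def water_level_rising(template_points: np.ndarray, later_points: list) -> list:
    """Compare hull areas in chronological order against the pre-flood image.

    Returns one boolean per later image: True when the inundated extent grew,
    i.e. when an upward report and an early warning would be triggered.
    """
    base_area = flooded_area(template_points)
    return [flooded_area(p) > base_area for p in later_points]
```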
Further, the specific operation of SURF feature point detection in step S2 is as follows:
S21: constructing the Hessian matrix: a Hessian matrix is computed for each pixel in the disaster image. This matrix of second derivatives in the x and y directions measures the local curvature of a function, its determinant measures the amount of variation around the pixel, and feature points are taken at extreme points of the determinant. A box filter replaces the Gaussian filter used in SIFT, and an integral image (from which each box-filter response is obtained from four corner values) greatly increases the computation speed;
s22: constructing an image pyramid according to the generated multiple Hessian determinant images;
S23: feature point localization: the strengths of candidate feature points are compared, and maximum or minimum points are selected as preliminarily determined keypoints;
s24: the principal direction assignment of the feature points is determined by calculation of Haar wavelet responses.
The specific operation in step S21 is as follows:
S211: for flood images shot by an unmanned aerial vehicle, the detected feature points must be scale-invariant, rotation-invariant, and viewpoint-invariant, so Gaussian filtering is performed before the Hessian matrix is constructed, using the convolution of a Gaussian kernel G(t) with the image at point x:

$$L(x,t) = G(t) * I(x,t)$$

where $L(x,t)$ is the representation of the image at different resolutions, $*$ denotes the convolution operation, and $t$ is the Gaussian variance.

S212: because the discrete pixels arise from template convolution, to speed up emergency assessment in the flood disaster environment, the two steps of Gaussian smoothing and second-derivative computation in the Hessian are merged into one, approximated with a box filter, and evaluated with an integral image;

S213: the value of the discriminant is the determinant of the Hessian matrix H, i.e. the product of its eigenvalues, so all points can be classified by the sign of the result, and whether a point is an extreme point is judged from that sign. For an image $f(x,y)$, the Hessian matrix is:

$$H(f(x,y)) = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x \partial y} \\ \dfrac{\partial^2 f}{\partial x \partial y} & \dfrac{\partial^2 f}{\partial y^2} \end{bmatrix}$$

The Hessian matrix discriminant is:

$$\det(H_{\mathrm{approx}}) = D_{xx} D_{yy} - (\omega D_{xy})^2$$

where $D_{xx}$, $D_{xy}$, and $D_{yy}$ are the approximate convolution values obtained with the box filter: $D_{xx}$ is the second partial derivative of the current point in the horizontal direction, $D_{yy}$ in the vertical direction, and $D_{xy}$ the mixed horizontal-vertical second partial derivative.

The $f(x,y)$ in the Hessian matrix discriminant is the Gaussian convolution of the original image. Since the Gaussian kernel follows a normal distribution, its coefficients decay outward from the center point; to increase the computation speed, SURF approximately replaces the Gaussian filter with a box filter, so $D_{xy}$ is multiplied by a weighting factor of 0.9 to balance the error introduced by the box-filter approximation.
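The two speed-ups just described, the integral image and the box-filter approximation of the Hessian determinant, can be sketched in a few lines; this is a generic illustration of the standard SURF formulation, not code from the patent.

```python
import numpy as np

def integral_image(img: np.ndarray) -> np.ndarray:
    """Summed-area table, padded with a zero row/column for clean indexing."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def box_sum(ii: np.ndarray, r0: int, c0: int, r1: int, c1: int) -> float:
    """Sum of img[r0:r1, c0:c1] obtained from exactly four corner lookups."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

def det_hessian_approx(dxx: float, dyy: float, dxy: float, w: float = 0.9) -> float:
    """The discriminant given above: det(H_approx) = Dxx*Dyy - (w*Dxy)^2,
    where Dxx, Dyy, Dxy would be box-filter responses built from box_sum."""
    return dxx * dyy - (w * dxy) ** 2
```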
The construction of the image pyramid is divided into two parts:
s221: carrying out Gaussian blur on the images with different scales;
s222: the image is downsampled (spaced-point sampling).
Increasing the size of the filter template while keeping the image fixed, and computing at sampled points, is equivalent to downsampling the image and changing the template scale; this saves the downsampling step and speeds up emergency assessment (see the sketch below).
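For illustration, the following sketch lays out box-filter side lengths per octave following the original SURF paper's convention (a 9x9 starting filter with the step doubling each octave); these particular numbers are an assumption, since the text above does not fix them.

```python
def surf_filter_sizes(octaves: int = 3, levels: int = 4) -> list:
    """Box-filter side lengths per octave: the filter grows instead of the
    image shrinking, so no downsampling pass is needed."""
    sizes, base, step = [], 9, 6
    for _ in range(octaves):
        sizes.append([base + i * step for i in range(levels)])
        base, step = base + step, step * 2  # next octave: shifted start, doubled spacing
    return sizes

print(surf_filter_sizes())  # [[9, 15, 21, 27], [15, 27, 39, 51], [27, 51, 75, 99]]
```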
The Haar wavelet response is computed as follows. Unlike the SIFT algorithm, the SURF algorithm does not build a gradient histogram; to ensure rotational invariance it instead accumulates Haar wavelet features in the neighborhood of the feature point. Within a circular neighborhood of radius 6s centered on the feature point (s is the scale value of the feature point), the sums of the Haar wavelet responses in the x (horizontal) and y (vertical) directions of all points inside a 60-degree sector are computed (the Haar wavelet side length is 4s). The response values are weighted with Gaussian coefficients so that responses near the feature point contribute more and responses far from it contribute less; the responses within the 60-degree window are then added to form a new vector. The sector is swept over the whole circular area, and the direction of the longest vector is selected as the main direction of the feature point.
Since the Haar wavelet responses reflect grey-level changes in the image, this main direction describes the regions where grey levels change most sharply, i.e. the direction of the areas where the flood disaster occurs. The sliding-sector search is sketched below.
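A minimal sketch of the sliding-sector search, assuming dx, dy, and angles are precomputed Gaussian-weighted Haar responses and their polar angles for the points in the 6s neighborhood (hypothetical inputs; the 5-degree sliding step is a common choice, not stated above).

```python
import numpy as np

def main_direction(dx: np.ndarray, dy: np.ndarray, angles: np.ndarray) -> float:
    """Sweep a 60-degree sector around the circle; return the angle of the
    longest summed response vector as the feature point's main direction."""
    best_len, best_theta = -1.0, 0.0
    for start in np.arange(0.0, 2 * np.pi, np.pi / 36):   # slide in 5-degree steps
        in_sector = (angles - start) % (2 * np.pi) < np.pi / 3
        vx, vy = dx[in_sector].sum(), dy[in_sector].sum()
        length = float(np.hypot(vx, vy))
        if length > best_len:
            best_len, best_theta = length, float(np.arctan2(vy, vx))
    return best_theta
```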
A square window is taken around the feature point, oriented along its main direction, and divided into 4 x 4 = 16 subregions. In each subregion the Haar wavelet features in the horizontal and vertical directions are counted: the sum of the horizontal values, the sum of the horizontal absolute values, the sum of the vertical values, and the sum of the vertical absolute values, forming a 16 x 4 = 64-dimensional feature vector.
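A minimal sketch of this 64-dimensional layout follows, assuming haar_dx and haar_dy are Haar-response patches already aligned with the main direction (hypothetical inputs); the final normalization is a common convention rather than something stated above.

```python
import numpy as np

def surf_descriptor(haar_dx: np.ndarray, haar_dy: np.ndarray) -> np.ndarray:
    """64-d vector: 4x4 subregions x (sum dx, sum |dx|, sum dy, sum |dy|)."""
    h, w = haar_dx.shape
    feats = []
    for i in range(4):
        for j in range(4):
            rows = slice(i * h // 4, (i + 1) * h // 4)
            cols = slice(j * w // 4, (j + 1) * w // 4)
            dx, dy = haar_dx[rows, cols], haar_dy[rows, cols]
            feats += [dx.sum(), np.abs(dx).sum(), dy.sum(), np.abs(dy).sum()]
    v = np.asarray(feats)                    # 16 x 4 = 64 dimensions
    return v / (np.linalg.norm(v) + 1e-12)   # unit norm for illumination robustness
```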
Further, the specific operations for describing the feature points in step S3 are as follows:
Once the feature points are detected, effective feature descriptors are selected to represent them. For each feature point, n point pairs selected at random within the 4s x 4s neighborhood around it are used to generate the corresponding descriptor, where s is the scale factor of the feature point, so the descriptor is a binary string of length n. The position, scale, direction, and binary comparison information of the feature point are recorded as its descriptor, and the 2n sampling points $(x_i, y_i)$, $i = 1, 2, \dots, 2n$, form the $2 \times 2n$ matrix $A$:

$$A = \begin{pmatrix} x_1 & x_2 & \cdots & x_{2n} \\ y_1 & y_2 & \cdots & y_{2n} \end{pmatrix}$$

Using the main direction $\theta$, the neighborhood rotation matrix $R_\theta$ is computed:

$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$$

and $A$ is rotated to obtain the steered version $A_\theta = R_\theta A$.
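The steering step can be transcribed directly; this is just $A_\theta = R_\theta A$ with the standard 2-D rotation matrix, shown for concreteness.

```python
import numpy as np

def steer_pattern(A: np.ndarray, theta: float) -> np.ndarray:
    """A: 2 x 2n matrix of BRIEF sampling points (one column per point);
    returns the rotated version A_theta = R_theta @ A."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ A
```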
Because the flood disaster images acquired by the unmanned aerial vehicle must monitor changes in the same area over several time periods, the images before and after a change are each affected by noise, perceived visually as illumination and scale changes. These linear changes can be described as follows:

a scale change in grey level:

$$C = a \cdot u$$

a grey-level increment:

$$C = u + o_1$$

a simultaneous scale change and increment:

$$C = a \cdot u + o_1$$

where $u$ denotes the RGB values of the original image, $C$ the RGB values of the changed image, $o_1$ the increment, and $a$ the scale factor.
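A small sketch makes the consequence of this model concrete: a BRIEF-style binary test compares two intensities, and the order of the two values survives any linear change $C = a \cdot u + o_1$ with $a > 0$ (illustrative code under that assumption, not from the patent).

```python
import numpy as np

def apply_change(u: np.ndarray, a: float = 1.0, o1: float = 0.0) -> np.ndarray:
    """Linear photometric model: scale change, increment, or both."""
    return a * u + o1

def binary_test(img: np.ndarray, p: tuple, q: tuple) -> int:
    """One BRIEF bit: compares the intensities at two sampling points."""
    return int(img[p] < img[q])

rng = np.random.default_rng(0)
u = rng.random((32, 32))                    # stand-in pre-change patch
p, q = (4, 7), (20, 11)
c = apply_change(u, a=1.8, o1=0.3)          # simultaneous scale and increment
assert binary_test(u, p, q) == binary_test(c, p, q)  # the descriptor bit is unchanged
```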
Further, the operation steps of the step S4 are as follows:
s41: matching the binary descriptors by utilizing FLANN;
S42: a mechanism for filtering false matches is introduced: false matches are screened with the Hamming distance, taking twice the minimum Hamming distance among the matched pairs as the criterion; a pair whose Hamming distance exceeds this value is regarded as a false match and filtered out, while a pair within it is regarded as a correct match.
The formula for the Hamming distance in step S42 is:

$$D(x, y) = \sum_{i=0}^{n-1} x_i \oplus y_i$$

where $i = 0, 1, \dots, n-1$, $x$ and $y$ are both n-bit codes, and $\oplus$ denotes exclusive OR.
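Steps S41-S42 might be assembled as below. The LSH index is the usual FLANN configuration for binary descriptors in OpenCV (an assumption; the text above does not name an index type), and match_and_filter is a hypothetical helper implementing the twice-minimum-distance rule.

```python
import numpy as np
import cv2

FLANN_INDEX_LSH = 6  # FLANN's LSH index, suited to binary (Hamming) descriptors

flann = cv2.FlannBasedMatcher(
    dict(algorithm=FLANN_INDEX_LSH, table_number=6, key_size=12, multi_probe_level=1),
    dict(checks=50))

def match_and_filter(desc_template: np.ndarray, desc_transformed: np.ndarray) -> list:
    """S41: FLANN matching; S42: drop pairs beyond twice the minimum distance."""
    matches = flann.match(desc_template, desc_transformed)
    if not matches:
        return []
    min_dist = min(m.distance for m in matches)
    threshold = max(2.0 * min_dist, 1.0)  # floor so a zero minimum keeps exact matches
    return [m for m in matches if m.distance <= threshold]
```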
The beneficial effects are as follows: the algorithm used in the invention, SURF (Speeded Up Robust Features), is a robust image recognition and description algorithm. It is an efficient variant of SIFT for extracting scale-invariant features; its steps are roughly the same as SIFT's, but it uses different methods at each step and achieves comparable feature stability more efficiently. Its advantages are:
1. SIFT is time-consuming when constructing the DoG pyramid and locating DoG local space extrema. SURF improves on this by using the Hessian matrix to transform the image: extremum detection only requires computing the Hessian matrix determinant; as a further optimization, the determinant is approximated with a simple equation, and the Gaussian blur is approximated with box blur filtering.
2. SURF does not use downsampling; it builds the scale pyramid by keeping the image size unchanged and changing the size of the box filter.
3. For computing the main direction of a keypoint and the orientation of the pixels around it, SURF does not use histogram statistics but the Haar wavelet transform. The keypoint descriptor (KPD) of SIFT has 128 dimensions, which makes descriptor computation time-consuming; SURF uses the directions obtained from the Haar wavelet transform, reducing the descriptor to 64 dimensions, half of SIFT's, and improving the matching speed.
If the SIFT algorithm uses DoG to simplify LoG and speed up the search for feature points, then the SURF algorithm is a simplification and approximation of the DoH. Although SIFT has been considered the most effective and most commonly used feature point extraction algorithm, it is still difficult to achieve real-time performance on existing computers without hardware acceleration and a dedicated image processor. For occasions that require real-time operation, such as a real-time target tracking system based on feature point matching, 8-24 frames are processed per second, and feature point detection, feature vector generation, feature vector matching, and target locking must all be completed within milliseconds, a requirement the SIFT algorithm can hardly meet. SURF applies SIFT's idea of simplifying approximation to the Gaussian second-order differential template in the DoH, so that filtering the image with the template requires only a few simple additions and subtractions, and these operations are independent of the filter scale.
Drawings
FIG. 1 is a schematic flow chart of the method of the present invention.
Fig. 2 is a detailed schematic of the improved algorithm of the present invention.
Fig. 3 is a schematic diagram of the principal direction of SURF feature points.
Fig. 4 is a schematic view of SURF scale space.
Detailed Description
The present invention is further illustrated by the accompanying drawings and the following detailed description, which are to be understood as merely illustrative of the invention and not limiting of its scope; after reading the invention, modifications of various equivalent forms by those skilled in the art will fall within the scope defined by the appended claims.
The invention provides a flood disaster assessment method based on an improved remote sensing image feature matching algorithm, which is shown in fig. 1 and comprises the following steps:
S1: images of the same area at different stages of the disaster are collected by an unmanned aerial vehicle, images shot before the flood occurs are designated as template images, and images from the middle and later stages of the disaster as transformed images;
S2: SURF feature point detection is carried out on the two collected remote sensing images;
the specific steps of detection are as follows:
s21: constructing a Hessian matrix, and calculating a Hessian matrix for each pixel point in the disaster image;
S211: for flood images shot by an unmanned aerial vehicle, the detected feature points must be scale-invariant, rotation-invariant, and viewpoint-invariant, so Gaussian filtering is performed before the Hessian matrix is constructed, using the convolution of a Gaussian kernel G(t) with the image at point x:

$$L(x,t) = G(t) * I(x,t)$$

where $L(x,t)$ is the representation of the image at different resolutions, $*$ denotes the convolution operation, and $t$ is the Gaussian variance.

S212: because the discrete pixels arise from template convolution, to speed up emergency assessment in the flood disaster environment, the two steps of Gaussian smoothing and second-derivative computation in the Hessian are merged into one, approximated with a box filter, and evaluated with an integral image;

S213: whether a point is an extreme point is judged from the sign of the Hessian matrix discriminant:

$$\det(H_{\mathrm{approx}}) = D_{xx} D_{yy} - (\omega D_{xy})^2$$

where $D_{xx}$, $D_{xy}$, and $D_{yy}$ are the approximate convolution values obtained with the box filter: $D_{xx}$ is the second partial derivative of the current point in the horizontal direction, $D_{yy}$ in the vertical direction, and $D_{xy}$ the mixed horizontal-vertical second partial derivative; a weighting coefficient $\omega = 0.9$ is introduced to balance the error caused by the box-filter approximation.
S22: constructing an image pyramid according to the generated multiple Hessian determinant images;
the construction of the image pyramid is divided into two parts:
s221: carrying out Gaussian blur on the images with different scales;
s222: the image is downsampled (spaced-point sampling).
Increasing the size of the filter template while keeping the image fixed, and computing at sampled points, is equivalent to downsampling the image and changing the template scale; this saves the downsampling step and speeds up emergency assessment.
S23: feature point localization: the strengths of candidate feature points are compared, and maximum or minimum points are selected as preliminarily determined keypoints;
s24: the principal direction assignment of the feature points is determined by calculation of Haar wavelet responses.
The Haar wavelet response is computed as follows. Unlike the SIFT algorithm, the SURF algorithm does not build a gradient histogram; to ensure rotational invariance it instead accumulates Haar wavelet features in the neighborhood of the feature point. Within a circular neighborhood of radius 6s centered on the feature point (s is the scale value of the feature point), the sums of the Haar wavelet responses in the x (horizontal) and y (vertical) directions of all points inside a 60-degree sector are computed (the Haar wavelet side length is 4s). The response values are weighted with Gaussian coefficients so that responses near the feature point contribute more and responses far from it contribute less; the responses within the 60-degree window are then added to form a new vector. The sector is swept over the whole circular area, and the direction of the longest vector is selected as the main direction of the feature point.
Since the Haar wavelet responses reflect grey-level changes in the image, this main direction describes the regions where grey levels change most sharply, i.e. the direction of the areas where the flood disaster occurs.
A square window is taken around the feature point, oriented along its main direction, and divided into 4 x 4 = 16 subregions. In each subregion the Haar wavelet features in the horizontal and vertical directions are counted: the sum of the horizontal values, the sum of the horizontal absolute values, the sum of the vertical values, and the sum of the vertical absolute values, forming a 16 x 4 = 64-dimensional feature vector.
S3: the detected feature points are described with BRIEF feature descriptors, and the robustness of the descriptors to scale and illumination is assessed after noise reflecting natural environmental influences is added to the disaster images;
Once the feature points are detected, effective feature descriptors are selected to represent them. For each feature point, n point pairs selected at random within the 4s x 4s neighborhood around it are used to generate its corresponding descriptor, where s is the scale factor of the feature point, so the descriptor is a binary string of length n. The position, scale, direction, and binary comparison information of the feature point are recorded as its descriptor, and the 2n sampling points $(x_i, y_i)$, $i = 1, 2, \dots, 2n$, form the $2 \times 2n$ matrix $A$:

$$A = \begin{pmatrix} x_1 & x_2 & \cdots & x_{2n} \\ y_1 & y_2 & \cdots & y_{2n} \end{pmatrix}$$

Using the main direction $\theta$, the neighborhood rotation matrix $R_\theta$ is computed:

$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$$

and $A$ is rotated to obtain the steered version $A_\theta = R_\theta A$.
Because the flood disaster images acquired by the unmanned aerial vehicle must monitor changes in the same area over several time periods, the images before and after a change are each affected by noise, perceived visually as illumination and scale changes. These linear changes can be described as follows:

a scale change in grey level:

$$C = a \cdot u$$

a grey-level increment:

$$C = u + o_1$$

a simultaneous scale change and increment:

$$C = a \cdot u + o_1$$

where $u$ denotes the RGB values of the original image, $C$ the RGB values of the changed image, $o_1$ the increment, and $a$ the scale factor.
S4: feature points are matched between the template image and the transformed image with the improved SURF algorithm;
the operation steps are as follows:
s41: matching the binary descriptors by utilizing FLANN;
S42: false matches are screened with the Hamming distance, taking twice the minimum Hamming distance among the matched pairs as the criterion; a pair whose Hamming distance exceeds this value is regarded as a false match and filtered out, while a pair within it is regarded as a correct match.
The Hamming distance formula is:

$$D(x, y) = \sum_{i=0}^{n-1} x_i \oplus y_i$$

where $i = 0, 1, \dots, n-1$, $x$ and $y$ are both n-bit codes, and $\oplus$ denotes exclusive OR.
S5: the numbers of coincident feature points are compared, and all feature points detected in the template image and in the transformed image are fully connected, respectively; the largest area enclosed by the connections is the extent of the flood-inundated area. Since the template image represents the scene before the flood occurs, the inundated extents in the subsequent transformed images are recorded in chronological order and their areas computed; the change in this area value shows whether the flood water level is rising. If the area grows, the water level is rising and must be reported upward in time, and at the same time a comprehensive disaster risk zone is delineated and an early warning issued. One possible end-to-end assembly of these steps is sketched below.
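Putting S1-S5 together, one possible end-to-end sketch is shown next. It assumes an opencv-contrib build that exposes SURF and BRIEF under cv2.xfeatures2d (both can be absent from stock OpenCV builds), and it reuses the hypothetical match_and_filter and flooded_area helpers sketched earlier.

```python
import numpy as np
import cv2

def assess_flood(template_path: str, transformed_path: str) -> float:
    # S1: load the pre-flood (template) and mid/late-disaster (transformed) images
    img0 = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
    img1 = cv2.imread(transformed_path, cv2.IMREAD_GRAYSCALE)

    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)   # S2: detection
    brief = cv2.xfeatures2d.BriefDescriptorExtractor_create()  # S3: description
    kp0, kp1 = surf.detect(img0, None), surf.detect(img1, None)
    kp0, d0 = brief.compute(img0, kp0)
    kp1, d1 = brief.compute(img1, kp1)

    good = match_and_filter(d0, d1)                            # S4: matching
    pts = np.float32([kp1[m.trainIdx].pt for m in good])
    return flooded_area(pts)                                   # S5: extent estimate
```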

Claims (8)

1. The flood disaster assessment method based on the improved remote sensing image feature matching algorithm is characterized by comprising the following steps:
S1: collecting images of the same area at different stages of the disaster with an unmanned aerial vehicle, designating images shot before the flood occurs as template images, and images from the middle and later stages of the disaster as transformed images;
S2: carrying out SURF feature point detection on the two collected remote sensing images;
S3: describing the detected feature points with BRIEF feature descriptors, and assessing the robustness of the descriptors to scale and illumination after noise reflecting natural environmental influences is added to the disaster images;
S4: matching feature points between the template image and the transformed image with the improved SURF algorithm;
S5: comparing the coincident feature points, automatically extracting the extent of the flooded area from the remote sensing image, accurately identifying changes in the flooded area, and quantifying the severity of the flood disaster from the matching result.
2. The flood disaster assessment method based on the improved remote sensing image feature matching algorithm according to claim 1, wherein the specific steps of SURF feature point detection in step S2 are as follows:
s21: constructing a Hessian matrix, and calculating a Hessian matrix for each pixel point in the disaster image;
s22: constructing an image pyramid according to the generated multiple Hessian determinant images;
S23: feature point localization: comparing the strengths of candidate feature points, and selecting maximum or minimum points as preliminarily determined keypoints;
s24: the principal direction assignment of the feature points is determined by calculation of Haar wavelet responses.
3. The flood disaster assessment method based on the improved remote sensing image feature matching algorithm according to claim 2, wherein the specific operations in step S21 are as follows:
S211: for flood images shot by an unmanned aerial vehicle, the detected feature points must be scale-invariant, rotation-invariant, and viewpoint-invariant, so Gaussian filtering is performed before the Hessian matrix is constructed, using the convolution of a Gaussian kernel G(t) with the image function I(x) at point x:

$$L(x,t) = G(t) * I(x,t)$$

where $L(x,t)$ is the representation of the image at different resolutions, $*$ denotes the convolution operation, and $t$ is the Gaussian variance;

S212: because the discrete pixels arise from template convolution, to speed up emergency assessment in the flood disaster environment, the two steps of Gaussian smoothing and second-derivative computation in the Hessian are merged into one, approximated with a box filter, and evaluated with an integral image;

S213: whether a point is an extreme point is judged from the sign of the Hessian matrix discriminant:

$$\det(H_{\mathrm{approx}}) = D_{xx} D_{yy} - (\omega D_{xy})^2$$

where $D_{xx}$, $D_{xy}$, and $D_{yy}$ are the approximate convolution values obtained with the box filter: $D_{xx}$ is the second partial derivative of the current point in the horizontal direction, $D_{yy}$ in the vertical direction, and $D_{xy}$ the mixed horizontal-vertical second partial derivative; a weighting coefficient $\omega = 0.9$ is introduced to balance the error caused by the box-filter approximation.
4. The flood disaster assessment method based on the improved remote sensing image feature matching algorithm according to claim 2, wherein the construction of the image pyramid in step S22 is divided into two parts:
s221: carrying out Gaussian blur on the images with different scales;
s222: the image is downsampled (spaced-point sampling).
Increasing the size of the filter template while keeping the image fixed, and computing at sampled points, is equivalent to downsampling the image and changing the template scale; this saves the downsampling step and speeds up emergency assessment.
5. The flood disaster assessment method based on the improved remote sensing image feature matching algorithm according to claim 2, wherein, in the calculation of the Haar wavelet response in step S24, since the Haar wavelet responses reflect grey-level changes in the image, the main direction describes the regions where grey levels change most sharply, i.e. the direction of the areas where the flood disaster occurs.
6. The flood disaster assessment method based on the improved remote sensing image feature matching algorithm according to claim 1, wherein the specific operations for describing the feature points in step S3 are as follows:
Once the feature points are detected, effective feature descriptors are selected to represent them. For each feature point, n point pairs selected at random within the 4s x 4s neighborhood around it are used to generate the corresponding descriptor, where s is the scale factor of the feature point, so the descriptor is a binary string of length n. The position, scale, direction, and binary comparison information of the feature point are recorded as its descriptor, and the 2n sampling points $(x_i, y_i)$, $i = 1, 2, \dots, 2n$, form the $2 \times 2n$ matrix $A$:

$$A = \begin{pmatrix} x_1 & x_2 & \cdots & x_{2n} \\ y_1 & y_2 & \cdots & y_{2n} \end{pmatrix}$$

Using the main direction $\theta$, the neighborhood rotation matrix $R_\theta$ is computed:

$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$$

and $A$ is rotated to obtain the steered version $A_\theta = R_\theta A$.
Because the flood disaster images acquired by the unmanned aerial vehicle must monitor changes in the same area over several time periods, the images before and after a change are each affected by noise, perceived visually as illumination and scale changes. These linear changes can be described as follows:

a scale change in grey level:

$$C = a \cdot u$$

a grey-level increment:

$$C = u + o_1$$

a simultaneous scale change and increment:

$$C = a \cdot u + o_1$$

where $u$ denotes the RGB values of the original image, $C$ the RGB values of the changed image, $o_1$ the increment, and $a$ the scale factor.
7. The flood disaster assessment method based on the improved remote sensing image feature matching algorithm as claimed in claim 1, wherein the operation steps of step S4 are as follows:
S41: matching the binary descriptors using FLANN;
S42: screening false matches with the Hamming distance, taking twice the minimum Hamming distance among the matched pairs as the criterion; a pair whose Hamming distance exceeds this value is regarded as a false match and filtered out, while a pair within it is regarded as a correct match. The Hamming distance formula is:

$$D(x, y) = \sum_{i=0}^{n-1} x_i \oplus y_i$$

where $i = 0, 1, \dots, n-1$, $x$ and $y$ are both n-bit codes, and $\oplus$ denotes exclusive OR.
8. The flood disaster assessment method based on the improved remote sensing image feature matching algorithm according to claim 7, wherein the operation of step S5 is as follows:
All the feature points detected in the template image and in the transformed image are fully connected, respectively; the largest area enclosed by the connections is the extent of the flood-inundated area. Since the template image represents the scene before the flood occurs, the inundated extents in the subsequent transformed images are recorded in chronological order and their areas computed; the change in this area value shows the change in the extent of the inundated area. If the extent grows, it must be reported upward in time, and at the same time a comprehensive disaster risk zone is delineated and an early warning issued.
Application CN202310069813.5A, filed 2023-02-03 (priority 2023-02-03): Flood disaster assessment method based on improved remote sensing image feature matching algorithm. Publication: CN117372893A. Status: Pending.

Priority Applications (1)

CN202310069813.5A · priority and filing date 2023-02-03 · Flood disaster assessment method based on improved remote sensing image feature matching algorithm

Publications (1)

CN117372893A · published 2024-01-09

Family

ID=89398971

Family Applications (1)

CN202310069813.5A · filed 2023-02-03 · Flood disaster assessment method based on improved remote sensing image feature matching algorithm

Country Status (1)

CN · CN117372893A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party

CN104700399A * · priority 2015-01-08 · published 2015-06-10 · Method for demarcating large-deformation landslide displacement field based on high-resolution remote sensing image
CN106529591A * · priority 2016-11-07 · published 2017-03-22 · Improved MSER image matching algorithm
CN109934857A * · priority 2019-03-04 · published 2019-06-25 · Loop closure detection method based on convolutional neural networks and ORB features
CN111080529A * · priority 2019-12-23 · published 2020-04-28 · Unmanned aerial vehicle aerial image stitching method with enhanced robustness
CN115331000A * · priority 2022-08-01 · published 2022-11-11 · ORB algorithm-based pantograph-catenary running state detection method


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination