CN117854256B - Geological disaster monitoring method based on unmanned aerial vehicle video stream analysis - Google Patents


Info

Publication number
CN117854256B
CN117854256B CN202410250306.6A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
image
monitoring
steps
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410250306.6A
Other languages
Chinese (zh)
Other versions
CN117854256A (en)
Inventor
黄健
肖金武
王东坡
闫帅星
肖先煊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu University of Technology
Original Assignee
Chengdu University of Technology
Filing date
Publication date
Application filed by Chengdu University of Technology
Priority to CN202410250306.6A
Publication of CN117854256A
Application granted
Publication of CN117854256B
Legal status: Active
Anticipated expiration


Abstract

The application discloses a geological disaster monitoring method based on unmanned aerial vehicle video stream analysis, comprising the following steps. Step 1: position the unmanned aerial vehicle in the airspace corresponding to the monitoring area, aim the camera of the unmanned aerial vehicle at the area to be monitored, and collect video data of that area in real time. Step 2: parse the video data into consecutive image frames in time order, and convert each frame into a grayscale image through grayscale processing. Step 3: based on an optical flow calculation method, successively compute the motion speed and motion direction of the pixel points between each pair of adjacent frames. According to the technical scheme provided by the application, the unmanned aerial vehicle is placed in the target airspace corresponding to the monitoring area, so the target area can be monitored from a suitable angle, avoiding the poor observation angle that results when the monitoring image is taken from directly above the mountain.

Description

Geological disaster monitoring method based on unmanned aerial vehicle video stream analysis
Technical Field
The application relates to the field of geological disaster monitoring, in particular to a geological disaster monitoring method based on unmanned aerial vehicle video stream analysis.
Background
Landslide is a very serious geological disaster. The existing landslide monitoring approach relies on remote sensing satellites to observe the mountain body. However, this approach is difficult and inaccurate, because the picture acquired by a remote sensing satellite looks down from directly above the mountain body: no early warning can be issued while a landslide is starting, and the hazard is discovered only after the mountain body has slid and its shape has changed substantially. Landslide monitoring is therefore poorly timed and cannot provide rapid, accurate early warning of landslides.
Disclosure of Invention
The summary of the application is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. The summary of the application is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
As a first aspect of the present application, in order to solve the technical problems mentioned in the background section above, some embodiments of the present application provide a geological disaster monitoring method based on unmanned aerial vehicle video stream analysis, including the steps of:
step 1: arranging the unmanned aerial vehicle in the airspace corresponding to the monitoring area, aligning the camera of the unmanned aerial vehicle with the area to be monitored, and collecting video data of that area in real time;
Step 2: parsing the video data into consecutive image frames in time order, and converting each frame into a grayscale image through grayscale processing;
step 3: sequentially calculating the motion speed and the motion direction of pixel points between two adjacent frames based on an optical flow calculation method;
step 4: obtaining a motion amplitude distribution diagram of a target area according to the motion speed and the motion direction of the pixel points between two adjacent frames;
step 5: and monitoring whether landslide occurs in the target area according to the motion amplitude distribution diagram of the target area, and giving out early warning when the landslide occurs.
According to the technical scheme provided by the application, the unmanned aerial vehicle is placed in the target airspace corresponding to the monitoring area, so the target area can be monitored from a suitable angle, avoiding the poor observation angle that results when the monitoring image is taken from directly above the mountain. Meanwhile, while the target area is monitored, a motion amplitude distribution map is built from each pair of adjacent frames; when a landslide trend appears, it can be discovered in time through this distribution map, so that an early warning is issued rapidly and accurately.
An unmanned aerial vehicle hovering in the air is unstable and easily disturbed by crosswind; once the vehicle shakes, the area that needs monitoring may leave the field of view of the camera. For this problem, the application provides the following technical scheme:
further, step 1 includes the following steps:
Step 11: determining a mountain range to be monitored according to requirements, and taking the mountain range as a target area;
Step 12: the unmanned aerial vehicle is flown to a contralateral airspace of a target area, a camera of the unmanned aerial vehicle is aligned to the target area, and the distance between the unmanned aerial vehicle and the target area and the focal length of the camera are adjusted, so that the shooting range of the camera is larger than the mountain range to be monitored;
Step 13: video data of the area to be detected is collected in real time.
According to the technical scheme provided by the application, the shooting range is larger than the mountain range to be monitored, so that when the unmanned aerial vehicle shakes because of crosswind, the area to be monitored does not leave the field of view of the camera, ensuring continuity of monitoring.
Landslide monitoring requires long-duration observation. If several unmanned aerial vehicles keep watch in rotation, different vehicles cannot monitor from exactly the same position, and the processing of the monitoring video cannot proceed continuously, so each changeover requires a lengthy initialization. For this problem, the application provides the following technical scheme:
further, in step 1, the unmanned aerial vehicle is connected with a control center on the ground through a cable, so as to complete information transmission and power transmission.
According to the technical scheme provided by the application, the unmanned aerial vehicle is connected to the control center by a cable, which carries both information and power, so the vehicle can execute the monitoring task for a long time. Moreover, the video information transmitted to the control center is not easily interfered with and does not suffer frame loss.
Landslide monitoring mainly checks whether targets in the target area are displaced. If the unmanned aerial vehicle were stationary, a landslide could be considered to have occurred as soon as part of the mountain in the monitoring area is displaced. In practice, however, the unmanned aerial vehicle may move, and after it moves the correspondence between two adjacent frames is misaligned, which degrades the accuracy of the optical flow calculation and may produce false alarms. For this problem, the application provides the following technical scheme:
further, step 2 includes the following steps:
step 21: a plurality of mark points are preset in a picture range of a region to be monitored;
Step 22: parsing the video data into consecutive image frames, cropping each image frame, and removing the image information of the non-monitored area;
Step 23: aligning the continuous image frames according to the criterion of overlapping at least one third of marking points;
Step 24: and carrying out gray scale processing on the continuous image frames to obtain continuous gray scale images.
In the technical scheme provided by the application, when the consecutive image frames are processed, the image information of the non-monitored area is removed, so that the image range to be processed is as small as possible, the amount of computation is reduced, and the monitoring can run in real time. Meanwhile, several marking points are arranged in the monitoring area; when the unmanned aerial vehicle is moved by crosswind, the offset pictures can be realigned, so the accuracy of the optical flow calculation is not affected. Further, since grayscale processing is performed after the consecutive image frames are aligned, the marking-point information cannot be blurred by the grayscale processing, which would otherwise reduce alignment accuracy.
When optical flow calculation is used to monitor whether a landslide occurs in the target area, the information processing rate must be raised as much as possible, so that the image processing rate does not fall below the image collection rate and real-time monitoring remains possible. However, an image that is too sharp increases the amount of computation and places excessive demands on processing, while an image that is too blurred reduces monitoring accuracy. For this problem, the application provides the following technical scheme:
Step 2 further comprises the steps of:
Step 25: the resolution of successive gray scale images is reduced to a predetermined size.
According to the technical scheme provided by the application, the resolution of the gray level image is reduced, so that the operation amount can be reduced during continuous monitoring, the real-time monitoring can be ensured, and early warning can be timely sent out when dangerous situations occur.
Landslide monitoring mainly compares the offset of an object between two adjacent frames. Given a preset minimum monitoring precision, the detail features of the object do not need high precision, but the definition of the whole image must be reduced uniformly to ensure that the entire monitoring area is covered comprehensively. For this problem, the application provides the following technical scheme:
step 25 comprises the steps of:
Step 251: fourier transforming each image f 1 (x, y) of the successive gray scale images;
;
Where F (n, m) is the transformed spectrum, n and m are the abscissa and ordinate in frequency, x and y are coordinates in the spatial domain, and F 1 (x, y) represents one of the successive gray images; m and N respectively represent the width and the height of the image in the spatial domain; Is a kernel function of fourier transform, j is an imaginary unit;
Step 252: removing high-frequency components in a frequency spectrum F (n, m) by adopting a low-pass filter;
G(n,m)=H(n,m)×F(n,m);
Wherein H(n, m) is the transfer function: H(n, m) = 1 if D(n, m) ≤ D0, and H(n, m) = 0 if D(n, m) > D0;
Wherein D0 is the cut-off frequency, set according to the required resolution reduction; D(n, m) is the distance from the point (n, m) to the center of the spectrum; G(n, m) represents the spectrum after low-pass filtering;
Step 253: perform an inverse Fourier transform on the filtered spectrum to obtain an image g(x, y) with reduced resolution;
g(x, y) = (1/(MN)) Σ_{n=0}^{M−1} Σ_{m=0}^{N−1} G(n, m) · e^(j2π(nx/M + my/N));
Where g(x, y) represents the reduced-resolution image obtained after the inverse Fourier transform, with x and y the coordinates in the spatial domain; 1/(MN) is the normalization coefficient of the inverse Fourier transform; G(n, m) is the low-pass-filtered spectrum to which the inverse transform is applied.
In the technical scheme provided by the application, the Fourier transform is used to reduce image precision: the resolution is lowered over the whole image at once, avoiding the incomplete global monitoring that would result from reducing definition only locally or from an uneven sampling scheme.
Further, step 3 includes the following steps:
Step 31: collecting continuous gray images in real time;
Step 32: comparing the newest received grayscale image with the previously received one, and calculating the optical flow of each pixel point to obtain an optical flow distribution picture.
Further, in step 32, the optical flow distribution is calculated as follows:
f (x, y) = (u (x, y), v (x, y)); x is the abscissa of the gray image, y is the ordinate of the gray image, f (x, y) is the optical flow vector at the position (x, y), u (x, y) and v (x, y) are the displacement amounts of the position (x, y) in the horizontal and vertical directions, respectively.
The scheme can smoothly and accurately complete the estimation work of the optical flow field.
Further, step 4 includes the steps of:
Step 41: collecting optical flow distribution pictures according to the sequence;
step 42: for each optical flow distribution picture, perform local motion change analysis: compute the change at each pixel point through double partial derivatives and describe it with a heat value, obtaining a motion amplitude distribution diagram of the motion amplitude at each position of the monitoring area; the calculation is as follows:
magnitude = √( (d²u/dx²)² + (d²v/dy²)² );
Where magnitude represents the motion amplitude, a quantity synthesized from the displacement accelerations in the horizontal and vertical directions; u represents the displacement of a point in the image in the horizontal direction and v its displacement in the vertical direction; d is the differential operator, denoting a derivative or rate of change; x is the horizontal image coordinate, giving the position of a pixel point in the horizontal direction, and y is the vertical image coordinate, giving its position in the vertical direction.
Step 5 is specifically as follows: a heat threshold is preset and each generated motion amplitude distribution diagram is monitored in real time; when a heat value in the diagram exceeds the threshold, an early warning signal is issued.
The application has the beneficial effects that: according to the technical scheme provided by the application, the unmanned aerial vehicle is arranged in the target airspace corresponding to the monitoring area, so that the target area can be monitored by adopting a proper angle, and the problem that the observation angle is poor due to the fact that the monitoring image is right above the mountain is avoided.
Drawings
Fig. 1 is a flow chart of a geological disaster monitoring method based on unmanned aerial vehicle video stream analysis.
Fig. 2 is a schematic diagram of a marker point.
Fig. 3 is a graph of the amplitude of motion profile just before collapse begins.
Fig. 4 is a graph showing the amplitude distribution of motion at the beginning of collapse.
Fig. 5 is a graph of motion amplitude profile at collapse.
Detailed Description
Embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the application have been illustrated in the accompanying drawings, it is to be understood that the application may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the application are for illustration purposes only and are not intended to limit the scope of the present application.
It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings. Embodiments of the application and features of the embodiments may be combined with each other without conflict.
The application will be described in detail below with reference to the drawings in connection with embodiments.
Referring to fig. 1, a geological disaster monitoring method based on unmanned aerial vehicle video stream analysis includes the steps of:
Step 1: arranging the unmanned aerial vehicle in a corresponding airspace of a monitoring area, aligning a camera of the unmanned aerial vehicle to the area to be monitored, and collecting video data of the area to be detected in real time.
Step 1 comprises the following steps:
step 11: and determining a mountain range to be monitored according to the requirements, and taking the mountain range as a target area.
The mountain range here is the area that needs monitoring, for example a cliff on one side of a road, where a timely landslide warning lets the road be cleared quickly, or the rock walls on the two sides of a river, where a warning helps avoid barrier lakes formed by landslides. In practice, the side wall of the mountain that needs monitoring is generally defined as the target area.
Step 12: and (3) flying the unmanned aerial vehicle to a contralateral airspace of the target area, aligning a camera of the unmanned aerial vehicle to the target area, and adjusting the distance between the unmanned aerial vehicle and the target area and the focal length of the camera to ensure that the shooting range of the camera is larger than the mountain range to be monitored.
In the field, trees often obstruct the view, so the unmanned aerial vehicle must fly to a certain height, for example above the trees or level with the treetops, and then be aimed at the mountain to be monitored. In practice it is best for the camera to face the mountain to be monitored directly, which is why the unmanned aerial vehicle is positioned in the airspace on the opposite side of the target area.
Making the shooting range of the camera larger than the mountain range to be monitored actually amounts to controlling the camera's focal length and its distance from the mountain. Specifically, the closer the camera is to the mountain, the clearer the picture, but the smaller the field of view that can be obtained, that is, the smaller the shooting range; the farther the camera is from the mountain, the more blurred the picture, but the larger the shooting range. Focal length can be understood by analogy with the zoom of a mobile phone camera: the larger the magnification, the more blurred the picture, but the smaller the shooting range.
In the scheme, the target area can be located in the whole image by controlling the focal length and the distance.
Step 13: video data of the area to be detected is collected in real time.
The camera shoots the target area continuously, so continuous video data can be collected. In practice the video may be set to 24 frames per second or 30 frames per second; the exact frame rate depends on the computing power of the associated hardware, but at a minimum 24 frames per second must be ensured.
Further, in step 1, the unmanned aerial vehicle is connected with a control center on the ground through a cable, so as to complete information transmission and power transmission.
An unmanned aerial vehicle normally carries its own battery, performs flight tasks in the air, and communicates with the ground over wireless signals in a specific band; however, its working time is then limited by battery capacity. In this scheme the unmanned aerial vehicle is therefore connected to the ground control center by a cable. This is mature technology, the tethered unmanned aerial vehicle, so the design of the cable between the vehicle and the ground is not described further here.
Step 2: the video data is resolved into continuous image frames according to time sequence, and each frame of image is subjected to gray processing and converted into gray images.
Step 2 comprises the following steps:
step 21: a plurality of mark points are preset in the picture range of the area to be monitored.
There are various ways to arrange the marking points; their purpose is to allow the pictures of the monitoring area to be registered with one another. Two schemes are given here.
Scheme 1: set physical marking points. There is a large distance between the camera of the unmanned aerial vehicle and the mountain to be monitored, and within that distance there are many non-mountain structures, as shown in fig. 2, such as trees in front of the mountain or special man-made structures. Distinctive colors can be painted on these trees or structures to serve as marking points; when the camera shoots the mountain, the marking points are necessarily captured as well.
Scheme 2: set virtual marking points. Distinctive patterns are searched for in the picture of the monitored area and then recognized by artificial intelligence, for example, some of the branches in fig. 3. The image recognition technique is not described further here.
Scheme 1 is essentially the same as scheme 2, except that in scheme 1 the marking points are deliberately made easier to identify.
Step 22: parse the video data into consecutive image frames, crop each image frame, and remove the image information of the non-monitored area.
The range of the monitoring area is set in advance, and its contour features can be identified directly by image recognition. After each image frame is collected, the contour of the monitoring area is extracted from it and the image information outside the contour is removed, forming an image whose edge is the contour of the monitored area.
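Step 22 can be sketched in a few lines of NumPy (a minimal illustration, assuming the monitored-area contour has already been converted to a boolean mask; the function and variable names are ours, not the patent's):

```python
import numpy as np

def remove_non_monitored(frame: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Zero out every pixel outside the monitored-area mask (step 22).

    frame: H x W grayscale image.
    mask:  H x W boolean array, True inside the monitored contour.
    """
    out = frame.copy()
    out[~mask] = 0  # discard image information of the non-monitored area
    return out

# Example: a 6x6 frame whose monitored area is the central 4x4 block.
frame = np.arange(36, dtype=np.uint8).reshape(6, 6)
mask = np.zeros((6, 6), dtype=bool)
mask[1:5, 1:5] = True
cropped = remove_non_monitored(frame, mask)
```

With a color frame of shape H×W×C the same indexing applies, since the boolean mask broadcasts over the channel axis.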
Step 23: align the consecutive image frames using the criterion that at least one third of the marking points coincide.
Each image may shift somewhat because of deflection of the drone. Making the marking points coincide in effect realigns the offset images. Since the marking points are not placed on the mountain where a landslide may occur, they do not move even if a landslide happens.
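The realignment in step 23 can be illustrated with a pure-translation model (a sketch under simplifying assumptions: the drone's shake is modeled as a rigid shift, the marker correspondences are already known, and the name `align_by_markers` is ours; the patent itself only requires that enough marking points be brought back into coincidence):

```python
import numpy as np

def align_by_markers(img, markers_ref, markers_cur):
    """Realign a shifted frame using marker points (step 23).

    Estimates a pure-translation offset as the mean displacement of the
    matched marker points, then shifts the frame back.
    markers_ref / markers_cur: (K, 2) arrays of (row, col) positions of
    the same markers in the reference frame and the current frame.
    """
    offset = np.mean(markers_cur - markers_ref, axis=0)
    dy, dx = int(round(offset[0])), int(round(offset[1]))
    return np.roll(img, shift=(-dy, -dx), axis=(0, 1))

# A frame shifted down-right by (2, 3) is rolled back into register.
ref = np.zeros((8, 8)); ref[2, 2] = 1.0
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))
m_ref = np.array([[2.0, 2.0]]); m_cur = np.array([[4.0, 5.0]])
restored = align_by_markers(cur, m_ref, m_cur)
```

For rotations or perspective changes, the same marker correspondences could instead be fed to a homography fit; the translation case is the minimal version.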
Step 24: and carrying out gray scale processing on the continuous image frames to obtain continuous gray scale images.
Step 2 further comprises the steps of:
Step 25: the resolution of successive gray scale images is reduced to a predetermined size.
Step 25 comprises the steps of:
Step 251: Fourier transform each image f1(x, y) of the successive grayscale images;
F(n, m) = Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} f1(x, y) · e^(−j2π(nx/M + my/N));
Where F(n, m) is the transformed spectrum, n and m are the abscissa and ordinate in the frequency domain, x and y are coordinates in the spatial domain, and f1(x, y) represents one of the successive grayscale images; M and N respectively represent the width and the height of the image in the spatial domain; e^(−j2π(nx/M + my/N)) is the kernel function of the Fourier transform, and j is the imaginary unit;
Step 252: removing high-frequency components in a frequency spectrum F (n, m) by adopting a low-pass filter;
G(n,m)=H(n,m)×F(n,m);
Wherein H(n, m) is the transfer function: H(n, m) = 1 if D(n, m) ≤ D0, and H(n, m) = 0 if D(n, m) > D0;
Wherein D0 is the cut-off frequency, set according to the required resolution reduction; D(n, m) is the distance from the point (n, m) to the center of the spectrum, and G(n, m) represents the spectrum after low-pass filtering;
Step 253: perform an inverse Fourier transform on the filtered spectrum to obtain an image g(x, y) with reduced resolution;
g(x, y) = (1/(MN)) Σ_{n=0}^{M−1} Σ_{m=0}^{N−1} G(n, m) · e^(j2π(nx/M + my/N));
Where g(x, y) represents the reduced-resolution image obtained after the inverse Fourier transform, with x and y the coordinates in the spatial domain; 1/(MN) is the normalization coefficient of the inverse Fourier transform; G(n, m) is the low-pass-filtered spectrum to which the inverse transform is applied.
In practical use, a resolution of around 400 for the images concerned is required.
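Steps 251–253 can be reproduced with NumPy's FFT routines (a sketch: the ideal low-pass transfer function H and the centred distance D(n, m) follow the definitions above, while the image size and cutoff below are arbitrary test values):

```python
import numpy as np

def fft_lowpass(img: np.ndarray, d0: float) -> np.ndarray:
    """Steps 251-253: forward FFT, ideal low-pass with cutoff d0,
    inverse FFT. Because whole frequency bands are removed, the
    blurring is uniform over the entire frame."""
    F = np.fft.fftshift(np.fft.fft2(img))          # step 251: centred spectrum F(n, m)
    rows, cols = img.shape
    r = np.arange(rows)[:, None] - rows // 2       # D(n, m): distance from
    c = np.arange(cols)[None, :] - cols // 2       # the centre of the spectrum
    D = np.sqrt(r ** 2 + c ** 2)
    H = (D <= d0).astype(float)                    # step 252: H = 1 inside cutoff, 0 outside
    G = H * F                                      # G(n, m) = H(n, m) x F(n, m)
    return np.fft.ifft2(np.fft.ifftshift(G)).real  # step 253: reduced-detail image g(x, y)

rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))
smooth = fft_lowpass(img, d0=8.0)
```

Note that `np.fft.ifft2` already divides by MN, which plays the role of the normalization coefficient of the inverse Fourier transform mentioned above.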
Step 3: and sequentially calculating the motion speed and the motion direction of the pixel points between two adjacent frames based on an optical flow calculation method.
Step3 comprises the following steps:
Step 31: successive gray scale images are collected in real time.
Step 32: comparing the newest received grayscale image with the previously received one, and calculating the optical flow of each pixel point to obtain an optical flow distribution picture.
Further, in step 32, the optical flow distribution is calculated as follows: f (x, y) = (u (x, y), v (x, y)); x is the abscissa of the gray image, y is the ordinate of the gray image, f (x, y) is the optical flow vector at the position (x, y), u (x, y) and v (x, y) are the displacement amounts of the position (x, y) in the horizontal and vertical directions, respectively.
The scheme can smoothly and accurately complete the estimation work of the optical flow field.
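The patent does not name a specific optical flow algorithm, so the sketch below uses the classic Lucas–Kanade least-squares scheme as one plausible instance of step 32 (the function names are ours; per pixel, windowed gradient sums yield the flow vector f(x, y) = (u(x, y), v(x, y)) described above):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def box_sum(a: np.ndarray, win: int) -> np.ndarray:
    """Same-size sliding-window sum (edge-padded)."""
    pad = win // 2
    ap = np.pad(a, pad, mode="edge")
    return sliding_window_view(ap, (win, win)).sum(axis=(-2, -1))

def lucas_kanade_flow(prev, curr, win=7, eps=1e-6):
    """Dense flow between two grayscale frames: for each pixel, solve the
    2x2 least-squares system  [sxx sxy; sxy syy][u; v] = -[sxt; syt]
    built from spatial gradients Ix, Iy and the temporal gradient It."""
    Iy, Ix = np.gradient(prev.astype(float))       # d/drow, d/dcol
    It = curr.astype(float) - prev.astype(float)
    sxx = box_sum(Ix * Ix, win); syy = box_sum(Iy * Iy, win)
    sxy = box_sum(Ix * Iy, win)
    sxt = box_sum(Ix * It, win); syt = box_sum(Iy * It, win)
    det = sxx * syy - sxy ** 2
    det = np.where(det < eps, np.inf, det)         # ill-conditioned window -> zero flow
    u = (-syy * sxt + sxy * syt) / det             # horizontal displacement u(x, y)
    v = (-sxx * syt + sxy * sxt) / det             # vertical displacement v(x, y)
    return u, v

# Synthetic check: a smooth random texture shifted right by one pixel
# should yield u close to 1 and v close to 0 away from the borders.
rng = np.random.default_rng(1)
texture = 10.0 * box_sum(box_sum(rng.standard_normal((80, 80)), 5), 5) / 625.0
shifted = np.roll(texture, 1, axis=1)
u, v = lucas_kanade_flow(texture, shifted)
```

A production system would more likely call an optimized library routine (e.g. a dense Farnebäck or pyramidal Lucas–Kanade implementation); this hand-rolled version only shows the underlying computation.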
Step 4: and obtaining a motion amplitude distribution diagram of the target area according to the motion speed and the motion direction of the pixel points between two adjacent frames.
Step 4 comprises the following steps:
Step 41: collecting optical flow distribution pictures according to the sequence;
step 42: for each optical flow distribution picture, perform local motion change analysis: compute the change at each pixel point through double partial derivatives and describe it with a heat value, obtaining a motion amplitude distribution diagram of the motion amplitude at each position of the monitoring area; the calculation is as follows:
magnitude = √( (d²u/dx²)² + (d²v/dy²)² );
d²u/dx² and d²v/dy² represent the double partial derivatives in the horizontal and vertical directions, respectively. u represents the displacement of a point in the image in the horizontal direction: in the optical flow vector (u, v), u describes how far the same object or pixel point moves horizontally between two adjacent frames of the image sequence. Likewise, v represents the displacement of a point in the image in the vertical direction, the distance the same object or pixel point moves vertically between two adjacent frames. d is the differential operator, denoting a derivative or rate of change; in this formula, d²u/dx² and d²v/dy² are the second partial derivatives of u and v with respect to x and y. d²u/dx² is the rate of change of the horizontal displacement u along the horizontal direction x, i.e. the curvature or acceleration of the horizontal displacement; d²v/dy² is the rate of change of the vertical displacement v along the vertical direction y, i.e. the curvature or acceleration of the vertical displacement. magnitude represents the motion amplitude, a quantity synthesized from the displacement accelerations (second derivatives) in the horizontal and vertical directions. It reflects the intensity of the local motion change at each point in the image and can be used to generate a heat map that visually shows the area and intensity of surface motion. In the context of geological disaster monitoring, a larger magnitude value may indicate potential landslide or ground-movement activity.
Referring to fig. 3 to 5, step 5: and monitoring whether landslide occurs in the target area according to the motion amplitude distribution diagram of the target area, and giving out early warning when the landslide occurs.
Step 5 is specifically as follows: a heat threshold is preset and each generated motion amplitude distribution diagram is monitored in real time; when a heat value in the diagram exceeds the threshold, an early warning signal is issued.
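Steps 42 and 5 together reduce to a few lines (a sketch assuming the heat value is the root-sum-square of the two second derivatives described above, and a purely illustrative threshold; `motion_magnitude` and `landslide_alarm` are our names, not the patent's):

```python
import numpy as np

def motion_magnitude(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Step 42: combine the second derivatives d2u/dx2 (horizontal) and
    d2v/dy2 (vertical) into a per-pixel heat value, here taken as the
    root-sum-square of the two displacement accelerations."""
    d2u_dx2 = np.gradient(np.gradient(u, axis=1), axis=1)  # x = column axis
    d2v_dy2 = np.gradient(np.gradient(v, axis=0), axis=0)  # y = row axis
    return np.sqrt(d2u_dx2 ** 2 + d2v_dy2 ** 2)

def landslide_alarm(heat: np.ndarray, heat_threshold: float) -> bool:
    """Step 5: signal an early warning when any heat value in the motion
    amplitude distribution exceeds the preset threshold."""
    return bool(np.max(heat) > heat_threshold)

# Uniform flow has zero acceleration, so no alarm; a sudden local jump
# in displacement produces a hot spot that trips the threshold.
u_flat = np.ones((16, 16)); v_flat = np.zeros((16, 16))
u_jump = u_flat.copy(); u_jump[8, 8] = 5.0
```

In practice the threshold would be tuned per site from the magnitudes observed during quiet periods, as in the distributions of figs. 3 to 5.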
The above description covers only the preferred embodiments of the present application and the principles of the technology employed. Those skilled in the art will appreciate that the scope of the application is not limited to the specific combination of the above technical features, but also encompasses other technical solutions formed by any combination of those features or their equivalents without departing from the spirit of the application, for example solutions in which the above features are replaced by technical features with similar functions disclosed in the embodiments of the present application.

Claims (9)

1. The geological disaster monitoring method based on unmanned aerial vehicle video stream analysis is characterized by comprising the following steps of:
step 1: arranging the unmanned aerial vehicle in the airspace corresponding to the monitoring area, aligning the camera of the unmanned aerial vehicle with the area to be monitored, and collecting video data of that area in real time;
Step 2: parsing the video data into consecutive image frames in time order, and converting each frame into a grayscale image through grayscale processing;
step 3: sequentially calculating the motion speed and the motion direction of pixel points between two adjacent frames based on an optical flow calculation method;
step 4: obtaining a motion amplitude distribution diagram of a target area according to the motion speed and the motion direction of the pixel points between two adjacent frames;
step 5: monitoring whether landslide occurs in the target area according to a motion amplitude distribution diagram of the target area, and giving an early warning when the landslide occurs;
step 4 comprises the following steps:
Step 41: collecting optical flow distribution pictures according to the sequence;
step 42: for each optical flow distribution picture, carrying out local motion change analysis, calculating the change quantity of each pixel point through double partial derivatives, and describing the change quantity with a heat value to obtain a motion amplitude distribution diagram of the motion amplitude at each position of the monitoring area; the calculation method is as follows:
Magnitude = sqrt((d²u/dx²)² + (d²v/dy²)²);
wherein d²u/dx² and d²v/dy² represent the double partial derivatives in the horizontal and vertical directions, respectively; Magnitude represents the motion amplitude, a quantity synthesized from the displacement accelerations in the horizontal and vertical directions; u represents the displacement of a point in the image in the horizontal direction; v represents the displacement of a point in the image in the vertical direction; d is the differential operator, denoting the calculation of a derivative or rate of change; x is the horizontal coordinate of the image, indicating the position of a pixel point in the horizontal direction; and y is the vertical coordinate of the image, indicating the position of a pixel point in the vertical direction.
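The amplitude formula of step 42 can be sketched in numpy by applying a finite-difference gradient twice to each displacement field. This is a hedged illustration: `np.gradient` is one possible discretization of the double partial derivatives, and the function name is assumed, not from the patent.

```python
import numpy as np

def motion_amplitude(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Magnitude = sqrt((d2u/dx2)^2 + (d2v/dy2)^2), with x along image
    columns (axis 1) and y along image rows (axis 0)."""
    d2u_dx2 = np.gradient(np.gradient(u, axis=1), axis=1)
    d2v_dy2 = np.gradient(np.gradient(v, axis=0), axis=0)
    return np.sqrt(d2u_dx2 ** 2 + d2v_dy2 ** 2)
```

A uniform displacement field (a rigid shift of the whole scene, e.g. slight camera drift) has zero second derivatives and therefore zero amplitude, which is why the heat map highlights local deformation rather than global motion.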
2. The method for monitoring geological disasters based on unmanned aerial vehicle video stream analysis according to claim 1, wherein step 1 comprises the following steps:
Step 11: determining a mountain range to be monitored according to requirements, and taking the mountain range as a target area;
Step 12: flying the unmanned aerial vehicle to the airspace facing the target area, aligning the camera of the unmanned aerial vehicle with the target area, and adjusting both the distance between the unmanned aerial vehicle and the target area and the focal length of the camera, so that the shooting range of the camera covers the entire mountain area to be monitored;
Step 13: video data of the area to be detected is collected in real time.
3. The method for monitoring geological disasters based on unmanned aerial vehicle video stream analysis according to claim 1, wherein, in step 1, the unmanned aerial vehicle is connected to a ground control center through a cable, which completes both information transmission and power transmission.
4. The method for monitoring geological disasters based on unmanned aerial vehicle video stream analysis according to claim 1, wherein step 2 comprises the following steps:
step 21: a plurality of mark points are preset in a picture range of a region to be monitored;
Step 22: analyzing the video data into continuous image frames, intercepting each image frame, and removing the image information of a non-monitoring area;
Step 23: aligning the continuous image frames according to the criterion that at least one third of the marking points coincide;
Step 24: and carrying out gray scale processing on the continuous image frames to obtain continuous gray scale images.
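The gray-scale processing of step 24 is not specified further in the claim; a common choice, shown here purely as an assumption, is an ITU-R BT.601 luma weighting of the RGB channels:

```python
import numpy as np

# Assumed BT.601 luma weights; the claim does not fix a particular weighting.
BT601_WEIGHTS = np.array([0.299, 0.587, 0.114])

def to_gray(frame_rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB frame to a single-channel gray image."""
    return frame_rgb.astype(np.float64) @ BT601_WEIGHTS
```

Any fixed channel weighting works for the downstream optical-flow step, since flow only needs a scalar intensity per pixel that is consistent between frames.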
5. The method for monitoring geological disasters based on unmanned aerial vehicle video stream analysis according to claim 4, wherein step 2 further comprises the following step:
Step 25: the resolution of successive gray scale images is reduced to a predetermined size.
6. The method for monitoring geological disasters based on unmanned aerial vehicle video stream analysis according to claim 5, wherein step 25 comprises the following steps:
Step 251: performing a Fourier transform on each image f1(x, y) of the successive gray-scale images:
F(n, m) = Σ (x=0 to M−1) Σ (y=0 to N−1) f1(x, y) · e^(−j2π(nx/M + my/N));
wherein F(n, m) is the transformed spectrum; n and m are the abscissa and ordinate in the frequency domain; x and y are coordinates in the spatial domain; f1(x, y) represents one of the successive gray-scale images; M and N respectively represent the width and the height of the image in the spatial domain; e^(−j2π(nx/M + my/N)) is the kernel function of the Fourier transform; and j is the imaginary unit;
Step 252: removing the high-frequency components of the spectrum F(n, m) with a low-pass filter:
G(n, m) = H(n, m) × F(n, m);
wherein H(n, m) is the transfer function: H(n, m) = 1 if D(n, m) ≤ D0, and H(n, m) = 0 if D(n, m) > D0;
wherein D0 is the cut-off frequency, set according to the required resolution reduction; D(n, m) is the distance from the point (n, m) to the center of the spectrum; and G(n, m) represents the spectrum after low-pass filtering;
Step 253: performing an inverse Fourier transform on the filtered spectrum to obtain a reduced-resolution image g(x, y):
g(x, y) = (1/(MN)) Σ (n=0 to M−1) Σ (m=0 to N−1) G(n, m) · e^(j2π(nx/M + my/N));
wherein g(x, y) represents the reduced-resolution image obtained after the inverse Fourier transform, x and y being coordinates in the spatial domain; 1/(MN) is the normalization coefficient of the inverse Fourier transform; and G(n, m) represents the low-pass-filtered spectrum on which the inverse Fourier transform is performed.
7. The method for monitoring geological disasters based on unmanned aerial vehicle video stream analysis according to claim 6, wherein step 3 comprises the following steps:
Step 31: collecting continuous gray images in real time;
Step 32: comparing the latest received gray image with the previously received gray image, and calculating the optical flow of each pixel point to obtain an optical flow distribution picture.
8. The method for monitoring geological disasters based on unmanned aerial vehicle video stream analysis according to claim 7, wherein, in step 32, the optical flow distribution is calculated as follows:
f(x, y) = (u(x, y), v(x, y));
wherein x is the abscissa of the gray image, y is the ordinate of the gray image, f(x, y) is the optical flow vector at position (x, y), and u(x, y) and v(x, y) are the displacements of position (x, y) in the horizontal and vertical directions, respectively.
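The flow field f(x, y) = (u(x, y), v(x, y)) of claim 8 can be estimated from the optical-flow constraint Ix·u + Iy·v + It = 0. The sketch below solves for a single global (u, v) by least squares, a deliberately reduced Lucas-Kanade-style example; the patent does not name a particular solver, and the function name is assumed:

```python
import numpy as np

def global_flow(prev: np.ndarray, curr: np.ndarray):
    """Least-squares (u, v) between two gray frames from Ix*u + Iy*v = -It."""
    Iy, Ix = np.gradient(prev)          # spatial gradients: rows -> y, cols -> x
    It = curr - prev                    # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# A horizontal intensity ramp shifted one pixel to the right should give
# u close to 1 and v close to 0.
prev = np.tile(np.arange(10, dtype=float), (10, 1))
curr = prev - 1.0                        # content moved +1 px along x
```

A production system would compute (u, v) per pixel over small windows (dense Lucas-Kanade or Farneback) rather than one global vector, but the constraint being solved is the same.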
9. The method for monitoring geological disasters based on unmanned aerial vehicle video stream analysis according to claim 1, wherein step 5 is specifically: a heat threshold is preset, each generated motion amplitude distribution diagram is monitored in real time, and an early-warning signal is issued when the heat value in a motion amplitude distribution diagram exceeds the heat threshold.
CN202410250306.6A 2024-03-05 Geological disaster monitoring method based on unmanned aerial vehicle video stream analysis Active CN117854256B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410250306.6A CN117854256B (en) 2024-03-05 Geological disaster monitoring method based on unmanned aerial vehicle video stream analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410250306.6A CN117854256B (en) 2024-03-05 Geological disaster monitoring method based on unmanned aerial vehicle video stream analysis

Publications (2)

Publication Number Publication Date
CN117854256A CN117854256A (en) 2024-04-09
CN117854256B true CN117854256B (en) 2024-06-11


Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN86101876A (en) * 1985-03-25 1986-09-24 美国无线电公司 Filtering system for processing reduced-resolution video images
CN108119764A (en) * 2017-12-26 2018-06-05 东莞理工学院 Time-reversal adaptive-mesh gas pipeline leak detection method
CN111006593A (en) * 2019-12-13 2020-04-14 武汉纵横天地空间信息技术有限公司 Method and system for monitoring mountain landform and predicting landslide by using unmanned aerial vehicle
CN112488177A (en) * 2020-11-26 2021-03-12 金蝶软件(中国)有限公司 Image matching method and related equipment
CN113673454A (en) * 2021-08-26 2021-11-19 北京声智科技有限公司 Remnant detection method, related device, and storage medium
CN115035182A (en) * 2022-06-06 2022-09-09 桂林理工大学 Landslide disaster early warning method and system in mountainous area
CN115762063A (en) * 2022-11-12 2023-03-07 高精特(成都)大数据科技有限公司 Debris flow early warning method, device, system and medium based on image and radar
CN115950435A (en) * 2023-02-20 2023-04-11 黄河勘测规划设计研究院有限公司 Real-time positioning method for unmanned aerial vehicle inspection image
CN116343436A (en) * 2022-12-28 2023-06-27 浙江大华技术股份有限公司 Landslide detection method, landslide detection device, landslide detection equipment and landslide detection medium
CN116363835A (en) * 2023-03-30 2023-06-30 湖南科技大学 Geological disaster induced landslide monitoring device
CN116612609A (en) * 2023-07-21 2023-08-18 湖北通达数科科技有限公司 Disaster early warning method and system based on landslide hazard prediction
CN116631158A (en) * 2023-06-12 2023-08-22 云南电网有限责任公司昭通供电局 Landslide early warning device based on image recognition and computer vision
CN117152463A (en) * 2023-08-11 2023-12-01 智洋创新科技股份有限公司 Debris flow head monitoring method based on video analysis
CN117237597A (en) * 2023-08-30 2023-12-15 福信富通科技股份有限公司 Data processing terminal based on Beidou satellite data and AI graph fusion
CN117523787A (en) * 2023-11-24 2024-02-06 广西旅发科技有限公司 Landslide safety monitoring system and early warning method thereof
CN117553737A (en) * 2023-11-13 2024-02-13 内蒙古高新科技控股有限责任公司 Slope safety monitoring method and device based on lead telemetry information
CN117635649A (en) * 2023-11-29 2024-03-01 长安大学 Landslide monitoring method and system


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Research status and trends of InSAR applied to early identification of geological hazards; Yang Xili; Chen Qian; Ren Guangming; Lyu Xuehai; Qiao Yunchi; Yangtze River (人民长江); 2019-12-28 (No. S2); full text *
A novel palmprint recognition method under low-resolution conditions; Yuan Weiqi; Zhao Weipeng; Sang Haifeng; Application Research of Computers (计算机应用研究); 2008-10-15 (No. 10); full text *
Application of three-dimensional attitude monitoring technology in seismic well logging; Jiang Xin; Tuo Xianguo; Mao Xiaobo; China Measurement & Test (中国测试); 2015-03-31 (No. 03); full text *
A non-contact UAV-based method for monitoring surface displacement of high slopes; Li Zhuyou; Yang Guohua; Sun Hongwen; Pu Rui; Fan Guiming; Yunnan Water Power (云南水力发电); 2019-11-15 (No. S2); full text *

Similar Documents

Publication Publication Date Title
CN111326023B (en) Unmanned aerial vehicle route early warning method, device, equipment and storage medium
KR101105795B1 (en) Automatic processing of aerial images
CN106356757B A power line inspection method for unmanned aerial vehicles based on human-eye visual characteristics
KR100377067B1 (en) Method and apparatus for detecting object movement within an image sequence
CN111241988B (en) Method for detecting and identifying moving target in large scene by combining positioning information
CN103778645B (en) Circular target real-time tracking method based on images
CN105957109A (en) Target tracking method and device
CN105549614A (en) Target tracking method of unmanned plane
CN114038193B (en) Intelligent traffic flow data statistics method and system based on unmanned aerial vehicle and multi-target tracking
CN115661204B (en) Collaborative searching and tracking positioning method for moving target by unmanned aerial vehicle cluster
CN112215860A (en) Unmanned aerial vehicle positioning method based on image processing
CN108108697A A real-time UAV video object detection and tracking method
CN109483507B (en) Indoor visual positioning method for walking of multiple wheeled robots
CN115346368B (en) Traffic road side sensing system and method based on integrated fusion of far-view and near-view multiple sensors
CN110189363A A low-altitude multi-view video speed measurement method for moving targets in airport scenes
CN114034296A (en) Navigation signal interference source detection and identification method and system
CN110750153A (en) Dynamic virtualization device of unmanned vehicle
CN110824495B (en) Laser radar-based drosophila visual inspired three-dimensional moving target detection method
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN109765931B (en) Near-infrared video automatic navigation method suitable for breakwater inspection unmanned aerial vehicle
CN117854256B (en) Geological disaster monitoring method based on unmanned aerial vehicle video stream analysis
CN110989645B (en) Target space attitude processing method based on compound eye imaging principle
CN116188334B (en) Automatic repair method and device for lane line point cloud
CN117854256A (en) Geological disaster monitoring method based on unmanned aerial vehicle video stream analysis
CN111311640A (en) Unmanned aerial vehicle identification and tracking method based on motion estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant