CN115410095B - Disaster information acquisition method and device, electronic equipment and computer readable medium - Google Patents

Disaster information acquisition method and device, electronic equipment and computer readable medium

Info

Publication number
CN115410095B
CN115410095B (application CN202211341353.9A)
Authority
CN
China
Prior art keywords
image
disaster
geological
area
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211341353.9A
Other languages
Chinese (zh)
Other versions
CN115410095A (en)
Inventor
闫烨琛
席雪萍
于向吉
高学飞
罗福贵
袁婷婷
齐超
李亚平
杨斌
李强
赵炳群
李军
高鹏
韩慧慧
聂黎楠
杨问省
孙振营
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Geological Engineering Survey And Design Institute Co ltd
Original Assignee
Tianjin Geological Engineering Survey And Design Institute Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Geological Engineering Survey And Design Institute Co ltd filed Critical Tianjin Geological Engineering Survey And Design Institute Co ltd
Priority to CN202211341353.9A
Publication of CN115410095A
Application granted
Publication of CN115410095B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Astronomy & Astrophysics (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the present disclosure disclose a disaster information acquisition method and apparatus, an electronic device, and a computer readable medium. One embodiment of the method comprises: in response to the existence, in a remote sensing image sequence of a target area, of remote sensing images containing large-scale disaster features, determining at least one large-scale disaster area to be processed from the remote sensing image sequence; in response to the existence of a real-scene image containing small-scale disaster features among at least one real-scene image corresponding to the at least one large-scale disaster area to be processed, determining at least one small-scale disaster area to be processed based on the at least one real-scene image; and monitoring the at least one small-scale disaster area to be processed in real time to determine disaster information of the target area. This implementation realizes early identification and monitoring of disasters and improves the effectiveness of disaster monitoring.

Description

Disaster information acquisition method and device, electronic equipment and computer readable medium
Technical Field
The embodiment of the disclosure relates to the technical field of geological monitoring, in particular to a disaster information acquisition method, a disaster information acquisition device, electronic equipment and a computer readable medium.
Background
China has complex and varied terrain: mountainous regions account for about two thirds of the country's total area, and there are also vast plateaus and basins. The landforms are of every variety and the geological structures are complex, so sudden geological disasters are numerous and cause heavy losses of life and property. For example, a typical region is a low-mountain and hilly area shaped by erosion and denudation, with elevations of 40-800 meters. It lies in a continental monsoon climate zone: the rainy season runs from June to September, rainfall is heavy and concentrated (more than 80% of annual rainfall), and the main flood season in July and August is also the period when geological disasters occur most frequently. Owing to the special geological and geomorphic conditions and the abnormal rainfall in the flood season, such mountain areas are prone to sudden geological disasters and are key areas for their prevention and control. Disasters usually cover large areas, the main types being collapse, landslide, and debris flow. Under the combined action of natural factors and human activities, sudden geological disasters such as collapse, landslide, and debris flow have appeared in these areas to varying degrees, and some have developed into major geological hazards that are widely distributed and highly damaging. However, in existing practice a disaster is usually monitored on site only for a short period before and after it occurs, so disasters are difficult to predict and judge accurately in advance, and the effectiveness of disaster monitoring is low.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a disaster information acquisition method, apparatus, electronic device and computer readable medium to solve the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a disaster information obtaining method, including: responding to the existence of remote sensing images containing large-scale disaster features in a remote sensing image sequence of a target area, and determining at least one large-scale disaster area to be processed from the remote sensing image sequence; in response to the existence of a real-scene image containing small-scale disaster features in at least one real-scene image corresponding to the at least one large-scale disaster area to be processed, determining at least one small-scale disaster area to be processed based on the at least one real-scene image; and monitoring the at least one small-scale disaster area to be processed in real time, and determining disaster information of the target area.
In a second aspect, some embodiments of the present disclosure provide a disaster information acquisition apparatus, including: a first disaster area determination unit configured to determine, in response to the existence of remote sensing images containing large-scale disaster features in a remote sensing image sequence of a target area, at least one large-scale disaster area to be processed from the remote sensing image sequence; a second disaster area determination unit configured to determine, in response to the existence of a live-action image containing a small-scale disaster feature among at least one live-action image corresponding to the at least one large-scale disaster area to be processed, at least one small-scale disaster area to be processed based on the at least one live-action image; and a disaster information determination unit configured to monitor the at least one small-scale disaster area to be processed in real time and determine disaster information of the target area.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first aspect.
The above embodiments of the present disclosure have the following beneficial effects: the disaster information acquisition method of some embodiments of the present disclosure improves the effectiveness of disaster monitoring. Specifically, the reason the effectiveness of disaster monitoring has been low is that existing disaster monitoring methods can monitor a disaster only within a short time before and after it occurs, making advance prediction and accurate judgment difficult. Based on this, the disaster information acquisition method of some embodiments of the present disclosure first acquires a remote sensing image sequence of a target area and determines at least one large-scale disaster area to be processed from the remote sensing image sequence. The remote sensing images are obtained from a remote sensing satellite, can cover the geological information of the target area over a wide range, and help identify possible disasters from a large-scale perspective. Then, at least one live-action image corresponding to the at least one large-scale disaster area to be processed is obtained, and at least one small-scale disaster area to be processed is determined based on the at least one live-action image, providing a further judgment of possible disaster areas. Finally, the at least one small-scale disaster area to be processed is monitored in real time, and the disaster information of the target area is determined. Early identification and monitoring of disasters are thus realized, improving the effectiveness of disaster monitoring.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements are not necessarily drawn to scale.
Fig. 1 is a schematic view of an application scenario of a disaster information acquisition method according to some embodiments of the present disclosure;
fig. 2 is a flow diagram of some embodiments of a disaster information acquisition method according to the present disclosure;
FIG. 3 is a flow diagram of further embodiments of a disaster information acquisition method according to the present disclosure;
FIG. 4 is a flow chart of still further embodiments of disaster information acquisition methods according to the present disclosure;
FIG. 5 is a schematic block diagram of some embodiments of disaster information acquisition devices according to the present disclosure;
FIG. 6 is a schematic block diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and the embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence of the functions performed by the devices, modules or units.
It is noted that the modifiers "a" and "an" in this disclosure are illustrative rather than restrictive, and those skilled in the art should understand them as meaning "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of an application scenario of a disaster information acquisition method according to some embodiments of the present disclosure.
As shown in fig. 1, a remote sensing satellite 102 may acquire a sequence of remote sensing images of a target area 105 and transmit the sequence to a server 101. Disasters usually involve large areas, so the disaster characteristics of large areas of the ground surface can be preliminarily monitored through remote sensing images. The server 101 may analyze the sequence of remote sensing images and determine at least one large-scale disaster area to be processed from it; that is, the large-scale disaster area to be processed is determined via the remote sensing satellite 102. Then, the unmanned aerial vehicle 103 may acquire at least one live-action image corresponding to the large-scale disaster area to be processed, and after obtaining the at least one live-action image, the server 101 determines at least one small-scale disaster area to be processed based on it; that is, the small-scale disaster area to be processed is determined via the unmanned aerial vehicle 103. Finally, the server 101 may control the measuring tower 104 corresponding to the target area 105 to acquire on-site disaster information of the target area 105. By monitoring the disaster at multiple scales in this way, early identification and monitoring of the disaster are realized, and the effectiveness of disaster monitoring is improved.
It should be understood that the number of servers 101, remote sensing satellites 102, drones 103, measurement towers 104, and target areas 105 in fig. 1 are merely illustrative. There may be any number of servers 101, remote sensing satellites 102, drones 103, measurement towers 104, and target areas 105, as desired for an implementation.
With continued reference to fig. 2, fig. 2 illustrates a flow 200 of some embodiments of a disaster information acquisition method according to the present disclosure. The disaster information acquisition method comprises the following steps:
step 201, responding to the existence of remote sensing images containing large-scale disaster features in the remote sensing image sequence of the target area, and determining at least one large-scale disaster area to be processed from the remote sensing image sequence.
In some embodiments, the execution subject of the disaster information acquisition method (e.g., the server 101 shown in fig. 1) may receive the remote sensing image sequence of the target area 105 from the remote sensing satellite 102 via a wired or wireless connection. The remote sensing image sequence may include a plurality of images of the target area 105 captured by the remote sensing satellite 102 at set time intervals, so that feature information of the target area 105 can be acquired from high altitude and from multiple angles. The execution subject may analyze the remote sensing image sequence, and when a region containing a large-scale disaster feature (for example, a geological structure anomaly or a large-area displacement) is found in a remote sensing image, a large-scale disaster area to be processed corresponding to the target area 105 can be determined in that remote sensing image. Here, the large-scale disaster area to be processed is the image region in the remote sensing image corresponding to the disaster area.
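The "analyze the image sequence and flag large-scale disaster features" step can be illustrated with a deliberately simple change-detection sketch. This is an illustrative assumption, not the patent's actual detection algorithm: the function names, the pixel-differencing criterion, and the thresholds are all invented for the example.

```python
# Hypothetical sketch: flag candidate large-scale disaster areas by
# differencing consecutive co-registered grayscale remote sensing images.
# Thresholds and criteria are illustrative, not from the patent.

def diff_mask(img_a, img_b, threshold):
    """Boolean mask of pixels whose brightness changed by more than
    `threshold` between two images of the same size (lists of rows)."""
    return [
        [abs(a - b) > threshold for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

def candidate_areas(image_sequence, threshold=30, min_changed=4):
    """Scan a time-ordered image sequence and collect indices of frames
    showing large-area change relative to the previous frame -- a crude
    stand-in for detecting 'large-scale disaster features'."""
    candidates = []
    for t in range(len(image_sequence) - 1):
        mask = diff_mask(image_sequence[t], image_sequence[t + 1], threshold)
        changed = sum(v for row in mask for v in row)
        if changed >= min_changed:
            candidates.append(t + 1)
    return candidates
```

In a real system the mask would additionally be grouped into connected regions so that each region becomes one "disaster area to be processed".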
Step 202, in response to the existence of a real-scene image containing a small-scale disaster feature in at least one real-scene image corresponding to the at least one large-scale disaster area to be processed, determining at least one small-scale disaster area to be processed based on the at least one real-scene image.
In some embodiments, when a large-scale disaster area to be processed is determined from the sequence of remote sensing images, this indicates that a disaster may exist in the target area 105. The execution subject can then acquire live-action images of the large-scale disaster area to be processed collected by the unmanned aerial vehicle 103. Because the live-action images are acquired by the unmanned aerial vehicle 103, they can reflect the feature information of the possible disaster area from low altitude. After acquiring a live-action image, the execution subject may analyze it further, and when a small-scale disaster feature (for example, a color feature or a texture feature) is present in the live-action image, a small-scale disaster area to be processed can be determined from it. Here, the small-scale disaster area to be processed is the image region in the live-action image corresponding to the disaster area.
And 203, monitoring the at least one small-scale disaster area to be processed in real time, and determining disaster information of the target area.
In some embodiments, after determining the small-scale disaster area to be processed, the execution subject may collect real-time information (which may be, for example, topographic information, rainfall information, etc.) of the small-scale disaster area to be processed through the measurement tower 104 disposed near the small-scale disaster area to be processed, and finally determine disaster information of the target area 105 according to the real-time information.
The disaster information acquisition method disclosed in some embodiments of the disclosure improves the effectiveness of disaster monitoring. Specifically, the reason the effectiveness of disaster monitoring has been low is that existing disaster monitoring methods can monitor a disaster only within a short time before and after it occurs, making advance prediction and accurate judgment difficult. Based on this, the disaster information acquisition method of some embodiments of the present disclosure first acquires a remote sensing image sequence of a target area and determines at least one large-scale disaster area to be processed from the remote sensing image sequence. The remote sensing images are obtained from a remote sensing satellite, can cover the geological information of the target area over a wide range, and help identify possible disasters from a large-scale perspective. Then, at least one live-action image corresponding to the at least one large-scale disaster area to be processed is obtained, and at least one small-scale disaster area to be processed is determined based on the at least one live-action image, providing a further judgment of possible disaster areas. Finally, the at least one small-scale disaster area to be processed is monitored in real time, and the disaster information of the target area is determined. Early identification and monitoring of disasters are thus realized, improving the effectiveness of disaster monitoring.
With further reference to fig. 3, a flow 300 of further embodiments of a disaster information acquisition method is shown. The process 300 of the disaster information acquisition method includes the following steps:
step 301, in response to the existence of a remote sensing image containing large-scale disaster features in a remote sensing image sequence of a target area, obtaining an initial image to be processed based on the remote sensing image sequence.
In some embodiments, the execution subject may process the sequence of remote sensing images to obtain an initial image to be processed.
In some optional implementations of some embodiments, the obtaining an initial to-be-processed image based on the remote sensing image sequence may include:
the method comprises the steps of firstly, correcting each remote sensing image in the remote sensing image sequence through a preset correction operation to obtain a corrected image sequence corresponding to the remote sensing image sequence.
The preset correction operation includes at least one of: data source selection, radiometric correction, and geometric correction. Specifically, in the data source selection stage the execution subject needs to weigh factors such as the spatial extent of the survey area, the regional environment, the characteristics of the interpretation targets, and the economic cost of data collection. The required spatial resolution, spectral resolution, temporal resolution, and other parameters of the raw remote sensing data are determined according to these requirements, and the remote sensing data source is then selected.
The purpose of radiometric correction is to eliminate or correct image distortion caused by radiometric errors. A radiometric error is the discrepancy between the value detected by the remote sensing sensor and the actual spectral radiance of the ground object, caused by the characteristics of the sensor itself, atmospheric effects, and the illumination conditions of the ground object (terrain and solar elevation angle) when the sensor receives electromagnetic radiation from the ground. Specifically, the energy received by the optical lens of a remote sensing satellite is recorded by photoelectric conversion and assigned a corresponding gray value (so the light energy is closely tied to the gray value), and it consists of three parts: mainly the light reflected by the target object, but also light reflected by ground objects adjacent to the target and energy scattered by the atmosphere. The measured value of the sensor is therefore inconsistent with the actual spectral radiance of the ground object, producing a radiometric error. In summary, radiometric correction generally covers three aspects: sensor calibration, atmospheric correction, and terrain and solar elevation correction.
a. Sensor calibration
Sensor correction (radiometric calibration) eliminates the radiometric error introduced by the sensor itself and converts the dimensionless DN values recorded by the sensor into top-of-atmosphere radiance or reflectance with real physical meaning.
Radiometric calibration establishes a mathematical functional relationship between the DN value and the actual radiance value in order to obtain the absolute radiance of the target (top-of-atmosphere radiance). The calculation formula is as follows:
$$ L_\lambda = \mathrm{gain} \cdot DN + \mathrm{offset} $$

where $L_\lambda$ is the radiance value of band $\lambda$, $DN$ is the original brightness value, and $\mathrm{gain}$ and $\mathrm{offset}$ are the sensor gain and offset, respectively.
The above equation converts the DN value to radiance.
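The per-band linear DN-to-radiance conversion described above can be sketched directly in Python. The gain and offset values used below are illustrative; real values come from the sensor's calibration metadata.

```python
def dn_to_radiance(dn, gain, offset):
    """Convert a dimensionless DN value to at-sensor spectral radiance
    using the per-band linear calibration L = gain * DN + offset."""
    return gain * dn + offset
```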
The relationship between top-of-atmosphere reflectance and radiance is calculated as follows:

$$ \rho_\lambda = \frac{\pi \cdot L_\lambda \cdot d^2}{ESUN_\lambda \cdot \cos\theta_s} $$

where $\rho_\lambda$ is the reflectance, $d$ is the earth-sun distance in astronomical units, $ESUN_\lambda$ is the apparent solar spectral irradiance of band $\lambda$, and $\theta_s$ is the solar zenith angle.
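The radiance-to-reflectance relationship just given translates to a few lines of Python. The input values in the test are illustrative, not sensor-specific constants.

```python
import math

def toa_reflectance(radiance, esun, earth_sun_dist_au, sun_zenith_deg):
    """Top-of-atmosphere reflectance from band radiance:
    rho = pi * L * d^2 / (ESUN * cos(theta_s))."""
    theta = math.radians(sun_zenith_deg)
    return math.pi * radiance * earth_sun_dist_au ** 2 / (esun * math.cos(theta))
```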
b. Atmospheric correction
Sensor calibration only yields a more accurate top-of-atmosphere radiance, whereas the reflectance characteristics of ground objects are determined by the solar radiance reflected from the earth's surface. The brightness of the solar radiation reflected by the surface inevitably changes as it passes through the atmosphere, so atmospheric correction is required to obtain more accurate values of the surface-reflected solar radiance. Atmospheric correction mainly eliminates the influence of atmospheric absorption and atmospheric scattering on radiative transfer.
According to the correction principle, atmospheric correction is divided into two types, namely a statistical model and a physical model.
A statistical model: the statistical model is established based on the correlation between the surface variables and the remote sensing data, and the atmosphere and the geometric conditions of the image acquisition are not required to be known. The method has the advantages of simplicity, feasibility and less required parameters, but the statistical model is only suitable for local regions due to difference among the regions and has no universality.
Physical model: the physical model is established according to the physical laws of the remote sensing system, and the model can be improved by continuously adding new knowledge and information. The required parameters are many and often difficult to obtain, and the model is complex.
c. Terrain and sun altitude correction
The purpose of terrain correction is to eliminate radiance differences caused by terrain, so that terrain surfaces with different slopes but the same reflective properties have the same brightness values in the image; it mainly reduces the shadowed parts of the remote sensing image. Terrain correction is usually performed with the cosine correction method using a semi-empirical coefficient C (the C-correction). The correction formula is as follows:
$$ L_H = L_T \cdot \frac{\cos\theta_s + C}{\cos i + C} $$

where $L_H$ is the equivalent observed value of the pixel on the horizontal plane (the corrected value), $L_T$ is the equivalent observed value of the pixel on the inclined plane (the original pixel value), $i$ is the solar incidence angle, $\theta_s$ is the solar zenith angle, and $C$ is the semi-empirical coefficient.
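A minimal sketch of the semi-empirical C terrain correction follows. The coefficient value and angles in the test are illustrative; in practice C is fitted per band from a regression of pixel values against the illumination term.

```python
import math

def c_correction(pixel_slope, sun_zenith_deg, incidence_deg, c):
    """Semi-empirical C terrain correction: map the observed pixel value
    on a slope to its equivalent on a horizontal surface,
    L_H = L_T * (cos(theta_s) + C) / (cos(i) + C)."""
    cos_z = math.cos(math.radians(sun_zenith_deg))
    cos_i = math.cos(math.radians(incidence_deg))
    return pixel_slope * (cos_z + c) / (cos_i + c)
```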
The purpose of solar elevation correction is to eliminate the radiance differences between images acquired in different seasons, at different times, and in different places, which arise from different solar elevation angles: an image acquired under oblique solar illumination is corrected to what it would be under vertical illumination. After solar elevation correction, all remote sensing images appear as if formed under vertically incident sunlight. The basic correction formula is as follows:
$$ L' = \frac{L}{\sin h} $$

where $L'$ is the corrected brightness value, $L$ is the original brightness value, and $h$ is the solar elevation angle.
A remote sensing image does not always require solar elevation correction; it is necessary mainly when a time series of remote sensing images is analyzed.
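The solar elevation normalization above is a one-liner in code; the elevation angle in the test is illustrative.

```python
import math

def sun_elevation_correction(brightness, sun_elevation_deg):
    """Normalize a brightness value to vertical solar illumination:
    L' = L / sin(solar elevation angle)."""
    return brightness / math.sin(math.radians(sun_elevation_deg))
```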
In the remote sensing imaging process, factors inside the sensor, factors of the remote sensing platform, and factors of the earth itself cause the image pixels produced by the sensor to be squeezed, stretched, distorted, or offset relative to the actual positions of the ground targets; this is geometric distortion.
Geometric distortion introduces errors into quantitative analysis, change detection, image fusion, and map measurement or updating based on the remote sensing image, so the geometric distortion of the image needs to be corrected; this is geometric correction. Geometric correction directly uses ground control points to establish a mathematical model between the pixel coordinates and the geographic coordinates of the target object, realizing the transformation of pixel positions between different coordinate systems.
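A minimal sketch of this ground-control-point approach, assuming a first-order (affine) polynomial model between pixel and geographic coordinates; the control-point values are illustrative, and real workflows typically use more points and higher-order models:

```python
import numpy as np

# Ground control points: pixel (col, row) -> geographic (x, y).
# The coordinate values are illustrative only.
pixel_pts = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], dtype=float)
geo_pts = np.array([[500000.0, 4000000.0],
                    [500300.0, 4000000.0],
                    [500000.0, 3999700.0],
                    [500300.0, 3999700.0]])

# First-order polynomial (affine) model: [x, y] = [col, row, 1] @ coeffs,
# fitted by least squares over the control points.
design = np.hstack([pixel_pts, np.ones((len(pixel_pts), 1))])
coeffs, *_ = np.linalg.lstsq(design, geo_pts, rcond=None)

def pixel_to_geo(col, row):
    """Map a pixel coordinate to a geographic coordinate with the fitted model."""
    return np.array([col, row, 1.0]) @ coeffs

print(pixel_to_geo(50, 50))  # centre of the illustrative patch
```

With more control points than unknowns, the least-squares fit also averages out small measurement errors in the individual points.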
And after each remote sensing image is corrected, combining the corrected remote sensing images into a corrected image sequence.
And secondly, fusing the corrected images in the corrected image sequence into a target corrected image.
Image fusion is an image processing technique that resamples a low-resolution multispectral image together with a high-resolution single-band image to generate a high-resolution multispectral remote sensing image, so that the processed image has both high spatial resolution and multispectral characteristics. The execution subject may perform fusion processing on the corrected images to obtain target corrected images.
A target corrected image is obtained after each corrected image undergoes the above image fusion processing.
And thirdly, obtaining an initial image to be processed based on the target correction image sequence corresponding to the correction image sequence.
The coverage of a single scene of a remote sensing image is limited, especially for high-resolution remote sensing images, and in many cases multiple scene images are required to cover the entire study area. The executing subject then needs to seamlessly stitch the different image files into one complete image containing the study area; this is image stitching. Through stitching, a ground image covering a wider range than a single sensor can obtain is produced. The images participating in stitching can be multi-source: they may be acquired by the same sensor at different times or by different sensors at different times, but adjacent images must have a certain degree of overlap and the same number of bands. Before stitching, the multi-source images must be registered, and after stitching, the seams between the original images are eliminated by subsequent processing. The former is achieved by geometric correction, the latter by dodging (tone smoothing) of the images.
Image stitching generally includes the following several main processes:
Image positioning: i.e., the geometric registration between adjacent images, performed to determine the overlapping area of the images. The accuracy with which the overlapping area is determined directly influences the image stitching effect.
Color balance: color balance is a key link in the digital stitching of remote sensing images. Because the radiation levels of the images to be stitched differ, images of different time phases or imaging conditions have different brightness; even if the geometric registration is ideal, several images stitched together without tone adjustment cannot be applied well in various fields because their tones differ. In addition, images with similar imaging phases and conditions may still have inconsistent tones due to random sensor errors, which affects the application result. Therefore, tone adjustment must be performed, including color balance within an image and color balance between images.
Seam line processing: seam line processing can be subdivided into searching for seam lines in the overlapping areas and eliminating the seams. The quality of the seam line processing directly affects the stitched image. In the stitching process, even if the two images have undergone tone adjustment, the tones at their seam cannot be made completely consistent, so the overlapping areas of the images need tone smoothing to eliminate the stitching seam.
For images positioned by geographic coordinates, a stitching mode based on geographic coordinates is adopted, and the overlapping area between images is obtained by calculating the coordinates of the images; images without geocoding need to be stitched based on pixels, in which case the overlapping area can be determined by feature point matching between the images or by manual designation. After the stitched image is obtained, the execution subject can crop it to obtain the initial image to be processed.
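For the geographic-coordinate stitching mode described above, the overlapping area of two geo-referenced images can be computed directly from their extents. A minimal sketch, with extent tuples as an assumed representation:

```python
def geo_overlap(extent_a, extent_b):
    """Overlap of two geo-referenced image extents.

    Each extent is (min_x, min_y, max_x, max_y) in a shared coordinate
    system; returns the overlapping rectangle, or None if the images do
    not overlap (and therefore cannot be seamlessly mosaicked).
    """
    min_x = max(extent_a[0], extent_b[0])
    min_y = max(extent_a[1], extent_b[1])
    max_x = min(extent_a[2], extent_b[2])
    max_y = min(extent_a[3], extent_b[3])
    if min_x >= max_x or min_y >= max_y:
        return None
    return (min_x, min_y, max_x, max_y)

# Two adjacent scenes sharing a 200 m strip in x.
print(geo_overlap((0, 0, 1000, 1000), (800, 0, 1800, 1000)))
```

For images without geocoding, this rectangle would instead come from feature point matching or manual designation, as the text notes.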
In some optional implementations of some embodiments, the remote sensing image comprises a corresponding low-resolution multispectral image and a high-resolution single-band image; and, for the correction images in the correction image sequence, fusing the correction images into a target correction image, may include the steps of:
the first step is to spatially register the low-resolution multispectral image and the high-resolution single-band image included in the corrected image.
Wherein the spatial registration may be used to match the locations of the same target within the low resolution multispectral image and the high resolution single band image.
And secondly, fusing the low-resolution multispectral image and the high-resolution single-band image after spatial registration to obtain a target correction image.
The image fusion process can be divided into two processes: data preparation and image data fusion.
a. Data preparation
First, the original remote sensing images to be fused are collected and appropriately preprocessed: problematic scan lines and noise in the original images are removed to improve image quality and hence the fusion effect; the fusion range is cropped, which reduces the number of pixels to fuse and increases speed; and, most importantly, the images to be fused are spatially registered. High-precision registration of multi-source image data before fusion is a critical factor in improving fusion quality.
b. Image data fusion
When fusing image data, an appropriate fusion method is selected according to actual needs and the purpose of fusion, and the fusion is carried out according to the principles and steps of that method. In the fusion process, each transformation step has a series of parameters to be determined and selected, and these parameters influence the final fusion effect; a fusion algorithm therefore requires several trials, and different fusion methods also need to be compared before the most suitable method and its parameters can be determined. The fused remote sensing images obtained by the various algorithms can be further processed according to actual needs, such as matching processing and type transformation, so that the studied target is represented more clearly. Common fusion methods include principal component transformation, product transformation, BROVEY transformation, and wavelet transformation.
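Of the fusion methods listed, the BROVEY transformation is simple enough to sketch: each low-resolution band is rescaled by the ratio of the high-resolution band to the sum of the bands. The toy arrays below are illustrative; a real implementation would operate on registered, resampled imagery:

```python
import numpy as np

def brovey_fuse(multispectral, pan):
    """BROVEY transform fusion.

    multispectral -- array of shape (bands, H, W), already resampled and
                     spatially registered to the panchromatic band
    pan           -- high-resolution single-band array of shape (H, W)

    Each band is scaled by pan / sum(bands), so the fused result keeps
    the band ratios (color) while inheriting the pan band's detail.
    """
    total = multispectral.sum(axis=0)
    total[total == 0] = 1.0  # avoid division by zero in empty pixels
    return multispectral * (pan / total)

ms = np.full((3, 2, 2), 50.0)   # three identical toy bands
pan = np.full((2, 2), 210.0)    # brighter, high-resolution band
fused = brovey_fuse(ms, pan)
print(fused[0, 0, 0])           # 50 * 210 / 150
```

Because every band is multiplied by the same per-pixel ratio, the spectral proportions of the input bands are preserved in the output.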
In some optional implementations of some embodiments, the obtaining an initial to-be-processed image based on a target corrected image sequence corresponding to the corrected image sequence may include:
the first step is to determine an overlapping image area of adjacent images in the target correction image sequence, and obtain at least one overlapping image area corresponding to the target correction image sequence.
In order to obtain the initial image to be processed, the execution subject needs to determine an overlapping image region of adjacent images in the target correction image sequence. For example, the execution subject may first determine at least one feature point in the adjacent image, and then perform an overlay operation on the adjacent image with reference to the at least one feature point, and may determine an overlapping image area.
In a second step, a global image is constructed based on the at least one overlapping image area.
The execution subject may construct a global image based on the at least one overlapping image region. For example, adjacent images in the target correction image sequence are acquired at set intervals while the remote sensing satellite is in motion, so the image content of adjacent images is not exactly the same. After the overlapping image area is determined, the execution subject may append the non-overlapping image areas one by one, so that the assembled image grows until a global image is obtained.
And thirdly, cutting out an initial image to be processed corresponding to the target area from the global image.
Generally, a remote sensing image covers a large area, so the global image may contain regions outside the target area. The execution subject may crop the global image based on the target area to obtain the initial image to be processed.
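The cropping step can be sketched as array slicing once the target area's geographic extent is converted to pixel indices; the extent representation and pixel size below are assumptions for illustration:

```python
import numpy as np

def crop_to_target(global_image, origin, target_extent, pixel_size):
    """Cut the target area out of the mosaicked global image.

    origin        -- (min_x, max_y): geographic coordinates of the
                     global image's top-left corner
    target_extent -- (min_x, min_y, max_x, max_y) of the target area
    pixel_size    -- ground size of one pixel (same units as extents)
    """
    col0 = int((target_extent[0] - origin[0]) / pixel_size)
    row0 = int((origin[1] - target_extent[3]) / pixel_size)
    cols = int((target_extent[2] - target_extent[0]) / pixel_size)
    rows = int((target_extent[3] - target_extent[1]) / pixel_size)
    return global_image[row0:row0 + rows, col0:col0 + cols]

img = np.arange(100).reshape(10, 10)
# Global image spans x in [0, 100] and y in [0, 100] with 10 m pixels.
patch = crop_to_target(img, (0, 100), (20, 50, 60, 90), 10)
print(patch.shape)
```

Note that row indices grow downward while y coordinates grow upward, which is why the row offset is measured from the maximum y.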
In some optional implementations of some embodiments, the determining the overlapping image region of the adjacent images in the target corrected image sequence may include:
firstly, determining an image area corresponding to the target area in each target correction image in the target correction image sequence.
The execution subject may first determine an image region corresponding to the target region in each target correction image in the sequence of target correction images.
And secondly, geometrically registering two adjacent target correction images in the target correction image sequence, and determining an overlapped image area of the two target correction images.
The execution subject may identify an image area where two adjacent target correction images are the same, and overlap the two adjacent target correction images based on the same image area, thereby determining an overlapped image area of the two target correction images.
Step 302, determining at least one large-scale disaster area to be processed from the initial image to be processed.
In some embodiments, after obtaining the initial to-be-processed image, the executing subject may identify an image region containing large-scale disaster features from the initial to-be-processed image, and determine at least one large-scale to-be-processed disaster region.
In some optional implementations of some embodiments, the determining at least one large-scale disaster area to be processed from the initial image to be processed may include:
firstly, acquiring a reference large-scale feature of the target area.
The execution subject can perform image analysis on the initial image to be processed to obtain the reference large-scale feature. Wherein the reference large scale feature may comprise at least one of: color, shape, texture, spatial location.
And secondly, importing the initial image to be processed into a pre-trained remote sensing information model to obtain the current characteristics corresponding to the target area.
The remote sensing information model can be a model obtained by a technician according to historical remote sensing image training. For example, the executive body may train the remote sensing information model by using the historical remote sensing image as an input of model training and using the feature information corresponding to the historical remote sensing image as an output of the model training. The execution subject can import the initial image to be processed into the remote sensing information model to obtain the corresponding current characteristics.
And thirdly, comparing the current characteristic with the reference large-scale characteristic, and determining at least one large-scale disaster area to be processed corresponding to the target area.
If the current feature is the same as the reference large-scale feature, it can be considered that no geological disaster has occurred in the target area; if the current feature differs greatly from the reference large-scale feature, a geological disaster may have occurred, and the execution subject can further determine at least one large-scale disaster area to be processed corresponding to the target area.
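A hedged sketch of this comparison step, assuming features are represented as normalized scores per region and that a fixed difference threshold separates "no disaster" from "possible disaster"; the feature names and the threshold value are illustrative, not the patent's specification:

```python
def flag_disaster_regions(current_features, reference_features, threshold=0.3):
    """Flag candidate large-scale disaster regions by comparing the model's
    current features against reference features of the same regions.

    Both inputs map region id -> dict of normalized feature scores
    (e.g. color, shape, texture); a region whose mean absolute difference
    exceeds `threshold` is treated as a possible disaster area.
    """
    flagged = []
    for region, cur in current_features.items():
        ref = reference_features[region]
        diff = sum(abs(cur[k] - ref[k]) for k in ref) / len(ref)
        if diff > threshold:
            flagged.append(region)
    return flagged

reference = {"r1": {"color": 0.8, "texture": 0.5},
             "r2": {"color": 0.4, "texture": 0.6}}
current = {"r1": {"color": 0.79, "texture": 0.52},  # essentially unchanged
           "r2": {"color": 0.9, "texture": 0.1}}    # large change
print(flag_disaster_regions(current, reference))
```

In practice the threshold would be tuned so that normal seasonal variation does not trigger false alarms while genuine disaster signatures do.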
Step 303, in response to the existence of a real-scene image containing a small-scale disaster feature in at least one real-scene image corresponding to the at least one large-scale disaster area to be processed, determining at least one small-scale disaster area to be processed based on the at least one real-scene image.
And 304, monitoring the at least one small-scale disaster area to be processed in real time, and determining disaster information of the target area.
The contents of step 303 and step 304 are the same as those of step 202 and step 203, and are not described in detail here.
With further reference to fig. 4, a flow 400 of still further embodiments of disaster information acquisition methods is illustrated. The process 400 of the disaster information acquisition method includes the following steps:
step 401, in response to the existence of a remote sensing image containing large-scale disaster features in a remote sensing image sequence of a target area, determining at least one large-scale disaster area to be processed from the remote sensing image sequence.
The content of step 401 is the same as that of step 201, and is not described in detail here.
Step 402, in response to the existence of a real-scene image containing a small-scale disaster feature in at least one real-scene image corresponding to the at least one large-scale disaster area to be processed, acquiring geological feature information corresponding to the real-scene image for the real-scene image in the at least one real-scene image.
In some embodiments, the executive body may obtain geological feature information corresponding to the live-action image, the geological feature information including at least one of: spatial position information, formation lithology information and geological boundary information.
In some optional implementations of some embodiments, the obtaining of the geological feature information corresponding to the live-action image may include:
firstly, acquiring geological information corresponding to the live-action image.
The execution subject may acquire geological information of the live-action image. Wherein the geological information comprises at least one of: geological points, geological lines, geological surfaces, and geologic bodies.
And secondly, obtaining the geological feature information based on the geological information and the reference geological information corresponding to the live-action image.
The execution main body can compare the geological information with the reference geological information corresponding to the live-action image, and when the geological information is different from the reference geological information, different geological information is marked to obtain geological feature information.
In some optional implementations of some embodiments, the obtaining of the geological information corresponding to the live-action image may include:
firstly, acquiring image data and point cloud data corresponding to the live-action image.
To obtain actual geological information, the executive subject may first obtain image data and point cloud data corresponding to the live-action image. For example, the drone may acquire image data and point cloud data corresponding to a live-action image, and then the drone transmits the acquired image data and point cloud data to the execution subject.
And secondly, respectively obtaining a three-dimensional model and a digital elevation model of the live-action image through the image data and the point cloud data.
After the image data and the point cloud data are obtained, the execution subject can construct a three-dimensional model from the image data and a digital elevation model from the point cloud data. Here, the image data of the present application may be acquired through a plurality of lenses. For example, the image data may be acquired through a D2M lens. The D2M lens comprises 5 lenses: the middle one is an orthographic lens that shoots perpendicular to the ground surface during operation, and the 4 oblique lenses are distributed in the east, south, west, and north directions and shoot at a 45-degree tilt. The image data is then imported into Context Capture to construct the three-dimensional model.
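As a sketch of the point-cloud side of this step, a digital elevation model can be produced by rasterizing the point cloud onto a regular grid; the gridding rule used here (keeping the highest point per cell) and the coordinates are illustrative assumptions, not the patent's specific method:

```python
import numpy as np

def points_to_dem(points, cell_size):
    """Rasterize a point cloud into a simple digital elevation model by
    keeping the highest z value that falls into each grid cell.

    points -- (N, 3) array of x, y, z coordinates
    """
    xs, ys, zs = points[:, 0], points[:, 1], points[:, 2]
    cols = ((xs - xs.min()) / cell_size).astype(int)
    rows = ((ys - ys.min()) / cell_size).astype(int)
    dem = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    for r, c, z in zip(rows, cols, zs):
        if np.isnan(dem[r, c]) or z > dem[r, c]:
            dem[r, c] = z
    return dem

pts = np.array([[0.0, 0.0, 10.0], [1.5, 0.0, 12.0],
                [0.0, 1.5, 11.0], [1.5, 1.5, 15.0]])
dem = points_to_dem(pts, cell_size=1.0)
print(dem.shape)
```

A production pipeline would first filter vegetation and noise from the cloud and interpolate empty cells, which this sketch omits.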
And thirdly, determining geological information based on the three-dimensional model and the digital elevation model.
After the three-dimensional model and the digital elevation model are obtained, the execution main body can perform structural analysis on the three-dimensional model and the digital elevation model so as to determine geological information.
In some optional implementations of some embodiments, the determining geological information based on the three-dimensional stereo model and the digital elevation model may include:
and step one, fusing the three-dimensional model and the digital elevation model to obtain the geological features corresponding to the live-action image.
The execution subject can perform information fusion on the three-dimensional stereo model and the digital elevation model to obtain the geological features corresponding to the live-action image. Wherein the geological features may include at least one of: geological morphology information, geological shadow information, and geological texture information.
And secondly, determining at least one landform mark information of the live-action image through the geological features.
The geological features can reflect the landform of the area, so the landform mark information is determined based on the geological features. The landform mark information may be used to characterize the topographic features of the live-action image, the topographic features including at least one of the following: cracks, abrupt ridges, and accumulations.
And thirdly, determining geological information based on the at least one landform mark information.
The execution main body can perform region division according to the landform mark information so as to determine geological information.
In some optional implementation manners of some embodiments, the obtaining the geological feature information based on the reference geological information corresponding to the geological information and the live-action image may include:
the method comprises the first step of determining at least one disaster type based on the geological information and reference geological information corresponding to the live-action image.
The execution subject may compare the geological information with reference geological information corresponding to the live-action image to determine a disaster type. The reference geological information is acquired under the condition that no disaster occurs in the area corresponding to the live-action image, and can represent the normal geological form of the area. Wherein the disaster type may include at least one of: collapse, landslide, debris flow.
A second step of determining the geological feature information based on the at least one disaster type.
After the disaster type is determined, geological feature information corresponding to the disaster type can be determined.
And 403, determining at least one small-scale disaster area to be processed based on the geological feature information.
In some embodiments, the execution subject may perform image region division on the live-action image according to the geological feature information, thereby determining at least one small-scale disaster region to be processed.
And 404, monitoring the at least one small-scale disaster area to be processed in real time, and acquiring disaster environment information in real time.
In order to determine what disaster may occur in the small-scale disaster area to be processed, the execution subject may acquire disaster environment information in real time. Wherein the disaster environment information may include at least one of: rainfall, rainfall speed, soil pressure, surface displacement.
Step 405, for a small-scale disaster area to be processed in the at least one small-scale disaster area to be processed, determining disaster information of the small-scale disaster area to be processed based on the disaster environment information.
After determining the disaster environment information, the execution subject may predict possible disaster information according to the disaster environment information.
In some optional implementations of some embodiments, the determining disaster information of the small-scale disaster area to be processed based on the disaster environment information may include:
firstly, determining the target disaster type of the small-scale disaster area to be processed according to the historical disaster information of the small-scale disaster area to be processed.
In practice, the disaster type of a given area is usually relatively fixed; for example, landslides often recur on a particular mountain slope. Accordingly, the execution subject may first acquire historical disaster information of the small-scale disaster area to be processed and determine the target disaster type from it. The target disaster type is a disaster type that frequently occurs in the small-scale disaster area to be processed; there may be one or several target disaster types.
And secondly, determining a disaster state corresponding to the target disaster type according to the disaster environment information.
The execution subject may predict disaster information according to the disaster environment information. To improve accuracy, the execution subject can match the predicted disaster information against the target disaster type: if the matching succeeds, the predicted disaster information is considered highly reliable; if the matching fails, whether to select the target disaster type or the disaster type predicted from the disaster environment information can be decided according to the weight of the target disaster type. For example, when the weight exceeds 60%, the target disaster type may be selected; when the weight is below 60%, the disaster type predicted from the disaster environment information may be selected. The weight of the target disaster type may be determined by, for example, the frequency with which the target disaster type occurs in the historical disaster information.
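The weight-based selection rule described above can be sketched as follows; the 60% threshold comes from the example in the text, while the function and argument names are assumptions:

```python
def choose_disaster_type(predicted_type, target_type, target_weight):
    """Reconcile the environment-based prediction with the historical
    target type: if they match, keep the prediction; otherwise fall back
    on the historical target type only when its weight exceeds 60%.

    target_weight -- e.g. the frequency of the target type in the
                     historical disaster information, in [0, 1]
    """
    if predicted_type == target_type:
        return predicted_type
    return target_type if target_weight > 0.6 else predicted_type

print(choose_disaster_type("landslide", "landslide", 0.9))  # match: keep
print(choose_disaster_type("debris flow", "landslide", 0.8))
print(choose_disaster_type("debris flow", "landslide", 0.4))
```

When several target disaster types exist, the same rule could be applied against the highest-weight type first.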
Furthermore, the execution subject can also predict the disaster degree from the specific values of rainfall, rainfall speed, soil pressure, and surface displacement in the disaster environment information. For example, in a certain area, a slope shows no obvious sign of deformation or damage within 0-20 min (minutes): near the 20 min mark, the corresponding accumulated displacement X is about 0.5 cm (centimeters) and the accumulated rainfall h is about 23.7 mm (millimeters), and the slope is in a basically stable state. Within 20-40 min, the slope deforms and is damaged obviously and the displacement rate increases markedly; near the 40 min mark, the accumulated displacement X is about 5.8 cm and the accumulated rainfall h is about 47.5 mm, and the slope is in an under-stable state. During 40-60 min, the slope passes through a short "stabilization phase" in which the displacement holds at 5.8 cm. After 60 min, the displacement increases suddenly, the displacement rate rises rapidly, overall instability damage begins, the accumulated rainfall h reaches 71.2 mm, and the slope is in an unstable state. Accordingly, the warning levels and warning forms for landslides in this area are shown in Table 1.
(Table 1, presented as an image in the original, lists the warning level and warning form corresponding to each of the slope states described above.)
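The displacement and rainfall values quoted in the example above can be turned into a simple state classifier. The threshold values (0.5 cm / 23.7 mm, 5.8 cm / 47.5 mm, 71.2 mm) come from the text; the exact decision boundaries between the states are an illustrative assumption:

```python
def slope_state(cum_displacement_cm, cum_rainfall_mm):
    """Classify slope stability from cumulative surface displacement and
    cumulative rainfall, following the example's three observed states.
    The boundary conditions are assumptions for illustration.
    """
    if cum_displacement_cm >= 5.8 and cum_rainfall_mm >= 71.2:
        return "unstable"
    if cum_displacement_cm >= 5.8 or cum_rainfall_mm >= 47.5:
        return "under-stable"
    return "basically stable"

print(slope_state(0.5, 23.7))   # the 20 min reading in the example
print(slope_state(5.8, 47.5))   # the 40 min reading
print(slope_state(9.0, 71.2))   # after overall instability begins
```

Each state would then be mapped to a warning level and warning form via a lookup like Table 1.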
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides some embodiments of a disaster information acquisition apparatus, which correspond to those shown in fig. 2, and which may be applied in various electronic devices in particular.
As shown in fig. 5, a disaster information acquisition apparatus 500 of some embodiments includes: a first disaster area determination unit 501, a second disaster area determination unit 502, and a disaster information determination unit 503. The first disaster area determining unit 501 is configured to determine at least one large-scale disaster area to be processed from a remote sensing image sequence of a target area in response to the existence of a remote sensing image containing large-scale disaster features in the remote sensing image sequence; a second disaster area determination unit 502 configured to determine at least one small-scale disaster area to be processed based on at least one live-action image corresponding to the at least one large-scale disaster area in response to the presence of the live-action image containing a small-scale disaster feature in the at least one live-action image; a disaster information determining unit 503 configured to monitor the at least one small-scale disaster area to be processed in real time, and determine disaster information of the target area.
In an optional implementation manner of some embodiments, the first disaster area determination unit 501 may include: an initial to-be-processed image acquisition subunit (not shown in the figure) and a first disaster area determination subunit (not shown in the figure). The initial to-be-processed image acquisition subunit is configured to obtain an initial image to be processed based on the remote sensing image sequence; the first disaster area determination subunit is configured to determine at least one large-scale disaster area to be processed from the initial image to be processed.
In an optional implementation manner of some embodiments, the initial to-be-processed image acquiring subunit may include: a corrected image sequence acquisition module (not shown in the figure), a target corrected image fusion module (not shown in the figure) and an initial to-be-processed image acquisition module (not shown in the figure). The correction image sequence acquisition module is configured to correct each remote sensing image in the remote sensing image sequence through a preset correction operation to obtain a correction image sequence corresponding to the remote sensing image sequence, wherein the preset correction operation comprises at least one of the following operations: radiation correction and geometric correction; a target corrected image fusion module configured to fuse, for a corrected image in the sequence of corrected images, the corrected image into a target corrected image; and the initial image to be processed acquisition module is configured to obtain an initial image to be processed based on a target correction image sequence corresponding to the correction image sequence.
In an optional implementation of some embodiments, the remote sensing image comprises a corresponding low-resolution multispectral image and a high-resolution single-band image; and, the target correction image fusion module may include: a spatial registration sub-module (not shown) and a target correction image acquisition sub-module (not shown). The spatial registration sub-module is configured to spatially register the low-resolution multispectral image and the high-resolution single-waveband image included in the correction image, and the spatial registration is used for matching the positions of the same target in the low-resolution multispectral image and the high-resolution single-waveband image; and the target correction image acquisition sub-module is configured to fuse the low-resolution multispectral image and the high-resolution single-waveband image after spatial registration to obtain a target correction image.
In an optional implementation manner of some embodiments, the initial to-be-processed image acquiring module may include: an overlapping image area determination sub-module (not shown), a global image construction sub-module (not shown), and an initial to-be-processed image acquisition sub-module (not shown). The overlapped image area determining submodule is configured to determine an overlapped image area of adjacent images in the target correction image sequence, so as to obtain at least one overlapped image area corresponding to the target correction image sequence; a global image construction sub-module configured to construct a global image based on the at least one overlapping image region; and the initial to-be-processed image acquisition sub-module is configured to cut out an initial to-be-processed image corresponding to the target area from the global image.
In an optional implementation of some embodiments, the overlapping image region determination sub-module may include: an image area determination module (not shown) and an overlapping image area determination module (not shown). The image area determining module is configured to determine an image area corresponding to the target area in each target correction image in the target correction image sequence; and the overlapped image area determining module is configured to perform geometric registration on two adjacent target correction images in the target correction image sequence and determine the overlapped image areas of the two target correction images.
In an optional implementation of some embodiments, the first disaster area determination subunit may include: a reference large-scale feature acquisition module (not shown in the figure), a current feature acquisition module (not shown in the figure), and a large-scale disaster area to be processed determination module (not shown in the figure). Wherein the reference large-scale feature acquisition module is configured to acquire a reference large-scale feature of the target region, and the reference large-scale feature comprises at least one of the following: color, shape, texture, spatial location; the current characteristic acquisition module is configured to lead the initial image to be processed into a pre-trained remote sensing information model to obtain current characteristics corresponding to the target area; and the large-scale disaster area to be processed determining module is configured to compare the current characteristic with the reference large-scale characteristic and determine at least one large-scale disaster area to be processed corresponding to the target area.
In an optional implementation manner of some embodiments, the second disaster area determination unit 502 may include: a geological feature information acquisition subunit (not shown in the figure) and a small-scale disaster area to be processed determination subunit (not shown in the figure). The geological feature information acquiring subunit is configured to acquire geological feature information corresponding to a live-action image in the at least one live-action image, where the geological feature information includes at least one of the following: spatial position information, stratum lithology information and geological boundary information; a small-scale disaster area to be processed determining subunit configured to determine at least one small-scale disaster area to be processed based on the geological feature information.
In an optional implementation manner of some embodiments, the geological feature information obtaining subunit may include: a geological information acquisition module (not shown) and a geological feature information acquisition module (not shown). The geological information acquisition module is configured to acquire geological information corresponding to the live-action image, and the geological information includes at least one of the following: a geological point, a geological line, a geological surface, a geological body; and the geological feature information acquisition module is configured to obtain the geological feature information based on the reference geological information corresponding to the geological information and the live-action image.
In an optional implementation of some embodiments, the geological information acquisition module may include: an information acquisition sub-module (not shown), a model acquisition sub-module (not shown), and a geological information acquisition sub-module (not shown). The information acquisition sub-module is configured to acquire image data and point cloud data corresponding to the live-action image; the model acquisition sub-module is configured to obtain a three-dimensional model and a digital elevation model of the live-action image from the image data and the point cloud data, respectively; and the geological information acquisition sub-module is configured to determine the geological information based on the three-dimensional model and the digital elevation model.
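As an illustrative sketch only (the embodiment does not specify the gridding algorithm), a digital elevation model could be derived from drone point cloud data by averaging point elevations per raster cell; the function name, the bounds convention, and the cell size are assumptions:

```python
import numpy as np

def rasterize_dem(points, cell_size, bounds):
    """Grid an (N, 3) point cloud of x, y, z into a DEM by averaging
    elevation per cell; empty cells become NaN. A minimal stand-in
    for the surveying-grade DEM generation the embodiment assumes."""
    x0, y0, x1, y1 = bounds
    ncols = int(np.ceil((x1 - x0) / cell_size))
    nrows = int(np.ceil((y1 - y0) / cell_size))
    acc = np.zeros((nrows, ncols))
    cnt = np.zeros((nrows, ncols))
    cols = ((points[:, 0] - x0) / cell_size).astype(int).clip(0, ncols - 1)
    rows = ((points[:, 1] - y0) / cell_size).astype(int).clip(0, nrows - 1)
    np.add.at(acc, (rows, cols), points[:, 2])  # sum elevations per cell
    np.add.at(cnt, (rows, cols), 1)             # count points per cell
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), np.nan)
```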
In an optional implementation of some embodiments, the geological information acquisition sub-module may include: a geological feature acquisition module (not shown), a geomorphic marker information determination module (not shown), and a geological information determination module (not shown). The geological feature acquisition module is configured to obtain a geological feature corresponding to the live-action image by fusing the three-dimensional model and the digital elevation model, wherein the geological feature comprises at least one of the following: geological morphology information, geological shadow information, and geological texture information. The geomorphic marker information determination module is configured to determine at least one piece of geomorphic marker information of the live-action image according to the geological feature, wherein the geomorphic marker information is used for characterizing topographic features of the live-action image, and the topographic features comprise at least one of the following: cracks, abrupt ridges, and accumulations. The geological information determination module is configured to determine the geological information based on the at least one piece of geomorphic marker information.
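Purely as an illustration of how one such geomorphic marker might be derived from the elevation data (the embodiment does not fix an algorithm), a slope map computed from the digital elevation model can flag candidate "abrupt ridge" cells; the 45-degree threshold is an assumption:

```python
import numpy as np

def slope_degrees(dem, cell_size=1.0):
    """Slope map in degrees from a DEM grid via central differences."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

def abrupt_ridge_mask(dem, cell_size=1.0, slope_thresh=45.0):
    """Mark cells whose slope exceeds a threshold as candidate
    'abrupt ridge' geomorphic markers."""
    return slope_degrees(dem, cell_size) > slope_thresh
```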
In an optional implementation of some embodiments, the geological feature information acquisition module may include: a disaster type determination sub-module (not shown) and a geological feature information determination sub-module (not shown). The disaster type determination sub-module is configured to determine at least one disaster type based on the geological information and the reference geological information corresponding to the live-action image, where the disaster type includes at least one of the following: collapse, landslide, and debris flow. The geological feature information determination sub-module is configured to determine the geological feature information based on the at least one disaster type.
In an optional implementation of some embodiments, the disaster information determination unit 503 may include: a disaster environment information real-time acquisition subunit (not shown in the figure) and a disaster information determination subunit (not shown in the figure). The disaster environment information real-time acquisition subunit is configured to acquire disaster environment information in real time, where the disaster environment information includes at least one of the following: rainfall amount, rainfall rate, soil pressure, and surface displacement. The disaster information determination subunit is configured to determine, for a small-scale disaster area to be processed of the at least one small-scale disaster area to be processed, disaster information of the small-scale disaster area to be processed based on the disaster environment information.
In an optional implementation of some embodiments, the disaster information determination subunit may include: a target disaster type determination module (not shown in the figure) and a disaster status determination module (not shown in the figure). The target disaster type determination module is configured to determine a target disaster type of the small-scale disaster area to be processed according to historical disaster information of the small-scale disaster area to be processed; a disaster status determination module configured to determine a disaster status corresponding to the target disaster type according to the disaster environment information.
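The real-time monitoring logic of the two paragraphs above might be sketched as follows; the indicator names, threshold values, and status labels are illustrative assumptions, not part of the disclosure:

```python
def disaster_status(env, thresholds):
    """Classify a small-scale disaster area from live sensor readings.

    `env` and `thresholds` both map an indicator name (e.g.
    'rainfall_rate', 'soil_pressure', 'surface_displacement') to a
    value; indicators missing from `env` are treated as zero.
    Returns a (status, exceeded_indicators) pair."""
    exceeded = sorted(k for k, limit in thresholds.items()
                      if env.get(k, 0.0) > limit)
    if not exceeded:
        return "normal", exceeded
    # One exceeded indicator -> "watch"; several -> "warning".
    return ("warning" if len(exceeded) > 1 else "watch"), exceeded
```

In the embodiment, the thresholds themselves would come from the target disaster type determined from the area's historical disaster information.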
It will be understood that the units described in the apparatus 500 correspond to the various steps in the method described with reference to fig. 2. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 500 and the units included therein, and are not described herein again.
As shown in fig. 6, the electronic device 600 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; storage devices 608 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 609. The communication device 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device 600 having various devices, it is to be understood that not all illustrated devices are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 6 may represent one device or may represent multiple devices as needed.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 609, or installed from the storage device 608, or installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to the existence of a remote sensing image containing large-scale disaster features in a remote sensing image sequence of a target area, determine at least one large-scale disaster area to be processed from the remote sensing image sequence; in response to the existence of a live-action image containing small-scale disaster features in at least one live-action image corresponding to the at least one large-scale disaster area to be processed, determine at least one small-scale disaster area to be processed based on the at least one live-action image; and monitor the at least one small-scale disaster area to be processed in real time and determine disaster information of the target area.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software, and may also be implemented by hardware. The described units may also be provided in a processor, which may be described as: a processor includes a first disaster area determination unit, a second disaster area determination unit, and a disaster information determination unit. Here, the names of these units do not in some cases constitute a limitation on the units themselves; for example, the disaster information determination unit may also be described as a "unit for acquiring disaster information".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description is merely a description of preferred embodiments of the present disclosure and an illustration of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the embodiments of the present disclosure.

Claims (10)

1. A disaster information acquisition method comprises the following steps:
in response to the existence of a remote sensing image containing large-scale disaster features in a remote sensing image sequence of a target area, determining at least one large-scale disaster area to be processed from the remote sensing image sequence, wherein the remote sensing image sequence is obtained through a remote sensing satellite;
in response to the fact that a live-action image containing small-scale disaster features exists in at least one live-action image corresponding to the at least one large-scale disaster area to be processed, determining at least one small-scale disaster area to be processed based on the at least one live-action image, wherein the live-action image is an image corresponding to the large-scale disaster area to be processed and acquired by an unmanned aerial vehicle;
monitoring the at least one small-scale disaster area to be processed in real time, and determining disaster information of the target area, wherein the disaster information is acquired through a measurement tower on the site of the target area;
the determining at least one small-scale disaster area to be processed based on the at least one live-action image comprises:
for a live-action image in the at least one live-action image, obtaining geological feature information corresponding to the live-action image, wherein the geological feature information comprises at least one of the following items: spatial position information, stratum lithology information and geological boundary information;
determining at least one small-scale disaster area to be processed based on the geological feature information;
the acquiring of the geological feature information corresponding to the live-action image includes:
obtaining geological information corresponding to the live-action image, wherein the geological information comprises at least one of the following items: a geological point, a geological line, a geological surface, a geological body;
obtaining the geological feature information based on the geological information and the reference geological information corresponding to the live-action image;
the obtaining of the geological information corresponding to the live-action image includes:
acquiring image data and point cloud data corresponding to the live-action image;
respectively obtaining a three-dimensional model and a digital elevation model of the live-action image through the image data and the point cloud data;
and determining geological information based on the three-dimensional model and the digital elevation model.
2. The method of claim 1, wherein the determining at least one large-scale disaster area to be treated from the sequence of remotely sensed images comprises:
obtaining an initial image to be processed based on the remote sensing image sequence;
determining at least one large-scale disaster area to be processed from the initial image to be processed.
3. The method of claim 2, wherein said deriving an initial to-be-processed image based on the sequence of remotely sensed images comprises:
correcting each remote sensing image in the remote sensing image sequence through a preset correction operation to obtain a corrected image sequence corresponding to the remote sensing image sequence, wherein the preset correction operation comprises at least one of the following operations: radiometric correction and geometric correction;
for the corrected images in the corrected image sequence, fusing the corrected images into a target corrected image;
and obtaining an initial image to be processed based on the target corrected image sequence corresponding to the corrected image sequence.
4. The method of claim 3, wherein the remote sensing images include corresponding low-resolution multispectral images and high-resolution single-band images; and
for the corrected images in the corrected image sequence, fusing the corrected images into a target corrected image, including:
carrying out spatial registration on the low-resolution multispectral image and the high-resolution single-band image included in the corrected image, wherein the spatial registration is used for matching the positions of the same target in the low-resolution multispectral image and the high-resolution single-band image;
and fusing the spatially registered low-resolution multispectral image and high-resolution single-band image to obtain a target corrected image.
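For illustration only, one common fusion rule matching the structure of claim 4 is Brovey-style intensity substitution, which injects the high-resolution single-band detail into each registered multispectral band; the claim itself does not name a specific fusion algorithm, so this is an assumed choice:

```python
import numpy as np

def brovey_fuse(ms, pan, eps=1e-6):
    """Fuse a spatially registered low-resolution multispectral cube
    (bands, H, W, resampled to the pan grid) with a high-resolution
    single-band image (H, W) by per-pixel intensity ratio."""
    intensity = ms.mean(axis=0)        # per-pixel mean across MS bands
    ratio = pan / (intensity + eps)    # high-resolution spatial detail
    return ms * ratio                  # inject detail into each band
```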
5. The method of claim 3, wherein said obtaining an initial image to be processed based on the target corrected image sequence corresponding to the corrected image sequence comprises:
determining an overlapping image area of adjacent images in the target corrected image sequence to obtain at least one overlapping image area corresponding to the target corrected image sequence;
constructing a global image based on the at least one overlapping image area;
and cutting out an initial image to be processed corresponding to the target area from the global image.
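As a non-limiting sketch of the global-image construction in claim 5, overlapping target corrected images can be accumulated into a common frame and averaged where they overlap; the tile offsets are assumed to be already known from geometric registration:

```python
import numpy as np

def mosaic(tiles, offsets, shape):
    """Build a global image from 2-D tiles placed at (row, col)
    offsets in a global (H, W) frame, averaging overlap regions.
    Real mosaicking would blend seams after registration."""
    acc = np.zeros(shape, dtype=float)
    cnt = np.zeros(shape, dtype=float)
    for tile, (r, c) in zip(tiles, offsets):
        h, w = tile.shape
        acc[r:r + h, c:c + w] += tile
        cnt[r:r + h, c:c + w] += 1.0
    return acc / np.maximum(cnt, 1.0)  # uncovered cells stay zero
```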
6. The method of claim 5, wherein the determining an overlapping image area of adjacent images in the target corrected image sequence comprises:
determining an image area corresponding to the target area in each target corrected image in the target corrected image sequence;
and geometrically registering two adjacent target corrected images in the target corrected image sequence, and determining an overlapping image area of the two target corrected images.
7. The method of claim 2, wherein the determining at least one large-scale pending disaster area from the initial pending image comprises:
acquiring a reference large-scale feature of the target area, wherein the reference large-scale feature comprises at least one of the following: color, shape, texture, and spatial location;
importing the initial image to be processed into a pre-trained remote sensing information model to obtain current features corresponding to the target area;
and comparing the current features with the reference large-scale feature, and determining at least one large-scale disaster area to be processed corresponding to the target area.
8. A disaster information acquisition apparatus comprising:
a first disaster area determination unit configured to determine, in response to the existence of a remote sensing image containing large-scale disaster features in a remote sensing image sequence of a target area, at least one large-scale disaster area to be processed from the remote sensing image sequence, wherein the remote sensing image sequence is obtained through a remote sensing satellite;
a second disaster area determination unit configured to determine at least one small-scale disaster area to be processed based on at least one live-action image in response to the existence of a live-action image containing small-scale disaster features in the at least one live-action image corresponding to the at least one large-scale disaster area to be processed, the live-action image being an image corresponding to the large-scale disaster area to be processed acquired by an unmanned aerial vehicle;
a disaster information determination unit configured to monitor the at least one small-scale disaster area to be processed in real time, and determine disaster information of the target area, the disaster information being obtained through a measurement tower on site of the target area;
the second disaster area determination unit includes:
a geological feature information acquisition subunit configured to, for a live-action image of the at least one live-action image, acquire geological feature information corresponding to the live-action image, wherein the geological feature information includes at least one of the following: spatial position information, stratum lithology information, and geological boundary information;
a small-scale disaster area to be processed determining subunit configured to determine at least one small-scale disaster area to be processed based on the geological feature information;
the geological feature information acquisition subunit includes:
a geological information acquisition module configured to acquire geological information corresponding to the live-action image, wherein the geological information includes at least one of the following: a geological point, a geological line, a geological surface, a geological body;
a geological feature information acquisition module configured to obtain the geological feature information based on the geological information and reference geological information corresponding to the live-action image;
the geological information acquisition module comprises:
the information acquisition sub-module is configured to acquire image data and point cloud data corresponding to the live-action image;
a model acquisition sub-module configured to obtain a three-dimensional model and a digital elevation model of the live-action image from the image data and the point cloud data, respectively;
a geological information acquisition sub-module configured to determine the geological information based on the three-dimensional model and the digital elevation model.
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored which, when executed by a processor, carries out the method of any one of claims 1 to 7.
CN202211341353.9A 2022-10-31 2022-10-31 Disaster information acquisition method and device, electronic equipment and computer readable medium Active CN115410095B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211341353.9A CN115410095B (en) 2022-10-31 2022-10-31 Disaster information acquisition method and device, electronic equipment and computer readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211341353.9A CN115410095B (en) 2022-10-31 2022-10-31 Disaster information acquisition method and device, electronic equipment and computer readable medium

Publications (2)

Publication Number Publication Date
CN115410095A CN115410095A (en) 2022-11-29
CN115410095B true CN115410095B (en) 2023-01-31

Family

ID=84167622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211341353.9A Active CN115410095B (en) 2022-10-31 2022-10-31 Disaster information acquisition method and device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN115410095B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117115644A (en) * 2023-08-08 2023-11-24 江苏省地质调查研究院 Disaster analysis method and device based on image data

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113240688A (en) * 2021-06-01 2021-08-10 安徽建筑大学 Integrated flood disaster accurate monitoring and early warning method
CN113705429A (en) * 2021-08-26 2021-11-26 广东电网有限责任公司广州供电局 Landslide geological disaster remote sensing interpretation method, device, equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106683096B (en) * 2017-01-25 2019-09-03 中国科学院寒区旱区环境与工程研究所 Permafrost hazards information extracting method and device based on satellite remote-sensing image
CN111626269B (en) * 2020-07-07 2021-08-27 中国科学院空天信息创新研究院 Practical large-space-range landslide extraction method
CN112598881B (en) * 2020-12-03 2022-03-25 中煤航测遥感集团有限公司 Geological disaster monitoring method and device and computer equipment
CN113012398A (en) * 2021-02-20 2021-06-22 中煤航测遥感集团有限公司 Geological disaster monitoring and early warning method and device, computer equipment and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A method for extracting information on buildings damaged by landslides in mountainous areas; Li Lingzhi et al.; Beijing Surveying and Mapping; 2017-10-25 (No. 05); full text *
Research on new remote sensing interpretation methods for debris flow investigation; Feng Hangjian et al.; The Chinese Journal of Geological Hazard and Control; 2008-09-15 (No. 03); Sections 1 to 4 *

Also Published As

Publication number Publication date
CN115410095A (en) 2022-11-29

Similar Documents

Publication Publication Date Title
Goslee Analyzing remote sensing data in R: the landsat package
Tang et al. Triple linear-array image geometry model of ZiYuan-3 surveying satellite and its validation
CN112017224B (en) SAR data area network adjustment processing method and system
Tan et al. A comparison of radiometric correction techniques in the evaluation of the relationship between LST and NDVI in Landsat imagery
CN113284171B (en) Vegetation height analysis method and system based on satellite remote sensing stereo imaging
CN113920438B (en) Method for checking hidden danger of trees near power transmission line by combining ICESat-2 and Jilin image I
Zheng et al. Spatial variability of terrestrial laser scanning based leaf area index
EP3469516B1 (en) Method and system for improving the resolution of sensor data
US20230064454A1 (en) System and method for generating soil moisture data from satellite imagery using deep learning model
CN108961199A (en) Multi- source Remote Sensing Data data space-time fusion method and device
CN115410095B (en) Disaster information acquisition method and device, electronic equipment and computer readable medium
CN116754076B (en) Inversion method for high-heterogeneity surface temperature of urban complex three-dimensional scene
CN116519913B (en) GNSS-R data soil moisture monitoring method based on fusion of satellite-borne and foundation platform
CN116519557B (en) Aerosol optical thickness inversion method
CN103837138B (en) Precise photogrammetry robot
CN107784277B (en) Mountain fire identification method and system
CN113888416A (en) Processing method of satellite remote sensing image data
CN116468869A (en) Live-action three-dimensional modeling method, equipment and medium based on remote sensing satellite image
CN110428013B (en) Crop remote sensing classification method and system
Honda et al. Real-time volcano activity mapping using ground-based digital imagery
JP2019046149A (en) Crop cultivation support apparatus
CN106780323B (en) Agricultural condition acquisition and real-time updating method and system based on smart phone
CN116452461A (en) Remote sensing data processing method and device, electronic equipment and storage medium
CN111368716A (en) Geological disaster catastrophe farmland extraction method based on multi-source time-space data
CN113936009B (en) Cloud shadow removing method, device and equipment for meteorological satellite flood monitoring

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant