WO2020221284A1 - Unmanned aerial vehicle (UAV) monitoring method and system for basin-wide flood scenarios - Google Patents

Unmanned aerial vehicle (UAV) monitoring method and system for basin-wide flood scenarios

Info

Publication number
WO2020221284A1
WO2020221284A1 (PCT/CN2020/087706; CN2020087706W)
Authority
WO
WIPO (PCT)
Prior art keywords
remote sensing
uav
sensing image
river basin
scene
Prior art date
Application number
PCT/CN2020/087706
Other languages
English (en)
French (fr)
Inventor
张金良
雷添杰
付健
罗秋实
陈翠霞
Original Assignee
黄河勘测规划设计研究院有限公司
中国水利水电科学研究院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 黄河勘测规划设计研究院有限公司 and 中国水利水电科学研究院
Priority to GB2019092.2A (GB2590192B)
Publication of WO2020221284A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/251Fusion techniques of input or preprocessed data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Definitions

  • the invention relates to the technical field of flood monitoring, in particular to a drone monitoring method and system for river basin flood scenarios.
  • the purpose of the present invention is to provide an unmanned aerial vehicle monitoring method and system for a river basin flood scene, so as to realize timely, accurate and low-cost river basin flood disaster scene detection.
  • the present invention provides the following solutions:
  • the present invention provides an unmanned aerial vehicle monitoring method for a river basin flood scene.
  • the monitoring method includes the following steps:
  • multiple remote sensing image sets and multiple point cloud data image sets are acquired;
  • the remote sensing image set acquired by each second UAV system includes a plurality of remote sensing images acquired by that second UAV system at different time points;
  • the point cloud data image set acquired by each second UAV system includes multiple point cloud data images acquired by the second UAV system at different time points;
  • the splicing of the remote sensing images in multiple remote sensing image sets into a composite remote sensing image specifically includes:
  • SURF: Speeded Up Robust Features
  • HSI: Hue-Saturation-Intensity (Lightness)
  • an interpolation method is used to splice the remote sensing images in the plurality of remote sensing image sets into a synthetic remote sensing image.
  • the use of the SURF algorithm and the HSI color model to perform coarse matching on remote sensing images in multiple remote sensing image sets to obtain multiple coarse matching points specifically includes:
  • for each feature point, it is determined whether the feature point lies in the overlapping area of the remote sensing images, and the feature points of the overlapping area are selected as the coarse matching points of the overlapping remote sensing images.
  • the determining the submerged range and submerged depth of each land use type according to the synthetic remote sensing image specifically includes:
  • the digital elevation model of the flood scene in the river basin is compared with the digital elevation model before the flood in the river basin to determine the submerged range and submerged depth of each land use type.
  • the land use type includes one or more of residential land, roads, bridges, and cultivated land.
  • An unmanned aerial vehicle monitoring system for a basin flood scene includes:
  • a first UAV system, multiple second UAV systems, and a ground control and data processing center;
  • the first drone system and the plurality of second drone systems are wirelessly connected to the ground control and data processing center;
  • the first UAV system includes a first UAV, a video camera and a first wireless data transmission module, and the video camera and the first wireless data transmission module are installed on the first UAV;
  • the video camera is wirelessly connected to the ground control and data processing center through the first wireless data transmission module; the video camera is used to shoot a panoramic video of the river basin flood scene and to send the panoramic video to the ground control and data processing center through the first wireless data transmission module;
  • the second UAV system includes a second UAV, a surveying and mapping camera, a lidar, and a second wireless data transmission module;
  • the surveying and mapping camera, the lidar and the second wireless data transmission module are installed on the second drone;
  • the surveying and mapping camera and the lidar are respectively wirelessly connected to the ground control and data processing center through the second wireless data transmission module.
  • the surveying and mapping camera is used to obtain a remote sensing image set of the river basin flood scene and to send the remote sensing image set to the ground control and data processing center through the second wireless data transmission module;
  • the lidar is used to obtain a point cloud data image set and to send the point cloud data image set to the ground control and data processing center through the second wireless data transmission module;
  • the ground control and data processing center is used to obtain, according to the panoramic video, the remote sensing image sets and the point cloud data image sets, the actual scene of the river basin flood, the submerged land use types, and the submerged range and depth of each land use type;
  • the first drone of the first drone system and the plurality of second drones of the second drone systems are respectively wirelessly connected to the ground control and data processing center; the ground control and data processing center is also used to control the flight of the first drone and the plurality of second drones.
  • the first drone flies above the plurality of second drones; the plurality of second drones fly side by side at the same height and at equal intervals.
  • the first drone and the plurality of second drones fly in parallel and synchronously along a figure-eight flight path.
  • the first unmanned aerial vehicle system further includes an infrared video camera installed on the first unmanned aerial vehicle, and the infrared video camera is connected to the ground control and data processing center through the first wireless data transmission module.
  • the second unmanned aerial vehicle system further includes an infrared photographic camera installed on the second unmanned aerial vehicle, and the infrared photographic camera is connected to the ground control and data processing center through the second wireless data transmission module.
  • the present invention discloses the following technical effects:
  • the invention discloses a UAV monitoring method and system for river basin flood scenes.
  • the present invention is based on a first unmanned aerial vehicle system and multiple second unmanned aerial vehicle systems: it acquires panoramic video, remote sensing images and point cloud data images of the river basin flood scene, monitors the flood scene in real time according to the panoramic video, and extracts from the remote sensing images and point cloud data images the submerged land use types as well as the submerged range and depth of each land use type, realizing timely, accurate and low-cost detection of river basin flood disaster scenes.
  • Fig. 1 is a flow chart of a UAV monitoring method for a river basin flood scene provided by the present invention;
  • Fig. 2 is a flow chart, provided by the present invention, of splicing remote sensing images from multiple remote sensing image sets into a composite remote sensing image;
  • Fig. 3 is a structural diagram of a UAV monitoring system for a basin flood scene provided by the present invention.
  • the purpose of the present invention is to provide an unmanned aerial vehicle monitoring method and system for a river basin flood scene, so as to realize timely, accurate and low-cost river basin flood disaster scene detection.
  • a drone monitoring method for a basin flood scene includes the following steps:
  • Step 101: Take a panoramic video of the river basin flood scene through the first UAV system.
  • Step 102: Monitor the actual scene of the basin flood in real time according to the panoramic video.
  • Step 103: Acquire multiple remote sensing image sets and multiple point cloud data image sets through multiple second drone systems; the remote sensing image set acquired by each second drone system includes multiple remote sensing images acquired by that second drone system at different time points; the point cloud data image set acquired by each second UAV system includes multiple point cloud data images acquired by that second UAV system at different time points.
  • Step 104: Stitch the point cloud data images in the plurality of point cloud data image sets into a composite point cloud data image.
  • the splicing method is the same as the method of splicing the remote sensing images from multiple remote sensing image sets into a synthetic remote sensing image.
  • Step 105: Register and visually compare the synthetic point cloud data image with the land cover point cloud remote sensing data in the geospatial data cloud database to determine the submerged land use types.
  • Step 106: Splice the remote sensing images in the plurality of remote sensing image sets into a composite remote sensing image.
  • as shown in Fig. 2, the stitching of the point cloud data images in the multiple point cloud data image sets into a synthetic point cloud data image specifically includes: A. Using the SURF algorithm and the HSI color model, coarsely match the remote sensing images in the multiple remote sensing image sets to obtain multiple coarse matching points; B. Use the random sample consensus algorithm to purify the multiple coarse matching points to obtain multiple purified coarse matching points; C. Use the least squares method to finely match the multiple purified coarse matching points to obtain multiple fine matching points; D. Based on the multiple fine matching points, use an interpolation method to splice the remote sensing images in the multiple remote sensing image sets into a composite remote sensing image.
  • A. Coarsely matching the remote sensing images in the multiple remote sensing image sets using the SURF algorithm and the HSI color model to obtain multiple coarse matching points specifically includes: A1. Construct the scale space of each remote sensing image; A2. Establish and solve the Hessian matrix of the scale space of each remote sensing image to obtain multiple feature points of each remote sensing image; A3. Determine the main direction of each feature point; A4. Construct a descriptor for each feature point; A5. According to the HSI color model, add the color data of each feature point to the descriptor of that feature point to obtain a color descriptor for each feature point; A6. According to the color descriptor of each feature point, determine whether the feature point lies in the overlapping area of the remote sensing images, and select the feature points of the overlapping area as the coarse matching points of the overlapping remote sensing images; A7. Eliminate outliers among the coarse matching points with RANSAC.
  • Step 107: Determine the submerged range and submerged depth of each land use type according to the synthetic remote sensing image; this specifically includes: establishing a digital elevation model of the river basin flood scene based on the synthetic remote sensing image, and comparing the digital elevation model of the river basin flood scene with the pre-flood digital elevation model of the basin to determine the submerged range and submerged depth of each land use type.
  • the present invention also provides an unmanned aerial vehicle monitoring system for a basin flood scene, the monitoring system includes:
  • the first UAV system 1 includes a first UAV, a video camera and a first wireless data transmission module, and the video camera and the first wireless data transmission module are installed on the first UAV.
  • the video camera is wirelessly connected to the ground control and data processing center through the first wireless data transmission module; the video camera is used to shoot a panoramic video of the river basin flood scene and to send the panoramic video to the ground control and data processing center through the first wireless data transmission module.
  • the second UAV system 2 includes a second UAV, a surveying and mapping camera, a lidar and a second wireless data transmission module.
  • the surveying and mapping camera, the lidar and the second wireless data transmission module are installed on the second drone.
  • the surveying and mapping camera and the lidar are respectively wirelessly connected to the ground control and data processing center through the second wireless data transmission module.
  • the surveying and mapping camera is used to obtain a remote sensing image set of the river basin flood scene and to send the remote sensing image set to the ground control and data processing center through the second wireless data transmission module;
  • the lidar is used to obtain a point cloud data image set and to send the point cloud data image set to the ground control and data processing center 3 through the second wireless data transmission module.
  • the ground control and data processing center is used to obtain, according to the panoramic video, the remote sensing image sets and the point cloud data image sets, the actual scene of the river basin flood, the submerged land use types, and the submerged range and depth of each land use type;
  • the first UAV of the first UAV system and the second UAVs of the multiple second UAV systems are respectively wirelessly connected to the ground control and data processing center; the ground control and data processing center is also used to control the flight of the first UAV and the multiple second UAVs; specifically, the first UAV and the multiple second UAVs are controlled so that the first UAV flies above the multiple second UAVs, the multiple second UAVs fly side by side at the same height and at equal intervals, and the first UAV and the multiple second UAVs fly in parallel and synchronously along a figure-eight flight path.
  • to ensure night-time shooting, the first UAV system 1 further includes an infrared video camera, the infrared video camera is installed on the first UAV, and the infrared video camera is connected to the ground control and data processing center 3 through the first wireless data transmission module.
  • the second unmanned aerial vehicle system 2 further includes an infrared photographic camera installed on the second unmanned aerial vehicle, and the infrared photographic camera is connected to the ground control and data processing center 3 through the second wireless data transmission module.
  • the UAV monitoring system of the present invention considers that basin floods have the characteristics of wide coverage and rapid outbreak. Therefore, when using UAV remote sensing to monitor, the endurance and load capacity of the unmanned aerial platform should be fully considered.
  • the present invention uses two different types of UAVs to respectively complete large-scale macro monitoring (first UAV system) and small-scale fine monitoring (multiple second UAV systems), and the two are used in combination and complement each other.
  • the specific arrangement is to use a medium-range UAV with longer endurance time and larger load capacity (the first UAV system) as the main platform for large-scale macro monitoring, and short-range UAVs with shorter endurance and smaller load capacity (the multiple second UAV systems) as auxiliary platforms to achieve fine monitoring of hard-hit areas of the basin.
  • to meet the need for real-time dynamic flood monitoring, the first drone is equipped with a high-definition surveillance video camera to obtain real-time video data of the disaster area; the second drone is equipped with an aerial surveying and mapping camera to obtain real-time image data of the disaster area; to obtain DEM (Digital Elevation Model) data of the basin area, a high-precision, lightweight lidar can be carried.
  • the invention discloses a UAV monitoring method and system for river basin flood scenes.
  • the present invention is based on a first unmanned aerial vehicle system and multiple second unmanned aerial vehicle systems: it acquires panoramic video, remote sensing images and point cloud data images of the river basin flood scene, monitors the flood scene in real time according to the panoramic video, and extracts from the remote sensing images and point cloud data images the submerged land use types as well as the submerged range and depth of each land use type, realizing timely, accurate and low-cost detection of river basin flood disaster scenes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Signal Processing (AREA)
  • Astronomy & Astrophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention discloses a UAV monitoring method and system for basin-wide flood scenarios. Based on a first UAV system and multiple second UAV systems, the invention acquires panoramic video, remote sensing images and point cloud data images of the basin-wide flood scene, monitors the flood scene in real time according to the panoramic video, and extracts from the remote sensing images and point cloud data images the submerged land use types as well as the submerged range and depth of each land use type, thereby realizing timely, accurate and low-cost detection of basin-wide flood disaster scenes.

Description

Unmanned aerial vehicle (UAV) monitoring method and system for basin-wide flood scenarios
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on April 29, 2019, with application number 201910355047.2 and entitled "UAV monitoring method and system for basin-wide flood scenarios", the entire content of which is incorporated herein by reference.
Technical Field
The present invention relates to the technical field of flood monitoring, and in particular to a UAV monitoring method and system for basin-wide flood scenarios.
Background Art
China is a country where flood disasters occur frequently, and basin-wide flood disasters are especially frequent, seriously endangering the lives and property of the people. For example, in 1991 the Jianghuai region suffered an extraordinarily severe basin-wide flood disaster, which affected 230 million people, left 19.3 million people stranded by floodwater, inundated 4.25 million hectares of farmland, destroyed 6.05 million houses, reduced grain output by more than 20 billion kilograms and caused direct economic losses of more than 50 billion yuan (China National Committee for the International Decade for Natural Disaster Reduction, 1991). Therefore, in-depth research on basin-wide flood disaster scene detection is not only an important part of disaster prevention and mitigation research, but is also of great significance for basin flood control planning, rational use of river floodplain land, and sustained regional economic development.
Traditional disaster investigation and assessment relies mainly on ground surveys, and the resulting disaster data are inflated by human factors and of limited reliability. Although satellite remote sensing is considerably more objective for disaster investigation and assessment, it is strongly constrained in timeliness by satellite revisit periods and cloud cover. Manned aircraft, with their strong autonomy and flexibility, remain an important remote sensing platform and play a hard-to-replace role in natural disaster assessment and investigation, but their high operating cost limits their widespread use. How to achieve timely, accurate and low-cost detection of basin-wide flood disaster scenes has therefore become an urgent technical problem.
Summary of the Invention
The purpose of the present invention is to provide a UAV monitoring method and system for basin-wide flood scenarios, so as to realize timely, accurate and low-cost detection of basin-wide flood disaster scenes.
To achieve this purpose, the present invention provides the following solutions:
The present invention provides a UAV monitoring method for basin-wide flood scenarios, the monitoring method comprising the following steps:
shooting a panoramic video of the basin-wide flood scene through a first UAV system;
monitoring the actual scene of the basin-wide flood in real time according to the panoramic video;
acquiring multiple remote sensing image sets and multiple point cloud data image sets through multiple second UAV systems, wherein the remote sensing image set acquired by each second UAV system includes multiple remote sensing images acquired by that second UAV system at different time points, and the point cloud data image set acquired by each second UAV system includes multiple point cloud data images acquired by that second UAV system at different time points;
stitching the point cloud data images in the multiple point cloud data image sets into a synthetic point cloud data image;
registering and visually comparing the synthetic point cloud data image with the land cover point cloud remote sensing data in a geospatial data cloud database to determine the submerged land use types;
stitching the remote sensing images in the multiple remote sensing image sets into a synthetic remote sensing image;
determining the submerged range and submerged depth of each land use type according to the synthetic remote sensing image.
Optionally, stitching the remote sensing images in the multiple remote sensing image sets into a synthetic remote sensing image specifically includes:
coarsely matching the remote sensing images in the multiple remote sensing image sets using the SURF (Speeded Up Robust Features) algorithm and the HSI (Hue-Saturation-Intensity (Lightness)) color model to obtain multiple coarse matching points;
purifying the multiple coarse matching points using the random sample consensus algorithm to obtain multiple purified coarse matching points;
finely matching the multiple purified coarse matching points using the least squares method to obtain multiple fine matching points;
based on the multiple fine matching points, stitching the remote sensing images in the multiple remote sensing image sets into a synthetic remote sensing image using an interpolation method.
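The following is a minimal, illustrative sketch of the three optional steps above (RANSAC purification, least-squares fine matching, interpolation-based stitching), and is not part of the original disclosure. It assumes Python with OpenCV and that coarse matching points for an overlapping image pair are already available (for example from SURF with HSI colour data, sketched after the coarse-matching steps below); the projective transform model and the simple overlay compositing are assumptions, since the text only states that RANSAC, the least squares method and an interpolation method are used.

```python
# Illustrative sketch only: purify, refine and splice one overlapping image pair.
import cv2
import numpy as np

def stitch_pair(img_a, img_b, pts_a, pts_b):
    # Purify the coarse matches with the random sample consensus (RANSAC) algorithm.
    H, inlier_mask = cv2.findHomography(pts_b, pts_a, cv2.RANSAC,
                                        ransacReprojThreshold=3.0)
    inliers_a = pts_a[inlier_mask.ravel() == 1]
    inliers_b = pts_b[inlier_mask.ravel() == 1]
    # Finely match the purified points: re-estimate the transform by ordinary
    # least squares (method=0 in OpenCV) on the inliers only.
    H_refined, _ = cv2.findHomography(inliers_b, inliers_a, 0)
    # Splice by warping img_b into img_a's frame with bilinear interpolation
    # and overlaying img_a where the two images overlap.
    h_a, w_a = img_a.shape[:2]
    h_b, w_b = img_b.shape[:2]
    mosaic = cv2.warpPerspective(img_b, H_refined, (w_a + w_b, max(h_a, h_b)),
                                 flags=cv2.INTER_LINEAR)
    mosaic[:h_a, :w_a] = img_a
    return mosaic
```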
Optionally, coarsely matching the remote sensing images in the multiple remote sensing image sets using the SURF algorithm and the HSI color model to obtain multiple coarse matching points specifically includes:
constructing the scale space of each remote sensing image;
establishing and solving the Hessian matrix of the scale space of each remote sensing image to obtain multiple feature points of each remote sensing image;
constructing a descriptor for each feature point;
according to the HSI color model, adding the color data of each feature point to the descriptor of that feature point to obtain a color descriptor for each feature point;
according to the color descriptor of each feature point, determining whether the feature point lies in the overlapping area of the remote sensing images, and selecting the feature points of the overlapping area as the coarse matching points of the overlapping remote sensing images.
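A minimal, illustrative sketch of this coarse-matching stage is given below (not part of the original disclosure). It assumes Python with OpenCV's contrib module (cv2.xfeatures2d) for SURF, which internally handles the scale space, the Hessian-based feature detection, the main direction and the base descriptor; the explicit HSI conversion and the nearest-neighbour ratio test used to pair the colour descriptors are assumptions, not steps stated in the text.

```python
# Illustrative sketch only: SURF feature points augmented with HSI colour data,
# then coarse matching between two overlapping remote sensing images.
import cv2
import numpy as np

def rgb_to_hsi(img_bgr):
    """Convert a BGR uint8 image to H, S, I channels scaled to [0, 1]."""
    bgr = img_bgr.astype(np.float64) / 255.0
    b, g, r = bgr[..., 0], bgr[..., 1], bgr[..., 2]
    i = (r + g + b) / 3.0
    s = 1.0 - np.minimum(np.minimum(r, g), b) / np.maximum(i, 1e-6)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-6
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b <= g, theta, 2.0 * np.pi - theta) / (2.0 * np.pi)
    return h, s, i

def surf_hsi_features(img_bgr, hessian_threshold=400):
    """SURF keypoints and descriptors with an HSI colour triplet appended."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    keypoints, descriptors = surf.detectAndCompute(gray, None)
    h, s, i = rgb_to_hsi(img_bgr)
    colour = np.float32([[h[int(kp.pt[1]), int(kp.pt[0])],
                          s[int(kp.pt[1]), int(kp.pt[0])],
                          i[int(kp.pt[1]), int(kp.pt[0])]] for kp in keypoints])
    return keypoints, np.hstack([descriptors, colour])

def coarse_match(img_a, img_b, ratio=0.7):
    """Pair colour descriptors across two images and keep the distinctive matches."""
    kp_a, desc_a = surf_hsi_features(img_a)
    kp_b, desc_b = surf_hsi_features(img_b)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(desc_a, desc_b, k=2)
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in good])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in good])
    return pts_a, pts_b
```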
Optionally, determining the submerged range and submerged depth of each land use type according to the synthetic remote sensing image specifically includes:
establishing a digital elevation model of the basin flood scene according to the synthetic remote sensing image;
comparing the digital elevation model of the basin flood scene with the pre-flood digital elevation model of the basin to determine the submerged range and submerged depth of each land use type.
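A minimal, illustrative sketch of the DEM comparison follows (not part of the original disclosure), assuming the pre-flood and flood-scene digital elevation models are already available as co-registered rasters on the same grid; the rasterio dependency and the file names are assumptions made for the example only.

```python
# Illustrative sketch only: subtract the pre-flood DEM from the flood-scene DEM
# to obtain the submerged range (cells where the surface rose) and submerged depth.
import numpy as np
import rasterio  # assumed dependency for reading GeoTIFF DEMs

with rasterio.open("dem_flood_scene.tif") as post, rasterio.open("dem_pre_flood.tif") as pre:
    z_post = post.read(1).astype(np.float64)
    z_pre = pre.read(1).astype(np.float64)
    cell_area = abs(post.transform.a * post.transform.e)  # m^2 per raster cell

depth = z_post - z_pre      # flood-scene surface height minus pre-flood ground height
submerged = depth > 0.0     # submerged range: cells where the surface rose
print("submerged area (m^2):", float(submerged.sum()) * cell_area)
print("mean submerged depth (m):",
      float(depth[submerged].mean()) if submerged.any() else 0.0)
```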
Optionally, the land use types include one or more of residential land, roads, bridges and cultivated land.
A UAV monitoring system for basin-wide flood scenarios, the monitoring system comprising:
a first UAV system, multiple second UAV systems and a ground control and data processing center;
the first UAV system and the multiple second UAV systems are wirelessly connected to the ground control and data processing center;
the first UAV system includes a first UAV, a video camera and a first wireless data transmission module, and the video camera and the first wireless data transmission module are installed on the first UAV;
the video camera is wirelessly connected to the ground control and data processing center through the first wireless data transmission module; the video camera is used to shoot a panoramic video of the basin-wide flood scene and to send the panoramic video to the ground control and data processing center through the first wireless data transmission module;
the second UAV system includes a second UAV, a surveying and mapping camera, a lidar and a second wireless data transmission module;
the surveying and mapping camera, the lidar and the second wireless data transmission module are installed on the second UAV;
the surveying and mapping camera and the lidar are each wirelessly connected to the ground control and data processing center through the second wireless data transmission module; the surveying and mapping camera is used to obtain a remote sensing image set of the basin-wide flood scene and to send the remote sensing image set to the ground control and data processing center through the second wireless data transmission module; the lidar is used to obtain a point cloud data image set and to send the point cloud data image set to the ground control and data processing center through the second wireless data transmission module;
the ground control and data processing center is used to obtain, according to the panoramic video, the remote sensing image sets and the point cloud data image sets, the actual scene of the basin-wide flood, the submerged land use types, and the submerged range and depth of each land use type;
the first UAV of the first UAV system and the second UAVs of the multiple second UAV systems are respectively wirelessly connected to the ground control and data processing center; the ground control and data processing center is also used to control the flight of the first UAV and the multiple second UAVs.
Optionally, the first UAV flies above the multiple second UAVs; the multiple second UAVs fly side by side at the same height and at equal intervals.
Optionally, the first UAV and the multiple second UAVs fly in parallel and synchronously along a figure-eight flight path.
Optionally, the first UAV system further includes an infrared video camera, the infrared video camera is installed on the first UAV, and the infrared video camera is connected to the ground control and data processing center through the first wireless data transmission module.
Optionally, the second UAV system further includes an infrared photographic camera, the infrared photographic camera is installed on the second UAV, and the infrared photographic camera is connected to the ground control and data processing center through the second wireless data transmission module.
According to the specific embodiments provided by the present invention, the present invention discloses the following technical effects:
The present invention discloses a UAV monitoring method and system for basin-wide flood scenarios. Based on a first UAV system and multiple second UAV systems, the invention acquires panoramic video, remote sensing images and point cloud data images of the basin-wide flood scene, monitors the flood scene in real time according to the panoramic video, and extracts from the remote sensing images and point cloud data images the submerged land use types as well as the submerged range and depth of each land use type, thereby realizing timely, accurate and low-cost detection of basin-wide flood disaster scenes.
Brief Description of the Drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flow chart of a UAV monitoring method for basin-wide flood scenarios provided by the present invention;
Fig. 2 is a flow chart, provided by the present invention, of stitching the remote sensing images in multiple remote sensing image sets into a synthetic remote sensing image;
Fig. 3 is a structural diagram of a UAV monitoring system for basin-wide flood scenarios provided by the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
The purpose of the present invention is to provide a UAV monitoring method and system for basin-wide flood scenarios, so as to realize timely, accurate and low-cost detection of basin-wide flood disaster scenes.
In order to make the above purpose, features and advantages of the present invention more obvious and understandable, the present invention is described in further detail below with reference to the drawings and specific embodiments.
As shown in Fig. 1, a UAV monitoring method for basin-wide flood scenarios includes the following steps:
Step 101: shoot a panoramic video of the basin-wide flood scene through the first UAV system.
Step 102: monitor the actual scene of the basin-wide flood in real time according to the panoramic video.
Step 103: acquire multiple remote sensing image sets and multiple point cloud data image sets through multiple second UAV systems; the remote sensing image set acquired by each second UAV system includes multiple remote sensing images acquired by that second UAV system at different time points; the point cloud data image set acquired by each second UAV system includes multiple point cloud data images acquired by that second UAV system at different time points.
Step 104: stitch the point cloud data images in the multiple point cloud data image sets into a synthetic point cloud data image. The stitching method is the same as the method of stitching the remote sensing images in multiple remote sensing image sets into a synthetic remote sensing image.
Step 105: register and visually compare the synthetic point cloud data image with the land cover point cloud remote sensing data in the geospatial data cloud database to determine the submerged land use types.
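A minimal, illustrative sketch of this comparison is given below (not part of the original disclosure). It assumes the synthetic point cloud has already been gridded into a flood-scene surface raster and registered to a land cover raster from the geospatial data cloud on the same grid, and that the land cover codes map to land use types; the code table and the rise threshold are assumptions made for the example only.

```python
# Illustrative sketch only: report which land use types fall inside cells whose
# surface rose relative to the reference, i.e. are likely submerged.
import numpy as np

LAND_USE = {1: "residential land", 2: "road", 3: "bridge", 4: "cultivated land"}  # assumed codes

def submerged_land_use(flood_surface, reference_surface, land_cover, rise_threshold=0.3):
    """Return the land use types present where the surface rose by more than the threshold (m)."""
    rise = flood_surface - reference_surface
    submerged_cells = rise > rise_threshold
    codes = np.unique(land_cover[submerged_cells])
    return [LAND_USE[int(c)] for c in codes if int(c) in LAND_USE]
```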
Step 106: stitch the remote sensing images in the multiple remote sensing image sets into a synthetic remote sensing image. As shown in Fig. 2, stitching the point cloud data images in the multiple point cloud data image sets into a synthetic point cloud data image specifically includes: A. coarsely matching the remote sensing images in the multiple remote sensing image sets using the SURF algorithm and the HSI color model to obtain multiple coarse matching points; B. purifying the multiple coarse matching points using the random sample consensus algorithm to obtain multiple purified coarse matching points; C. finely matching the multiple purified coarse matching points using the least squares method to obtain multiple fine matching points; D. based on the multiple fine matching points, stitching the remote sensing images in the multiple remote sensing image sets into a synthetic remote sensing image using an interpolation method.
In step A, coarsely matching the remote sensing images in the multiple remote sensing image sets using the SURF algorithm and the HSI color model to obtain multiple coarse matching points specifically includes: A1. constructing the scale space of each remote sensing image; A2. establishing and solving the Hessian matrix of the scale space of each remote sensing image to obtain multiple feature points of each remote sensing image; A3. determining the main direction of each feature point; A4. constructing a descriptor for each feature point; A5. according to the HSI color model, adding the color data of each feature point to the descriptor of that feature point to obtain a color descriptor for each feature point; A6. according to the color descriptor of each feature point, determining whether the feature point lies in the overlapping area of the remote sensing images, and selecting the feature points of the overlapping area as the coarse matching points of the overlapping remote sensing images; A7. eliminating outliers among the coarse matching points with RANSAC.
Step 107: determine the submerged range and submerged depth of each land use type according to the synthetic remote sensing image; this specifically includes: establishing a digital elevation model of the basin flood scene according to the synthetic remote sensing image, and comparing the digital elevation model of the basin flood scene with the pre-flood digital elevation model of the basin to determine the submerged range and submerged depth of each land use type.
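The per-type results of step 107 could be tabulated as in the sketch below (not part of the original disclosure), assuming a submerged-depth raster such as the one obtained in the DEM comparison sketch earlier and a land cover raster on the same grid; the dictionary layout and code table are assumptions made for the example only.

```python
# Illustrative sketch only: submerged area and depth statistics per land use type.
import numpy as np

def submersion_by_type(depth, land_cover, land_use_codes, cell_area):
    """Per land use type: submerged area (m^2), mean and maximum submerged depth (m)."""
    stats = {}
    for code, name in land_use_codes.items():
        mask = (land_cover == code) & (depth > 0.0)
        if mask.any():
            stats[name] = {
                "area_m2": float(mask.sum()) * cell_area,
                "mean_depth_m": float(depth[mask].mean()),
                "max_depth_m": float(depth[mask].max()),
            }
    return stats
```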
As shown in Fig. 3, the present invention also provides a UAV monitoring system for basin-wide flood scenarios, the monitoring system comprising:
a first UAV system 1, multiple second UAV systems 2 and a ground control and data processing center 3, wherein the first UAV system 1 and the multiple second UAV systems 2 are each wirelessly connected to the ground control and data processing center 3.
The first UAV system 1 includes a first UAV, a video camera and a first wireless data transmission module, and the video camera and the first wireless data transmission module are installed on the first UAV.
The video camera is wirelessly connected to the ground control and data processing center through the first wireless data transmission module; the video camera is used to shoot a panoramic video of the basin-wide flood scene and to send the panoramic video to the ground control and data processing center through the first wireless data transmission module.
The second UAV system 2 includes a second UAV, a surveying and mapping camera, a lidar and a second wireless data transmission module.
The surveying and mapping camera, the lidar and the second wireless data transmission module are installed on the second UAV.
The surveying and mapping camera and the lidar are each wirelessly connected to the ground control and data processing center through the second wireless data transmission module; the surveying and mapping camera is used to obtain a remote sensing image set of the basin-wide flood scene and to send the remote sensing image set to the ground control and data processing center through the second wireless data transmission module; the lidar is used to obtain a point cloud data image set and to send the point cloud data image set to the ground control and data processing center 3 through the second wireless data transmission module.
The ground control and data processing center is used to obtain, according to the panoramic video, the remote sensing image sets and the point cloud data image sets, the actual scene of the basin-wide flood, the submerged land use types, and the submerged range and depth of each land use type.
The first UAV of the first UAV system and the second UAVs of the multiple second UAV systems are respectively wirelessly connected to the ground control and data processing center; the ground control and data processing center is also used to control the flight of the first UAV and the multiple second UAVs; specifically, the first UAV and the multiple second UAVs are controlled so that the first UAV flies above the multiple second UAVs, the multiple second UAVs fly side by side at the same height and at equal intervals, and the first UAV and the multiple second UAVs fly in parallel and synchronously along a figure-eight flight path.
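A minimal, illustrative sketch of how such a coordinated figure-eight plan could be generated is given below (not part of the original disclosure), assuming waypoints expressed in a local east/north/altitude frame; the lemniscate parametrisation, the direction of the lateral offsets and all numeric values are assumptions, since the text only specifies the figure-eight track, the equal spacing and the relative altitudes.

```python
# Illustrative sketch only: waypoints for one first UAV flying above several
# second UAVs that fly side by side, equally spaced, along a figure-eight track.
import numpy as np

def figure_eight(num_points=200, half_width=1000.0):
    """Figure-eight (lemniscate of Gerono) waypoints (east, north) in metres."""
    t = np.linspace(0.0, 2.0 * np.pi, num_points)
    return np.column_stack([half_width * np.sin(t),
                            half_width * np.sin(t) * np.cos(t)])

def formation_waypoints(num_second_uavs=3, spacing=200.0,
                        first_altitude=500.0, second_altitude=300.0):
    """Synchronous plans: the same track index applies to every UAV at every time step."""
    track = figure_eight()
    plans = {"uav_1": np.column_stack([track, np.full(len(track), first_altitude)])}
    offsets = (np.arange(num_second_uavs) - (num_second_uavs - 1) / 2.0) * spacing
    for k, offset in enumerate(offsets, start=1):
        shifted = track + np.array([0.0, offset])  # equal lateral spacing, assumed northward
        plans[f"uav_2_{k}"] = np.column_stack([shifted, np.full(len(track), second_altitude)])
    return plans
```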
To ensure night-time shooting, the first UAV system 1 further includes an infrared video camera, which is installed on the first UAV and connected to the ground control and data processing center 3 through the first wireless data transmission module. The second UAV system 2 further includes an infrared photographic camera, which is installed on the second UAV and connected to the ground control and data processing center 3 through the second wireless data transmission module.
It can be seen that the UAV monitoring system of the present invention takes into account that basin-wide floods cover a wide area and break out rapidly, so the endurance time and load capacity of the unmanned aerial platform should be fully considered when UAV remote sensing is used for monitoring. The present invention uses two different types of UAVs to carry out large-scale macro monitoring (the first UAV system) and small-scale fine monitoring (the multiple second UAV systems) respectively, with the two used together and complementing each other. The specific arrangement is to use a medium-range UAV with longer endurance and larger load capacity (the first UAV system) as the main platform for large-scale macro monitoring, and short-range UAVs with shorter endurance and smaller load capacity (the multiple second UAV systems) as auxiliary platforms to achieve fine monitoring of hard-hit areas of the basin. To meet the need for real-time dynamic flood monitoring, the first UAV carries a high-definition surveillance video camera to obtain real-time video data of the disaster area; the second UAV carries an aerial surveying and mapping camera to obtain real-time image data of the disaster area; to obtain DEM (Digital Elevation Model) data of the basin area, a high-precision, lightweight lidar can be carried. To ensure that the disaster area can also be monitored at night, corresponding infrared photographic cameras and infrared video cameras are also required. The parameters of each device of the UAV monitoring system provided by the present invention are listed in Table 1.
Table 1: Parameters of each device of the UAV monitoring system for basin-wide flood scenarios
Figure PCTCN2020087706-appb-000001
The present invention discloses a UAV monitoring method and system for basin-wide flood scenarios. Based on a first UAV system and multiple second UAV systems, the invention acquires panoramic video, remote sensing images and point cloud data images of the basin-wide flood scene, monitors the flood scene in real time according to the panoramic video, and extracts from the remote sensing images and point cloud data images the submerged land use types as well as the submerged range and depth of each land use type, thereby realizing timely, accurate and low-cost detection of basin-wide flood disaster scenes.
The principles and embodiments of the present invention have been explained herein using specific examples. The description of the above embodiments is only intended to help understand the method of the present invention and its core idea. At the same time, those of ordinary skill in the art may make changes to the specific embodiments and the scope of application according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.
The above embodiments are provided only for the purpose of describing the present invention and are not intended to limit its scope. The scope of the present invention is defined by the appended claims. Various equivalent substitutions and modifications made without departing from the spirit and principles of the present invention shall all fall within its scope.

Claims (10)

  1. A UAV monitoring method for basin-wide flood scenarios, characterized in that the monitoring method comprises the following steps:
    shooting a panoramic video of the basin-wide flood scene through a first UAV system;
    monitoring the actual scene of the basin-wide flood in real time according to the panoramic video;
    acquiring multiple remote sensing image sets and multiple point cloud data image sets through multiple second UAV systems, wherein the remote sensing image set acquired by each second UAV system includes multiple remote sensing images acquired by that second UAV system at different time points, and the point cloud data image set acquired by each second UAV system includes multiple point cloud data images acquired by that second UAV system at different time points;
    stitching the point cloud data images in the multiple point cloud data image sets into a synthetic point cloud data image;
    registering and visually comparing the synthetic point cloud data image with the land cover point cloud remote sensing data in a geospatial data cloud database to determine the submerged land use types;
    stitching the remote sensing images in the multiple remote sensing image sets into a synthetic remote sensing image;
    determining the submerged range and submerged depth of each land use type according to the synthetic remote sensing image.
  2. The UAV monitoring method for basin-wide flood scenarios according to claim 1, characterized in that stitching the remote sensing images in the multiple remote sensing image sets into a synthetic remote sensing image specifically comprises:
    coarsely matching the remote sensing images in the multiple remote sensing image sets using the SURF algorithm and the HSI color model to obtain multiple coarse matching points;
    purifying the multiple coarse matching points using the random sample consensus algorithm to obtain multiple purified coarse matching points;
    finely matching the multiple purified coarse matching points using the least squares method to obtain multiple fine matching points;
    based on the multiple fine matching points, stitching the remote sensing images in the multiple remote sensing image sets into a synthetic remote sensing image using an interpolation method.
  3. The UAV monitoring method for basin-wide flood scenarios according to claim 2, characterized in that coarsely matching the remote sensing images in the multiple remote sensing image sets using the SURF algorithm and the HSI color model to obtain multiple coarse matching points specifically comprises:
    constructing the scale space of each remote sensing image;
    establishing and solving the Hessian matrix of the scale space of each remote sensing image to obtain multiple feature points of each remote sensing image;
    constructing a descriptor for each feature point;
    according to the HSI color model, adding the color data of each feature point to the descriptor of that feature point to obtain a color descriptor for each feature point;
    according to the color descriptor of each feature point, determining whether the feature point lies in the overlapping area of the remote sensing images, and selecting the feature points of the overlapping area as the coarse matching points of the mutually overlapping remote sensing images.
  4. The UAV monitoring method for basin-wide flood scenarios according to claim 1, characterized in that determining the submerged range and submerged depth of each land use type according to the synthetic remote sensing image specifically comprises:
    establishing a digital elevation model of the basin flood scene according to the synthetic remote sensing image;
    comparing the digital elevation model of the basin flood scene with the pre-flood digital elevation model of the basin to determine the submerged range and submerged depth of each land use type.
  5. The UAV monitoring method for basin-wide flood scenarios according to claim 1, characterized in that the land use types include one or more of residential land, roads, bridges and cultivated land.
  6. A UAV monitoring system for basin-wide flood scenarios, characterized in that the monitoring system comprises:
    a first UAV system, multiple second UAV systems and a ground control and data processing center;
    the first UAV system and the multiple second UAV systems are each wirelessly connected to the ground control and data processing center;
    the first UAV system includes a first UAV, a video camera and a first wireless data transmission module, and the video camera and the first wireless data transmission module are installed on the first UAV;
    the video camera is wirelessly connected to the ground control and data processing center through the first wireless data transmission module; the video camera is used to shoot a panoramic video of the basin-wide flood scene and to send the panoramic video to the ground control and data processing center through the first wireless data transmission module;
    the second UAV system includes a second UAV, a surveying and mapping camera, a lidar and a second wireless data transmission module;
    the surveying and mapping camera, the lidar and the second wireless data transmission module are installed on the second UAV;
    the surveying and mapping camera and the lidar are each wirelessly connected to the ground control and data processing center through the second wireless data transmission module; the surveying and mapping camera is used to obtain a remote sensing image set of the basin-wide flood scene and to send the remote sensing image set to the ground control and data processing center through the second wireless data transmission module; the lidar is used to obtain a point cloud data image set and to send the point cloud data image set to the ground control and data processing center through the second wireless data transmission module;
    the ground control and data processing center is used to obtain, according to the panoramic video, the remote sensing image sets and the point cloud data image sets, the actual scene of the basin-wide flood, the submerged land use types, and the submerged range and depth of each land use type;
    the first UAV of the first UAV system and the second UAVs of the multiple second UAV systems are respectively wirelessly connected to the ground control and data processing center; the ground control and data processing center is also used to control the flight of the first UAV and the multiple second UAVs.
  7. The UAV monitoring system for basin-wide flood scenarios according to claim 6, characterized in that the first UAV flies above the multiple second UAVs; the multiple second UAVs fly side by side at the same height and at equal intervals.
  8. The UAV monitoring system for basin-wide flood scenarios according to claim 7, characterized in that the first UAV and the multiple second UAVs fly in parallel and synchronously along a figure-eight flight path.
  9. The UAV monitoring system for basin-wide flood scenarios according to claim 6, characterized in that the first UAV system further includes an infrared video camera, the infrared video camera is installed on the first UAV, and the infrared video camera is connected to the ground control and data processing center through the first wireless data transmission module.
  10. The UAV monitoring system for basin-wide flood scenarios according to claim 6, characterized in that the second UAV system further includes an infrared photographic camera, the infrared photographic camera is installed on the second UAV, and the infrared photographic camera is connected to the ground control and data processing center through the second wireless data transmission module.
PCT/CN2020/087706 2019-04-29 2020-04-29 Unmanned aerial vehicle (UAV) monitoring method and system for basin-wide flood scenarios WO2020221284A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2019092.2A GB2590192B (en) 2019-04-29 2020-04-29 Unmanned aerial vehicle (UAV)-based monitoring method and system for basin-wide flood

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910355047.2 2019-04-29
CN201910355047.2A CN110110641B (zh) 2019-04-29 2019-04-29 一种流域性洪涝场景的无人机监测方法及系统

Publications (1)

Publication Number Publication Date
WO2020221284A1 true WO2020221284A1 (zh) 2020-11-05

Family

ID=67487370

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/087706 WO2020221284A1 (zh) 2019-04-29 2020-04-29 一种流域性洪涝场景的无人机监测方法及系统

Country Status (3)

Country Link
CN (1) CN110110641B (zh)
GB (1) GB2590192B (zh)
WO (1) WO2020221284A1 (zh)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112348381A (zh) * 2020-11-12 2021-02-09 北京优云智翔航空科技有限公司 一种无人机设备调度数据的处理方法、装置以及服务器
CN113012398A (zh) * 2021-02-20 2021-06-22 中煤航测遥感集团有限公司 地质灾害监测预警方法、装置、计算机设备及存储介质
CN113378700A (zh) * 2021-06-08 2021-09-10 四川农业大学 基于无人机航拍影像的草原鼠害动态监测方法
CN113591714A (zh) * 2021-07-30 2021-11-02 金陵科技学院 一种基于卫星遥感图像洪涝检测方法
CN113612966A (zh) * 2021-07-02 2021-11-05 河南浩宇空间数据科技有限责任公司 一种基于城市低空遥感数据的特定目标物智能判别方法
CN113933871A (zh) * 2021-10-15 2022-01-14 贵州师范学院 基于无人机和北斗定位的洪水灾情检测系统
CN115793093A (zh) * 2023-02-02 2023-03-14 水利部交通运输部国家能源局南京水利科学研究院 堤坝隐伏病险诊断空地一体化装备
CN117036622A (zh) * 2023-10-08 2023-11-10 海纳云物联科技有限公司 融合航拍图像和地面扫描的三维重建方法、装置和设备

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110110641B (zh) * 2019-04-29 2020-11-27 中国水利水电科学研究院 一种流域性洪涝场景的无人机监测方法及系统
CN111753703B (zh) * 2020-06-18 2024-04-30 杭州浙大东南土地研究所有限公司 一种城镇土地边界的监控系统及其监控方法
CN112034736B (zh) * 2020-09-07 2023-05-23 中国航空工业集团公司成都飞机设计研究所 一种低耦合的无人机模拟训练方法及系统
CN112414375B (zh) * 2020-10-08 2021-09-03 武汉大学 一种面向洪涝灾害应急快拼图制作的无人机影像姿态恢复方法
CN112465849B (zh) * 2020-11-27 2022-02-15 武汉大学 一种无人机激光点云与序列影像的配准方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007305077A (ja) * 2006-05-15 2007-11-22 Nec Corp 災害時の物資配給支援方法およびシステム
CN106341667A (zh) * 2016-11-10 2017-01-18 广西师范大学 基于无人机的三维全景视频远程监控系统及图像采集控制方法
CN207720303U (zh) * 2017-12-25 2018-08-10 黑龙江龙飞航空摄影有限公司 基于无人机的图像处理系统
CN109446587A (zh) * 2018-09-30 2019-03-08 中国科学院、水利部成都山地灾害与环境研究所 一种大规模滑坡地质灾害快速数值仿真模拟方法及系统
CN110044338A (zh) * 2019-04-29 2019-07-23 中国水利水电科学研究院 一种溃堤溃坝场景的无人机监测方法及系统
CN110110641A (zh) * 2019-04-29 2019-08-09 中国水利水电科学研究院 一种流域性洪涝场景的无人机监测方法及系统

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9280748B2 (en) * 2012-06-22 2016-03-08 Knowm Tech, Llc Methods and systems for Anti-Hebbian and Hebbian (AHaH) feature extraction of surface manifolds using
US11313678B2 (en) * 2011-06-30 2022-04-26 The Regents Of The University Of Colorado Remote measurement of shallow depths in semi-transparent media
CN104091369B (zh) * 2014-07-23 2017-02-22 武汉大学 一种无人机遥感影像建筑物三维损毁检测方法
CN107479065B (zh) * 2017-07-14 2020-09-11 中南林业科技大学 一种基于激光雷达的林窗立体结构量测方法
CN108896117A (zh) * 2018-05-10 2018-11-27 北京师范大学 一种遥感水文站监测河流径流的方法
CN109063553B (zh) * 2018-06-22 2021-06-25 中国矿业大学 一种土地整治后农田作物生长缺陷区遥感快速诊断方法
CN109684929A (zh) * 2018-11-23 2019-04-26 中国电建集团成都勘测设计研究院有限公司 基于多源遥感数据融合的陆生植物生态环境监测方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007305077A (ja) * 2006-05-15 2007-11-22 Nec Corp 災害時の物資配給支援方法およびシステム
CN106341667A (zh) * 2016-11-10 2017-01-18 广西师范大学 基于无人机的三维全景视频远程监控系统及图像采集控制方法
CN207720303U (zh) * 2017-12-25 2018-08-10 黑龙江龙飞航空摄影有限公司 基于无人机的图像处理系统
CN109446587A (zh) * 2018-09-30 2019-03-08 中国科学院、水利部成都山地灾害与环境研究所 一种大规模滑坡地质灾害快速数值仿真模拟方法及系统
CN110044338A (zh) * 2019-04-29 2019-07-23 中国水利水电科学研究院 一种溃堤溃坝场景的无人机监测方法及系统
CN110110641A (zh) * 2019-04-29 2019-08-09 中国水利水电科学研究院 一种流域性洪涝场景的无人机监测方法及系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU, CHANGJUN ET AL.: "Application of the UAV aerial technique in evaluation of flash flood disasters", CHINA FLOOD & DROUGHT MANAGEMENT, vol. 24, no. 3, 30 June 2014 (2014-06-30), XP055750168, ISSN: 1673-9264, DOI: 10.16867/j.cnki.cfdm.2014.03.001 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112348381A (zh) * 2020-11-12 2021-02-09 北京优云智翔航空科技有限公司 一种无人机设备调度数据的处理方法、装置以及服务器
CN113012398A (zh) * 2021-02-20 2021-06-22 中煤航测遥感集团有限公司 地质灾害监测预警方法、装置、计算机设备及存储介质
CN113378700A (zh) * 2021-06-08 2021-09-10 四川农业大学 基于无人机航拍影像的草原鼠害动态监测方法
CN113612966A (zh) * 2021-07-02 2021-11-05 河南浩宇空间数据科技有限责任公司 一种基于城市低空遥感数据的特定目标物智能判别方法
CN113591714A (zh) * 2021-07-30 2021-11-02 金陵科技学院 一种基于卫星遥感图像洪涝检测方法
CN113933871A (zh) * 2021-10-15 2022-01-14 贵州师范学院 基于无人机和北斗定位的洪水灾情检测系统
CN115793093A (zh) * 2023-02-02 2023-03-14 水利部交通运输部国家能源局南京水利科学研究院 堤坝隐伏病险诊断空地一体化装备
CN115793093B (zh) * 2023-02-02 2023-05-16 水利部交通运输部国家能源局南京水利科学研究院 堤坝隐伏病险诊断空地一体化装备
CN117036622A (zh) * 2023-10-08 2023-11-10 海纳云物联科技有限公司 融合航拍图像和地面扫描的三维重建方法、装置和设备
CN117036622B (zh) * 2023-10-08 2024-02-23 海纳云物联科技有限公司 融合航拍图像和地面扫描的三维重建方法、装置和设备

Also Published As

Publication number Publication date
GB202019092D0 (en) 2021-01-20
CN110110641A (zh) 2019-08-09
CN110110641B (zh) 2020-11-27
GB2590192A (en) 2021-06-23
GB2590192B (en) 2023-09-27

Similar Documents

Publication Publication Date Title
WO2020221284A1 (zh) 一种流域性洪涝场景的无人机监测方法及系统
US10324367B2 (en) Aerial panoramic oblique photography apparatus
US11070725B2 (en) Image processing method, and unmanned aerial vehicle and system
WO2021115124A1 (zh) 一种耕地现场边云协同的三维重建方法
CN107247458A (zh) 无人机视频图像目标定位系统、定位方法及云台控制方法
CN105898216B (zh) 一种利用无人机进行的人数计数方法
CN106204443A (zh) 一种基于多目复用的全景无人机系统
CN109961497A (zh) 基于无人机影像的实时三维重建方法
CN107316012A (zh) 小型无人直升机的火灾检测与跟踪方法
CN104978390A (zh) 使用出行路径元数据进行情景感知目标检测
CN109145747A (zh) 一种水面全景图像语义分割方法
CN108965798B (zh) 岸滩鸟类的分布式近距离全景监测终端、系统及布局方法
CN104849274A (zh) 一种基于小型无人机的所检测区域旱情实时检测方法
CN106210647A (zh) 基于航拍构建基站覆盖区域全景影像的方法及系统
CN110044338B (zh) 一种溃堤溃坝场景的无人机监测方法及系统
CN107038714B (zh) 多型视觉传感协同目标跟踪方法
CN112184628B (zh) 一种红外双波图像及云端预警的巡堤防汛查险系统及方法
CN110675484A (zh) 一种基于复眼相机的具有时空一致性的动态三维数字场景构建方法
WO2011019111A1 (ko) 웹 3d를 이용한 해양정보 제공시스템 및 그 방법
Pi et al. Disaster impact information retrieval using deep learning object detection in crowdsourced drone footage
Zhou et al. Application of UAV oblique photography in real scene 3d modeling
CN109712249B (zh) 地理要素增强现实方法及装置
CN108737743A (zh) 基于图像拼接的视频拼接装置及视频拼接方法
CN104318540B (zh) 一种利用cpu与gpu协同的航空影像在线拼接方法
CN104133874B (zh) 基于真彩色点云的街景影像生成方法

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 202019092

Country of ref document: GB

Kind code of ref document: A

Free format text: PCT FILING DATE = 20200429

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20798601

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20798601

Country of ref document: EP

Kind code of ref document: A1