WO2023004260A1 - Deep learning for rain fade prediction in satellite communications - Google Patents

Deep learning for rain fade prediction in satellite communications

Info

Publication number
WO2023004260A1
Authority
WO
WIPO (PCT)
Prior art keywords
AoI
beacon
rain
live
information
Prior art date
Application number
PCT/US2022/073767
Other languages
French (fr)
Inventor
Aidin Ferdowsi KHOSROWSHAHI
David Whitefield
Rob Torres
Original Assignee
Hughes Network Systems, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/453,258 (published as US20230019771A1)
Application filed by Hughes Network Systems, Llc filed Critical Hughes Network Systems, Llc
Priority to CA3225182A1
Publication of WO2023004260A1


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B7/00Radio transmission systems, i.e. using radiation field
    • H04B7/14Relay systems
    • H04B7/15Active relay systems
    • H04B7/185Space-based or airborne stations; Stations for satellite systems
    • H04B7/1851Systems using a satellite or space-based relay
    • H04B7/18513Transmission in a satellite or space-based system
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • G06N3/0442Recurrent networks, e.g. Hopfield networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • a deep learning (DL)-based system and method to forecast future rain fade using raw data including images and link power measurements is disclosed.
  • the images may include cloud movement imagery in various spectra from one or more viewpoints.
  • the spectra include radar, infrared, radio, ultraviolet and others.
  • the viewpoints may include cloud top-view and cloud bottom-view imagery.
  • the cloud top-view images may be from a fixed or moving satellite, or a high altitude platform.
  • the cloud bottom-view images may be radar images from the ground.
  • Some of the images may include ground conditions, for example, radar images.
  • a gateway diversity strategy utilizing rain fade forecasting improves weather-resiliency and enhances overall network availability. The predictions may cover rain fade from the short term (seconds) to the long term (several minutes, up to around 65 minutes), sometimes referred to as “now-casting”.
  • a Deep Learning (DL)-based system forecasts future rain fade using satellite and radar imagery data as well as link power measurements.
  • the DL-based system outperforms current state-of-the-art machine learning-based algorithms in rain fade forecasting in the near and long term.
  • a system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions.
  • One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.
  • One general aspect includes a method for predicting rain fade for a rain zone using a deep learning system including a computer processor.
  • the method may include: training a Neural Network (NN) by importing into the NN a training set of image information and beacon information, wherein the image information includes image datasets including a cloud view of an Area of Interest (AoI) and a timestamp, and the beacon information includes beacon datasets including a beacon strength, a current rain fade state, a geolocation and a timestamp; pre-processing to homogenize and to extract spatially and temporally matching data for the AoI from a live image information and a live beacon information; and forecasting a rain fade based on the data in a near-future.
  • NN Neural Network
  • the geolocation of one or more of the beacon datasets is located within the AoI, a beacon periodicity of the live beacon information is greater than or equal to half (0.5) seconds, and an image periodicity of the live image information is less than or equal to five (5) minutes. Implementations may include one or more of the following features.
  • the method where the image periodicity is different than the beacon periodicity, and the method includes using a previous copy of the live beacon information or the live image information.
  • the method where the image periodicity is different than the beacon periodicity, and the method includes extrapolating a previous copy of the live beacon information or the live image information as necessary for the matching.
  • the method where the cloud view includes a top-view from a satellite of the AoI or a bottom-view from a radar of the AoI or a combination thereof.
  • the method where the live image information includes a radar image of the AoI and a ground truth for the AoI.
  • the method where the live image information includes an image of the AoI from a high-altitude platform or satellite and the image includes images at various spectra.
  • the AoI includes a plurality of AoI
  • the plurality of AoI are located within a rain zone and the evaluating predicts the rain fade for the plurality of AoI.
  • the method may include proactively managing gateway diversity based on the forecasting.
  • a method for predicting rain fade for a rain zone using a deep learning system may include: training a Neural Network (NN) by importing into the NN a training set of image information and beacon information, wherein the image information includes image datasets including a cloud view of an Area of Interest (AoI), a geolocation and a timestamp, and the beacon information includes beacon datasets including a beacon strength, a current rain fade state, a geolocation and a timestamp; pre-processing to homogenize and to extract spatially and temporally matching data for the AoI from a live image information and a live beacon information; and forecasting a rain fade based on the data in a near-future.
  • NN Neural Network
  • the geolocation of one or more of the beacon datasets is located within the AoI, the near-future is less than or equal to sixty-five (65) minutes, the beacon information is collected at a satellite transceiver and a geolocation of the satellite transceiver is located within the AoI, the live image information includes an image of the AoI from a satellite, a radar image of the AoI and a ground truth for the AoI, and the NN processes the data using a 3D convolutional neural network.
  • Fig. 1 illustrates an exemplary process to preprocess raw data to obtain balanced training data and run-time data according to various embodiments.
  • FIG. 2 illustrates a deep learning system to forecast rain fade according to various embodiments.
  • FIG. 3 illustrates a rain fade forecast method according to various embodiments.
  • Fig. 4 illustrates exemplary beacon measurements for a sample gateway according to various embodiments.
  • FIG. 5A, FIG. 5B, FIG. 5C and FIG. 5D illustrate accuracy, recall, precision and F1-score, respectively, comparing the three imagery input scenarios and some prior art models on test data of one RF gateway according to various embodiments.
  • FIG. 6 illustrates Receiver Operating Characteristic (ROC) curve of a long term prediction scenario of the present teachings versus two ML-based models according to various embodiments.
  • ROC Receiver Operating Characteristic
  • FIG. 7 illustrates a confusion matrix of the present teachings when predicting rain fade 60 minutes in the future in various embodiments.
  • the present teachings may be a system, a method, and/or a computer program product at any possible technical detail level of integration
  • the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention
  • the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
  • the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
  • RAM random access memory
  • ROM read-only memory
  • EPROM or Flash memory erasable programmable read-only memory
  • SRAM static random access memory
  • CD-ROM compact disc read-only memory
  • DVD digital versatile disk
  • memory stick a floppy disk
  • mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as SMALLTALK, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • the present teachings forecast precipitation using spatial data (radar and/or satellite images) and temporal data (power beacon measurements at various frequencies) to predict chances of rain fade.
  • the DL-based system outperforms current state-of-the-art machine learning-based algorithms in rain fade forecasting in the near and long term.
  • Cloud bottom-view image data (for example, radar data with weather condition information) may be more effective for short-term prediction.
  • Cloud top-view image data may be more effective for long-term predictions.
  • a combination of cloud top-view and bottom-view image data may be used to make more effective long-term and short-term predictions.
  • Rain fade refers to the radio signal fade issues caused by rain. The effects of rain fade are more widely seen in higher frequency bands, such as Ka-band, Q-band, V-band and the radio spectrum used by satellite and cellular communication systems.
  • a satellite gateway can connect to a second antenna providing RF terminal (RFT) diversity.
  • RFT may be served by the primary gateway or by a different gateway, namely, a diversity gateway.
  • the system may automatically select and switch between the antennae based on their respective rain fade.
  • when a system can predict/anticipate an occurrence of a rain fade, it can proactively switch between the primary and diversity antennae/gateway to maintain the quality of service.
  • rain fade forecasting enhances RFT gateway diversity switchover and switch back.
  • the link statuses for the links of an RF communication system and the spatial-temporal data from several RF gateways may be used to classify weather into fade or non-fade classes.
  • a 3-D convolutional neural network may receive input data.
  • the input data may include cloud top-view images (for example, from the Geostationary Operational Environmental Satellite 16 (GOES-16)), cloud bottom-view images, and link power data.
  • the DL system extracts necessary features from the input data to forecast rain fade.
  • the present teachings include preprocessing the input data to prepare the data to train the DL system and to predict the rain fade.
  • Continuous weather imagery and monitoring of meteorological and space environment data are available, for example, from GOES-16 across North America.
  • the data includes advanced imaging with high spatial resolution, for example, 16 spectral channels with a 5-minute scan frequency for accurate forecasts and timely warnings.
  • a live or real time feed and full historical archive of Advanced Baseline Imager (ABI) radiance data (Level 1b) is available.
  • ABI Advanced Baseline Imager
  • NWS National Weather Service
  • the DL architecture processes satellite images, radar images and the information about rain fade at gateways such as received power from the satellite at beacons installed at gateways and forecasts rain fade events in the future.
  • the present teachings may be used in satellite communications, cellular communications, and other line-of-sight communication systems, for example, to proactively switch before a rain fade event between diverse satellite gateways, cellular base stations and the like.
  • Beacon data may be collected at a geolocation of a transceiver, for example, a satellite gateway, a cellular base station, or the like.
  • a beacon may be a specific signal from a transmitter to a receiver, for example, a satellite to a ground system.
  • a beacon is any transmission signal that is subject to atmospheric weather effects.
  • the present description uses satellite communications for illustration.
  • a satellite communication includes four different links: 1) Gateway to satellite link, 2) Satellite to remote link, 3) Remote to satellite link, and 4) Satellite to gateway link.
  • a satellite transponder includes automatic level control to mitigate rain fades to some level.
  • automatic uplink power control is activated to maintain the predefined received power at the satellite.
  • ACM Adaptive Coding and Modulation
  • the satellite to gateway link is generally mitigated by the large size and gain of a gateway antenna.
  • Fig. 1 illustrates an exemplary process to preprocess raw data to obtain balanced training data and run-time data according to various embodiments.
  • An exemplary process 100 may be used to preprocess raw data to obtain balanced training data.
  • the process 100 may preprocess spatial image channels 102, radar images 104 and GW fade data 114 to obtain balanced training data 126.
  • the process 100 may include an operation 106 to harmonize resolutions among the spatial image channels 102, an operation 107 to harmonize the CRS across images, an operation 108 to decompose rain labels of the radar images 104, an operation 110 to extract temporal images for areas of interest, an operation 112 to homogenize the input images, an operation 116 to extract fade events from the GW fade data 114, an operation 118 to match the extracted fade events of operation 116 with the temporal AoI spatial and radar images 112, and an operation 120 to balance the quantity of clear sky and rain fade events included in the balanced training data 126.
  • Operation 120 may under-sample clear sky events at sub-operation 122 and oversample rain fade events at sub-operation 124.
  • the resolution for images from different resources may be harmonized to an identical resolution per operation 106.
  • GOES-16 includes images of 16 spectral channels (0.47 μm - 13.3 μm) with a 5-minute sampling rate.
  • the channels have different spatial resolutions varying from 0.000014 to 0.000056 radians in the geostationary coordinate reference system (CRS). Therefore, either the channels with a higher resolution may be downsampled to match the minimum resolution of the channels, or the lower resolution channels may be upsampled to match the maximum resolution. Upsampling may result in increasing the sizes of the files (and consequently the processing requirements). If the lower resolution images (0.000056 radians) are used, every pixel of the image will cover approximately a 2 km x 2 km area on the US map.
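For illustration, a minimal Python sketch (assuming NumPy and SciPy, which the patent does not specify) of resampling spectral channels onto a shared grid per operation 106; shapes and the helper name are illustrative:

```python
# Minimal sketch: resample every spectral channel to one common (coarsest)
# grid before stacking, so all channels share an identical resolution.
import numpy as np
from scipy.ndimage import zoom

def harmonize_resolution(channels, target_shape):
    """Resample each 2D channel array to target_shape (rows, cols)."""
    resampled = []
    for ch in channels:
        factors = (target_shape[0] / ch.shape[0],
                   target_shape[1] / ch.shape[1])
        resampled.append(zoom(ch, factors, order=1))  # bilinear resampling
    return np.stack(resampled, axis=-1)

# e.g. bring a finer channel down to the ~2 km grid of the coarsest channel
stacked = harmonize_resolution([np.random.rand(400, 400),
                                np.random.rand(100, 100)], (100, 100))
```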
  • the coordinate systems for images from different resources may be harmonized per operation 107.
  • the geostationary CRS of GOES-16 may be transformed to a geodetic CRS, a more commonly used CRS.
  • Geodetic CRS describes the location of each gateway in latitude and longitude. This transformation may not be needed for the radar data.
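For illustration, a minimal sketch (assuming Python with pyproj; the nominal GOES-16 height and longitude below are assumptions, not values from the patent) of transforming the geostationary CRS to a geodetic CRS:

```python
# Minimal sketch: map GOES-16 ABI fixed-grid scan angles (radians) to
# geodetic longitude/latitude with pyproj.
from pyproj import CRS, Transformer

SAT_HEIGHT = 35786023.0   # nominal GOES-16 height above the ellipsoid (m)
SAT_LON = -75.0           # nominal GOES-East sub-satellite longitude (deg)

geos = CRS.from_proj4(f"+proj=geos +h={SAT_HEIGHT} +lon_0={SAT_LON} +sweep=x")
geodetic = CRS.from_epsg(4326)  # WGS84 latitude/longitude
to_geodetic = Transformer.from_crs(geos, geodetic, always_xy=True)

def fixed_grid_to_lonlat(x_rad, y_rad):
    """The geos projection expects meters: scan angle times satellite height."""
    return to_geodetic.transform(x_rad * SAT_HEIGHT, y_rad * SAT_HEIGHT)

lon, lat = fixed_grid_to_lonlat(-0.024, 0.095)
```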
  • Extract Areas of Interest: The locations of gateways are Areas of Interest (AoI).
  • AoI Areas of Interest
  • data for square areas centered on AoI may be extracted from the original raw spatial and radar images per operation 110.
  • the resolution of the extracted images depends on the size of a particular AoI. For example, a 32 pixel x 32 pixel image may cover an area of approximately 64 km x 64 km.
  • Operation 110 stores the temporal AoI images 110.
  • the temporal AoI images 110 may include 16 plus 3 channels for each AoI (location of RF gateway or RFT).
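For illustration, a minimal sketch of the AoI extraction of operation 110 (Python/NumPy assumed; index names are illustrative and edge handling is omitted):

```python
# Minimal sketch: crop a square AoI centered on a gateway's pixel location
# from a harmonized image; with ~2 km pixels, 32 x 32 spans ~64 km x 64 km.
import numpy as np

def extract_aoi(image, center_row, center_col, size=32):
    """Crop a size x size window centered on (center_row, center_col)."""
    half = size // 2
    return image[center_row - half:center_row + half,
                 center_col - half:center_col + half]

aoi = extract_aoi(np.random.rand(2712, 2712), center_row=1500, center_col=900)
```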
  • Decomposing Weather Condition Channels: Data from different sources may code precipitation differently.
  • the values of each pixel in the raw data may be decomposed and homogenized, for example, fade labels used by external data may be mapped to fade labels used in the GW fade data 114.
  • some radar data uses values from 0 to 48, where 0 to 16 indicates an intensity of rain, 17 to 32 indicates an intensity of a mixture of snow and rain, and 33 to 48 indicates an intensity of snow.
  • each radar image may be decomposed into channels corresponding to rain, snow, and mix.
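For illustration, a minimal sketch (Python/NumPy assumed) of decomposing a radar image coded 0-48 as described above; mapping each band back to a 0-16 intensity is one plausible convention, not the patent's stated one:

```python
# Minimal sketch: split one radar image into rain / mix / snow channels.
import numpy as np

def decompose_radar(radar):
    """radar: 2D array of codes 0..48 -> (H, W, 3) rain/mix/snow intensities."""
    rain = np.where(radar <= 16, radar, 0)                        # codes 0-16
    mix = np.where((radar >= 17) & (radar <= 32), radar - 16, 0)  # codes 17-32
    snow = np.where(radar >= 33, radar - 32, 0)                   # codes 33-48
    return np.stack([rain, mix, snow], axis=-1)
```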
  • a mean value of each channel may be subtracted from the pixels of each channel and then divided by the standard deviation of the channel.
  • this makes the mean and the standard deviation of the input channels equal to zero and one, respectively.
  • access to all the images may be needed to derive the per-channel mean m_c and standard deviation s_c.
  • a running approach, for example Welford’s online algorithm, may be used to calculate and update the mean and standard deviation values rather than accessing all the images, as sketched below.
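For illustration, a minimal sketch of Welford's online algorithm for the per-channel statistics (Python/NumPy assumed; the class name is illustrative):

```python
# Minimal sketch: update the per-channel mean and standard deviation one
# image at a time, so the full archive never has to be held in memory.
import numpy as np

class RunningChannelStats:
    def __init__(self):
        self.count, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, channel):
        for x in np.asarray(channel, dtype=np.float64).ravel():
            self.count += 1
            delta = x - self.mean
            self.mean += delta / self.count
            self.m2 += delta * (x - self.mean)  # Welford's running moment

    @property
    def std(self):
        return float(np.sqrt(self.m2 / self.count)) if self.count else 0.0

stats = RunningChannelStats()
stats.update(np.random.rand(32, 32))                   # one incoming image
normalized = (np.random.rand(32, 32) - stats.mean) / stats.std
```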
  • beacon measurements at gateways and included in the GW fade data 114 may be compared to a rain fade threshold.
  • the system may extract rain fade events per operation 116 and match their time and AoI samples per operation 118 with the temporal AoI images 112.
  • a beacon data sample with a sampling duration, for example 1-minute sampling, may be used.
  • a label for the past five minutes and a label for the future five minutes may be derived.
  • a fixed label, for example 1, may be used to indicate when the minimum beacon value of the past or future 5 minutes is less than the rain fade threshold.
  • the minimum beacon value between t1 and t2 may be used to define the past label at time instance t2 and the minimum beacon value between t2 and t3 may be used to define the future label at this time instance.
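For illustration, a minimal sketch of the labeling rule (Python/NumPy assumed; the 1D array of 1-minute beacon samples and the threshold are toy inputs):

```python
# Minimal sketch: label 1 when the minimum beacon value within the past
# (or future) 5-minute window falls below the rain fade threshold, else 0.
import numpy as np

def fade_labels(beacon, threshold, window=5):
    """Return (past_label, future_label) for each interior time step."""
    past, future = [], []
    for t in range(window, len(beacon) - window):
        past.append(int(beacon[t - window:t].min() < threshold))
        future.append(int(beacon[t:t + window].min() < threshold))
    return np.array(past), np.array(future)

past, future = fade_labels(np.random.rand(60) * 10, threshold=2.0)
```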
  • the current beacon value and current rain fade status may be used along with the spatial and radar data to improve the system’s accuracy.
  • the sampling rate of spatial and radar data (for example, 5 minutes) may be less frequent than the sampling rate of beacon data (for example, 1 minute).
  • the most recent spatial or radar image is used by the model in between two sampling time steps.
  • the sampling rate of image data for training may be different than the sampling rate of image data in practice.
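For illustration, a minimal sketch of the temporal matching (Python/NumPy assumed; timestamps are toy values): the most recent image at or before each beacon sample is reused, a forward fill:

```python
# Minimal sketch: align ~5-minute imagery with 1-minute beacon samples.
import numpy as np

def match_images_to_beacons(image_times, beacon_times):
    """For each beacon timestamp, the index of the latest image at or before
    it (-1 if none has arrived yet); both arrays are sorted, in seconds."""
    return np.searchsorted(image_times, beacon_times, side="right") - 1

image_times = np.array([0, 300, 600])      # one image every 5 minutes
beacon_times = np.arange(0, 660, 60)       # one beacon sample every minute
indices = match_images_to_beacons(image_times, beacon_times)
```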
  • the current rain fade states are extremely imbalanced as less than 1% of the samples may be labeled as rain fade due to the weather condition at these locations. As such, using all of the samples will introduce a bias to the model and will increase the number of false negative (FN) predictions.
  • under-sampling of the clear weather (no rain fade) samples and oversampling of the rain fade samples may be used to balance the number of samples for true (rain fade) and false (clear) cases.
  • To under-sample, some of the clear sky instances may be periodically dropped for the training.
  • To oversample, multiple copies of the rain fade instances may be used for the training. Oversampling the rain fade images and under-sampling the clear sky images balances the ratio between the number of true and false samples.
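For illustration, a minimal sketch of the balancing step (Python/NumPy assumed; the per-class target count is an illustrative assumption):

```python
# Minimal sketch: under-sample the majority clear sky class (drop samples)
# and over-sample the minority rain fade class (repeat samples).
import numpy as np

rng = np.random.default_rng(0)

def balance(clear_idx, fade_idx, target_per_class=10_000):
    keep_clear = rng.choice(clear_idx, size=target_per_class, replace=False)
    keep_fade = rng.choice(fade_idx, size=target_per_class, replace=True)
    return np.concatenate([keep_clear, keep_fade])

balanced = balance(np.arange(1_000_000), np.arange(1_000_000, 1_005_000))
```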
  • the balanced training data 126 may include spatial images (for example, GOES-16 images), radar images, beacon power levels at AoIs around GWs, and rain fade states for each clear sky and rain event used for training.
  • the live/real time data 128 may be evaluated to forecast rain fade events with a trained NN.
  • the process 100 may preprocess live spatial image channels 102, live radar images 104 and live GW fade data 114, with the exception of the balancing operation 120, to forecast rain fade events in near real-time (within a few seconds).
  • the forecasts may be used to manage gateway diversity, for example, as illustrated by a rain fade forecast method 300 of FIG. 3.
  • FIG. 2 illustrates a deep learning system to forecast rain fade according to various embodiments.
  • a deep learning system 200 to forecast rain fade may include a hierarchy of neural network computation layers.
  • column 1 identifies a NN type
  • column 2 lists exemplary parameters/environment for the NN
  • column 3 lists the NN output and a format of the NN output.
  • the system 200 includes the NNs identified in column 1.
  • the system 200 may invoke the identified NNs in the sequence detailed in FIG. 2.
  • the system 200 includes a pre-processor to harmonize and homogenize raw data from various resources and produce the balanced data as described above.
  • the system 200 may use multiple layers (204, 208, 212, 216) of a 3D CNN to capture the spatio-temporal interdependencies of the spatial and radar images.
  • a Long Short-Term Memory (LSTM)-2D CNN may be used in the DL system 200. While a 2D CNN may extract spatial features from an input image, a 3D CNN (or an LSTM-CNN) block can learn the temporal relationship between the input images. In some embodiments a 2D CNN may be used instead of a 3D CNN in the multiple layers (204, 208, 212, 216).
  • a first layer 204 may extract the interdependencies between the channels and the second layer 208 or later layers 212, 216 may find the rainy weather forecasting features of the images.
  • a pooling layer (206, 210, 214, 218) may be used to reduce the size of the input.
  • the CNN may use a rectified linear unit (ReLU) activation as specified in the second column of FIG. 2.
  • ReLU rectified linear unit
  • One of the pooling layers of the CNN may include a flattening functionality to flatten a 3D (or 2D) input into a 1D output, for example, after the last CNN layer 216.
  • pooling layer 218 may include the flattening functionality.
  • One of the multiple layers of the CNN may include a dense layer for learning the relationship between the input images and the probability of the rain fades, for example, after the last CNN layer 216.
  • pooling layer 218 may include a dense layer.
  • An activation function of the last layer, for example, layer 218, may map the output of the dense layer to a probability value between 0 and 1.
  • the final layer’s activation function may be chosen to be a softmax layer 220.
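For illustration, a minimal sketch of the layered design described above, assuming TensorFlow/Keras (the patent names no framework); filter counts, kernel sizes, and input dimensions are illustrative, not the values of FIG. 2:

```python
# Minimal sketch: stacked 3D convolutions with ReLU and max pooling over the
# past time steps, then flatten, a dense layer, and a softmax over
# {clear sky, rain fade}.
import tensorflow as tf
from tensorflow.keras import layers

N_PAST, SIZE, N_CH = 6, 32, 24   # past steps, AoI pixels, channels (assumed)

inputs = tf.keras.Input(shape=(N_PAST, SIZE, SIZE, N_CH))
x = layers.Conv3D(32, 3, padding="same", activation="relu")(inputs)
x = layers.MaxPooling3D(pool_size=(1, 2, 2))(x)
x = layers.Conv3D(64, 3, padding="same", activation="relu")(x)
x = layers.MaxPooling3D(pool_size=(1, 2, 2))(x)
x = layers.Conv3D(64, 3, padding="same", activation="relu")(x)
x = layers.MaxPooling3D(pool_size=(2, 2, 2))(x)
x = layers.Flatten()(x)                         # flattening functionality
x = layers.Dense(128, activation="relu")(x)     # dense layer
outputs = layers.Dense(2, activation="softmax")(x)  # probability of rain fade
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```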
  • ground information may be attached to the input images.
  • the ground information may include GW locations, a current rain fade state of each GW for each input sample interval, and one or more current beacon measurements at each GW.
  • the ground information may be integrated by adding a gateway channel and a beacon channel to the image data.
  • the gateway channel may include the ground information for all the Aol or gateways of a rain zone.
  • the beacon channel may include the ground information for all the beacons in the Aol or gateways of a rain zone. Coverage areas may be separated into rain zones per their expected rain patterns.
  • the gateway channel or the beacon channel may use a matrix to convey the ground information.
  • the ground information may be integrated by adding extra channels to the image data.
  • a one-hot encoding for each GW may be used, meaning that for n_g GWs, n_g extra channels are added. All the pixels of the n_g GW channels may have a zero value except for one channel when the spatial and/or radar images are for the i-th GW. This input allows multiple gateways to share the same prediction model.
  • n_g extra channels may be added to indicate when the i-th GW is in rain fade, for example, by setting all pixels in the rain fade channels to +1s if the current state of the GW is rain fade and to -1s otherwise.
  • historical beacon data for each gateway may be used to bucketize the beacon measurements into n_b buckets such that each bucket has approximately an equal number of samples. For each bucket, two values that define the two ends of the bucket may be used.
  • considering the n_b extra channels, when the current beacon value falls into the i-th bucket, the i-th channel may be defined as +1s and the other channels as -1s (one-hot encoding).
  • n_GOES and n_radar may be the number of channels from GOES-16 and radar sources
  • the system may at every time step have n_GOES + n_radar + n_g + n_b + 1 channels.
  • in some embodiments, the number of channels is chosen to be n_GOES + n_g + n_b + 1 or n_radar + n_g + n_b + 1.
  • the input data for each time step will have a 32 x 32 x (n_GOES + n_radar + n_g + n_b + 1) size where 32 is the number of pixels in each direction of the GOES-16 and radar images.
  • the images of the multiple steps in the past may be fed to the 3D CNN to capture the temporal behavior of the input images.
  • an input sample to the CNN may have a n_p x 32 x 32 x (n_GOES + n_radar + n_g + n_b + 1) shape, where n_p is the number of past time steps.
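For illustration, a minimal sketch of assembling one input sample under one reading of the channel layout above (n_g one-hot GW channels, one +1/-1 rain fade state channel as the "+1", and n_b one-hot beacon-bucket channels); all counts and the helper name are assumptions:

```python
# Minimal sketch: stack imagery channels with the gateway, fade state, and
# beacon-bucket channels over n_p past time steps.
import numpy as np

N_P, SIZE = 6, 32
N_GOES, N_RADAR, N_G, N_B = 16, 3, 4, 8   # assumed channel counts

def build_sample(goes, radar, gw_idx, in_fade, bucket_idx):
    """goes: (N_P, 32, 32, N_GOES); radar: (N_P, 32, 32, N_RADAR)."""
    gw = np.zeros((N_P, SIZE, SIZE, N_G))          # one-hot gateway channels
    gw[..., gw_idx] = 1.0
    fade = np.full((N_P, SIZE, SIZE, 1), 1.0 if in_fade else -1.0)
    beacon = -np.ones((N_P, SIZE, SIZE, N_B))      # one-hot beacon buckets
    beacon[..., bucket_idx] = 1.0
    return np.concatenate([goes, radar, gw, fade, beacon], axis=-1)

sample = build_sample(np.zeros((N_P, SIZE, SIZE, N_GOES)),
                      np.zeros((N_P, SIZE, SIZE, N_RADAR)),
                      gw_idx=2, in_fade=False, bucket_idx=5)
# sample.shape == (N_P, 32, 32, N_GOES + N_RADAR + N_G + N_B + 1)
```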
  • the data may be split into a training set and a test set.
  • the training set may include the first 80% of the preprocessed data, while the remaining 20% may be kept for testing the model.
  • the under-sampling and oversampling steps of the preprocessing are done only on the training set.
  • the trained version of the system 200 may be used to evaluate future unseen samples, for example, in near real-time, by rain zones.
  • FIG. 3 illustrates a rain fade forecast method according to various embodiments.
  • a rain fade forecast method 300 may include an operation 302 to divide a coverage area into rain zones per their expected rain patterns.
  • United States rainfall climatology may generally be described as having the following rain zones.
  • the eastern part of the contiguous United States east of the 98th meridian, the mountains of the Pacific Northwest, the Willamette Valley, and the Sierra Nevada range are the wetter portions of the nation, with average rainfall exceeding 30 inches (760 mm) per year.
  • the drier areas are the Desert Southwest, Great Basin, valleys of northeast Arizona, eastern Utah, and central Wyoming. Increased warming within urban heat islands leads to an increase in rainfall downwind of cities.
  • the rain zones of the present teachings may be defined along climatology rainfall zones, may merge climatology rainfall zones, or may subdivide climatology rainfall zones. The defining of the rainfall zones may be done for logistical reasons by a network operator.
  • the rain fade forecast method 300 may include operation 310 to provision a rain zone forecaster.
  • the provisioning 310 may include an operation 312 to identify Aol in the rain zone.
  • the provisioning 310 may include an operation 314 to pre-process training data for the rain zone. Exemplary pre-processing of operation 314 may be performed per FIG. 1.
  • the provisioning 310 may include an operation 316 to train a NN for the Aol in a rain zone.
  • the NN may be a system of FIG. 2.
  • the provisioning 310 may include an operation 318 to generate a rain zone forecaster.
  • the rain zone forecaster includes the NN after training. In the rain zone forecaster, further learning by the NN when evaluating live/real-time/non-training/test raw data may be disabled.
  • the provisioning 310 may include an operation 320 to deploy a rain zone forecaster for each rain zone in a coverage area. The one or more rain zone forecasters may be deployed in a Network Operations Center.
  • the rain fade forecast method 300 may include operation 330 to manage GW diversity.
  • the managing operation 330 may include operation 332 to collect evaluation raw data, for example, satellite images, radar images, gateway and beacon measurements.
  • the managing operation 330 may include operation 334 to pre-process the evaluation raw data.
  • the pre-processing may skip a balance training data operation, for example, operation 120 of FIG. 1.
  • the managing operation 330 may include operation 336 to forecast rain fade for all or some of the geolocations of beacons included in the evaluation data.
  • evaluation data may be used for forecasting by one or more DL systems.
  • particularized data streams/channels may be established for each rain zone.
  • the managing operation 330 may include operation 338 to notify a diversity controller of predicted rain fade.
  • the notifications may be classified by imminency of expected rain fade, for example, within 1 minute, within 5 minutes, within 30 minutes, within an hour or the like.
  • the managing operation 330 may include operation 340 to replace, prior to rain fade occurring, a primary GW with an available diversity GW not subject to rain fade.
  • the notifications may be used to schedule diversity GW usage, notify a Network Operations Center, notify subscribers and the like.
  • recall = TP / (TP + FN).
  • the F1-score evaluates the model during the training phase to find a model that has both good precision and recall rates.
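For illustration, a minimal sketch of these metrics, assuming scikit-learn (labels are toy values):

```python
# Minimal sketch: recall = TP/(TP+FN), precision = TP/(TP+FP), and the
# F1-score that balances the two during model selection.
from sklearn.metrics import f1_score, precision_score, recall_score

y_true = [0, 0, 1, 1, 1, 0, 1]   # 1 = rain fade, 0 = clear sky
y_pred = [0, 1, 1, 1, 0, 0, 1]

print("recall:", recall_score(y_true, y_pred))        # TP / (TP + FN)
print("precision:", precision_score(y_true, y_pred))  # TP / (TP + FP)
print("F1:", f1_score(y_true, y_pred))
```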
  • the data is labeled by aggregating beacon measurements of each gateway and using a weighted average to derive the clear sky threshold for each time step (for example, clear sky threshold 404).
  • the beacon measurements of each day are compared to this threshold.
  • Fig. 4 illustrates exemplary beacon measurements for a sample gateway according to various embodiments.
  • a beacon measurement chart 400 illustrates a current beacon value 402 (in decibels) and rain fade instances 406 recorded by a GW over time.
  • a clear sky threshold 404 for adequate link performance is also illustrated. In some embodiments, the clear sky threshold 404 may vary.
  • the illustrated beacon measurements, clear sky threshold, and rain fade cases are for a single gateway.
  • a DL system was provided input imagery (radar and satellite) from the past 30 minutes.
  • the DL system correctly predicted rain fade in 60 minutes in the future.
  • the DL system may predict a long-term rain fade event, for example, as far as 60 minutes in the future.
  • a DL system may be trained for different target future time predictions, for example, from 5 minutes to 65 minutes into the future.
  • the DL system was trained on three imagery input scenarios: a) satellite (GOES-16) only, b) radar only, and c) satellite and radar together. Input to all three input scenarios also included beacon information.
  • FIG. 5A, FIG. 5B, FIG. 5C and FIG. 5D illustrate accuracy, recall, precision and F1-score, respectively, comparing the three imagery input scenarios and some prior art models on test data of one RF gateway according to various embodiments.
  • FIG. 5A illustrates an accuracy plot 500 plotting prediction accuracy made using (a) radar and beacon information 502, (b) GOES-16 and beacon information 504, (c) radar, GOES-16 and beacon information 506, (d) SVM model (prior art) 508 and (e) MLP model (prior art) 509.
  • FIG. 5B illustrates a recall plot 510 plotting prediction recall made using (a) radar and beacon information 512, (b) GOES-16 and beacon information 514, (c) radar, GOES-16 and beacon information 516, (d) SVM model (prior art) 518 and (e) MLP model (prior art) 519.
  • FIG. 5C illustrates a precision plot 530 plotting prediction precision made using (a) radar and beacon information 532, (b) GOES-16 and beacon information 534, (c) radar, GOES-16 and beacon information 536, (d) SVM model (prior art) 538 and (e) MLP model (prior art) 539.
  • FIG. 5D illustrates an F1-score plot 540 plotting prediction F1-score made using (a) radar and beacon information 542, (b) GOES-16 and beacon information 544, (c) radar, GOES-16 and beacon information 546, (d) SVM model (prior art) 548 and (e) MLP model (prior art) 549.
  • the F1-score of the present teachings outperforms the F1-scores of the prior art models.
  • a DL system trained with radar and beacon information 542 only outperforms the other scenarios for short-term forecasting in terms of F1-score.
  • the DL system trained only on GOES-16 and beacon information 544 outperforms the other scenarios in long-term forecasting. Without limitation, this may be because the GOES-16 images track the movements of the clouds while the radar images have the weather condition records. Thus, for a short-term prediction radar data is more effective while for a long-term prediction the GOES-16 data is more effective.
  • the performance of the present teachings outperforms that of the other state-of-the-art ML models, especially for long-term predictions.
  • the prior art systems are ML-based rain fade prediction models that only use time series data.
  • the beacon information was used as the time series input for the MLP model 549 (Multi-Layer Perceptron) and the SVM model 548 (Support Vector Machine).
  • FIG. 6 illustrates Receiver Operating Characteristic (ROC) curve of a long term prediction scenario of the present teachings versus two ML-based models according to various embodiments.
  • ROC Receiver Operating Characteristic
  • a ROC curve depicts a trade-off between the TP rate (TPR) and the FP rate (FPR) by plotting TPR versus FPR at various thresholds. Lowering the classification threshold causes more observations to be classified as positive, increasing the TP rate.
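For illustration, a minimal sketch of computing the ROC curve and its AUC, assuming scikit-learn (the probabilities are toy stand-ins for the model's softmax output):

```python
# Minimal sketch: TPR versus FPR at every threshold, summarized by the AUC.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

y_true = np.array([0, 0, 1, 1, 0, 1, 1, 0])
y_prob = np.array([0.1, 0.4, 0.8, 0.65, 0.3, 0.9, 0.55, 0.2])

fpr, tpr, thresholds = roc_curve(y_true, y_prob)
print("AUC:", roc_auc_score(y_true, y_prob))
```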
  • a ROC curve 602 of the DL system is closer to the top left of the graph and achieves a high TPR while maintaining a low FPR.
  • the ROC curve 604 for an MLP classifier (in particular) and the ROC curve 606 for an SVM classifier illustrate that the two prior art classifiers cannot distinguish well between the two classes.
  • the Area Under the ROC Curve (AUC) measures performance across all possible classification thresholds.
  • the ROC curve 602 of the present teachings has a higher AUC than the prior art ROC curves 604, 606.
  • the AUC of the ROC curve 602 implies that the DL system of the present teachings assigns higher rain fade probabilities to actual rain fade events than to clear sky events.
  • FIG. 7 illustrates a confusion matrix of the present teachings when predicting rain fade 60 minutes in the future in various embodiments.

Abstract

Predicting rain fade for a rain zone using a deep learning system may include: training a Neural Network (NN) by importing into the NN a training set of image information and beacon information, wherein the image information includes image datasets including a cloud view of an Area of Interest (AoI), a geolocation and a timestamp, and the beacon information includes beacon datasets including a beacon strength, a current rain fade state, a geolocation and a timestamp; pre-processing to homogenize and to extract spatially and temporally matching data for the AoI from a live image information and a live beacon information; and forecasting a rain fade based on the data in a near-future. The geolocation of one or more of the beacon datasets is located within the AoI, and the periodicity of the live beacon information and the live image information is less than or equal to five (5) minutes.

Description

DEEP LEARNING FOR RAIN FADE PREDICTION IN SATELLITE COMMUNICATIONS
FIELD
[0001] A deep learning (DL)-based system and method to forecast future rain fade using raw data including images and link power measurements is disclosed. The images may include cloud movement imagery in various spectra from one or more viewpoints. The spectra include radar, infrared, radio, ultraviolet and others. The viewpoints may include cloud top-view and cloud bottom-view imagery. For example, the cloud top-view images may be from a fixed or moving satellite, or a high altitude platform. The cloud bottom-view images may be radar images from the ground. Some of the images may include ground conditions, for example, radar images. A gateway diversity strategy utilizing rain fade forecasting improves weather-resiliency and enhances overall network availability. The predictions may cover rain fade from the short term (seconds) to the long term (several minutes, up to around 65 minutes), sometimes referred to as “now-casting”.
BACKGROUND
[0002] In the prior art, empirical, statistical, and fade slope models can predict rain fade to some extent. However, they typically require statistical measurements of rain characteristics in a given area and cannot be generalized to a large-scale system. Furthermore, such models typically predict near-future rain fade events but are incapable of forecasting far into the future, making proactive resource management more difficult.
SUMMARY
[0003] This Summary is provided to introduce a selection of concepts in a simplified form that is further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[0004] In the present teachings, a Deep Learning (DL)-based system forecasts future rain fade using satellite and radar imagery data as well as link power measurements. The DL-based system outperforms current state-of-the-art machine learning-based algorithms in rain fade forecasting in the near and long term.
[0005] A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions. One general aspect includes a method for predicting rain fade for a rain zone using a deep learning system including a computer processor. The method may include: training a Neural Network (NN) by importing into the NN a training set of image information and beacon information, wherein the image information includes image datasets including a cloud view of an Area of Interest (AoI) and a timestamp, and the beacon information includes beacon datasets including a beacon strength, a current rain fade state, a geolocation and a timestamp; pre-processing to homogenize and to extract spatially and temporally matching data for the AoI from a live image information and a live beacon information; and forecasting a rain fade based on the data in a near-future. In the method, the geolocation of one or more of the beacon datasets is located within the AoI, a beacon periodicity of the live beacon information is greater than or equal to half (0.5) seconds, and an image periodicity of the live image information is less than or equal to five (5) minutes. Implementations may include one or more of the following features.
[0006] The method where the near-future is less than or equal to sixty-five (65) minutes.
[0007] The method where the image periodicity is different than the beacon periodicity, and the method includes using a previous copy of the live beacon information or the live image information.
[0008] The method where the image periodicity is different than the beacon periodicity, and the method includes extrapolating a previous copy of the live beacon information or the live image information as necessary for the matching.
[0009] The method where the beacon information is collected at a satellite transceiver and a geolocation of the satellite transceiver is located within the AoI.
[0010] The method where the cloud view includes a top-view from a satellite of the AoI or a bottom-view from a radar of the AoI or a combination thereof.
[0011] The method where the live image information includes a radar image of the AoI and a ground truth for the AoI.
[0012] The method where the ground truth includes a current rain state and the pre-processing harmonizes the rain labels with current rain fade states of the beacon information.
[0013] The method where the live image information includes an image of the AoI from a high-altitude platform or satellite and the image includes images at various spectra.
[0014] The method where the pre-processing harmonizes the live image information to an image resolution.
[0015] The method where the pre-processing harmonizes a coordinate system of the live image information and the live beacon information.
[0016] The method where the training set balances a quantity of clear sky events as compared to a quantity of rain fade events.
[0017] The method where the NN processes the data using a 3D convolutional neural network.
[0018] The method where the NN successively processes the data using a 3D convolution NN, a max pool, a flattening NN and a softmax NN.
[0019] The method where the AoI covers a ground area of at least 32 km x 32 km.
[0020] The method where the AoI is centered over the geolocation of one or more of the beacon datasets.
[0021] The method where the AoI includes a plurality of AoI, the plurality of AoI are located within a rain zone and the evaluating predicts the rain fade for the plurality of AoI.
[0022] The method may include proactively managing gateway diversity based on the forecasting.
[0023] A method for predicting rain fade for a rain zone using a deep learning system may include: training a Neural Network (NN) by importing into the NN a training set of image information and beacon information, wherein the image information includes image datasets including a cloud view of an Area of Interest (AoI), a geolocation and a timestamp, and the beacon information includes beacon datasets including a beacon strength, a current rain fade state, a geolocation and a timestamp; pre-processing to homogenize and to extract spatially and temporally matching data for the AoI from a live image information and a live beacon information; and forecasting a rain fade based on the data in a near-future. In the method, the geolocation of one or more of the beacon datasets is located within the AoI, the near-future is less than or equal to sixty-five (65) minutes, the beacon information is collected at a satellite transceiver and a geolocation of the satellite transceiver is located within the AoI, the live image information includes an image of the AoI from a satellite, a radar image of the AoI and a ground truth for the AoI, and the NN processes the data using a 3D convolutional neural network.
[0024] The method where implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium. Additional features will be set forth in the description that follows, and in part will be apparent from the description, or may be learned by practice of what is described.
DRAWINGS
[0025] In order to describe the manner in which the above-recited and other advantages and features may be obtained, a more particular description is provided below and will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments and are not, therefore, to be limiting of its scope, implementations will be described and explained with additional specificity and detail with the accompanying drawings.
[0026] Fig. 1 illustrates an exemplary process to preprocess raw data to obtain balanced training data and run-time data according to various embodiments.
[0027] FIG. 2 illustrates a deep learning system to forecast rain fade according to various embodiments.
[0028] FIG. 3 illustrates a rain fade forecast method according to various embodiments.
[0029] Fig. 4 illustrates exemplary beacon measurements for a sample gateway according to various embodiments.
[0030] FIG. 5A, FIG. 5B, FIG. 5C and FIG. 5D illustrate accuracy, recall, precision and F1-score, respectively, comparing the three imagery input scenarios and some prior art models on test data of one RF gateway according to various embodiments.
[0031] FIG. 6 illustrates Receiver Operating Characteristic (ROC) curve of a long term prediction scenario of the present teachings versus two ML-based models according to various embodiments.
[0032] FIG. 7 illustrates a confusion matrix of the present teachings when predicting rain fade 60 minutes in the future in various embodiments.
[0033] Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.
DETAILED DESCRIPTION
[0034] The present teachings may be a system, a method, and/or a computer program product at any possible technical detail level of integration. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
[0035] The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
[0036] Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
[0037] Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as SMALLTALK, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
[0038] Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
[0039] These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
[0040] The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0041] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
[0042] Reference in the specification to "one embodiment" or "an embodiment" of the present invention, as well as other variations thereof, means that a feature, structure, characteristic, and so forth described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrase "in one embodiment" or "in an embodiment", as well any other variations, appearing in various places throughout the specification are not necessarily all referring to the same embodiment.
INTRODUCTION
[0043] The present teachings forecast precipitation using spatial data (radar and/or satellite images) and temporal data (power beacon measurements at various frequencies) to predict the chance of rain fade. The DL-based system outperforms current state-of-the-art machine learning-based algorithms in rain fade forecasting in both the near and long term. Cloud bottom-view image data (for example, radar data with weather condition information) may be more effective for short-term predictions. Cloud top-view image data may be more effective for long-term predictions. In some embodiments, a combination of cloud top-view and bottom-view image data may be used to make more effective long-term and short-term predictions. Rain fade refers to radio signal fading caused by rain. The effects of rain fade are seen more widely in higher frequency bands, such as the Ka-band, Q-band and V-band portions of the radio spectrum used by satellite and cellular communication systems.
[0044] For ground Radio Frequency (RF) gateway locations (primary gateway) subject to high rain fade, a satellite gateway can connect to a second antenna providing RF terminal (RFT) diversity. The RFT may be served by the primary gateway or by a different gateway, namely, a diversity gateway. The system may automatically select and switch between the antennae based on their respective rain fade. When a system can predict an occurrence of rain fade, it can proactively switch between the primary and diversity antennae/gateways to maintain the quality of service. Hence, rain fade forecasting enhances RFT gateway diversity switchover and switchback.
[0045] The link statuses for the links of an RF communication system and the spatial-temporal data from several RF gateways may be used to classify weather into fade or non-fade classes. In some embodiments, a 3-D convolutional neural network (CNN) may receive input data. The input data may include cloud top-view images (for example, from the Geostationary Operational Environmental Satellite 16 (GOES-16)), cloud bottom-view images, and link power data. The DL system extracts the necessary features from the input data to forecast rain fade. The present teachings include preprocessing the input data to prepare the data to train the DL system and to predict the rain fade.
[0046] Continuous weather imagery and monitoring of meteorological and space environment data is available, for example, from GOES-16 across North America. The data includes advanced imaging with high spatial resolution, for example, 16 spectral channels with a 5-minute scan frequency for accurate forecasts and timely warnings. A live or real-time feed and a full historical archive of Advanced Baseline Imager (ABI) radiance data (Level 1b) is available. In addition, a 1 km x 1 km resolution mosaic of National Weather Service (NWS) radar reflectivity activity as images, with a 5-minute scan frequency, is available.
The DL architecture processes satellite images, radar images and information about rain fade at gateways, such as the power received from the satellite at beacons installed at the gateways, and forecasts future rain fade events.
[0047] The present teachings may be used in satellite communications, cellular communications, and other line-of-sight communication systems, for example, to proactively switch between diverse satellite gateways, cellular base stations and the like before a rain fade event. Beacon data may be collected at a geolocation of a transceiver, for example, a satellite gateway, a cellular base station, or the like. A beacon may be a specific signal from a transmitter to a receiver, for example, from a satellite to a ground system. In some embodiments, a beacon is any transmission signal that is subject to atmospheric weather effects. The present description uses satellite communications for illustration.
SATELLITE COMMUNICATIONS
[0048] A satellite communication system includes four different links: 1) Gateway to satellite link, 2) Satellite to remote link, 3) Remote to satellite link, and 4) Satellite to gateway link. For each of these links, different implementations may be used to mitigate rain fade. For the gateway to satellite link, a satellite transponder includes automatic level control to mitigate rain fades to some level. In case of heavy rain fade, automatic uplink power control is activated to maintain the predefined received power at the satellite. To mitigate the rain fade effect on the satellite to remote and remote to satellite links, Adaptive Coding and Modulation (ACM) and adaptive inroute selection may be used. Rain fade on the satellite to gateway link is generally mitigated by the large size and gain of a gateway antenna.
PREPROCESSING OF TRAINING DATA
[0049] Fig. 1 illustrates an exemplary process to preprocess raw data to obtain balanced training data and run-time data according to various embodiments.
[0050] An exemplary process 100 may be used to preprocess raw data to obtain balanced training data. The process 100 may preprocess spatial image channels 102, radar images 104 and GW fade data 114 to obtain balanced training data 126. The process 100 may include an operation 106 to harmonize resolutions among the spatial image channels 102, an operation 107 to harmonize the CRS across images, an operation 108 to decompose rain labels of the radar images 104, an operation 110 to extract temporal images for areas of interest, an operation 112 to homogenize the input, including images, an operation 116 to extract fade events from the GW fade data 114, an operation 118 to match the extracted fade events of operation 116 with the temporal Aol spatial and radar images 112, and an operation 120 to balance the quantity of clear sky and rain fade events included in the balanced training data 126. Operation 120 may under-sample clear sky events at sub-operation 122 and oversample rain fade events at sub-operation 124.
[0051] Harmonize Resolution: In some embodiments, the resolution of images from different sources may be harmonized to an identical resolution per operation 106. For example, GOES-16 provides images of 16 spectral channels (0.47 μm - 13.3 μm) with a 5-minute sampling rate. However, this raw data poses some problems that need to be addressed. First, the channels have different spatial resolutions, varying from 0.000014 to 0.000056 radians in the Geostationary coordinate reference system (CRS). Therefore, either the channels with a higher resolution may be downsampled to match the minimum resolution of the channels, or the lower resolution channels may be upsampled to match the maximum resolution. Upsampling increases the sizes of the files (and consequently the processing requirements). If the lower resolution (0.000056 radians) is used, every pixel of the image covers approximately a 2 km x 2 km area on the US map.
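By way of illustration only, the downsampling alternative may be sketched as follows (a minimal Python/NumPy example; the function name, the factor of 4 implied by the 0.000014 to 0.000056 radian resolutions, and the array sizes are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

def downsample_to_coarsest(channel: np.ndarray, factor: int) -> np.ndarray:
    """Block-average a finer channel onto the coarsest grid; e.g. a factor
    of 4 maps 0.000014-radian pixels onto the 0.000056-radian grid."""
    h, w = channel.shape
    h, w = h - h % factor, w - w % factor  # trim so blocks tile evenly
    blocks = channel[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

fine = np.random.rand(1024, 1024).astype(np.float32)  # stand-in channel
coarse = downsample_to_coarsest(fine, factor=4)       # shape (256, 256)
```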
[0052] Harmonize CRS: In some embodiments, the coordinate systems of images from different sources may be harmonized per operation 107. For example, the Geostationary CRS of GOES-16 may be transformed to a Geodetic CRS, a more commonly used CRS that describes the location of each gateway in latitude and longitude. This transformation may not be needed for the radar data.
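For illustration, such a transformation may be sketched with the pyproj library (an assumption, not the patent's tooling; the geostationary parameters shown are typical of GOES-16 East but in practice would be read from the image metadata, and the fixed-grid scan angles in radians are scaled by the satellite height to meters before projection):

```python
from pyproj import Transformer

SAT_HEIGHT = 35786023.0  # perspective point height in meters (illustrative)
GEOS_CRS = f"+proj=geos +h={SAT_HEIGHT} +lon_0=-75.0 +sweep=x +ellps=GRS80"
to_geodetic = Transformer.from_crs(GEOS_CRS, "EPSG:4326", always_xy=True)

def fixed_grid_to_lonlat(x_rad: float, y_rad: float):
    """Convert geostationary fixed-grid scan angles (radians) to lon/lat."""
    return to_geodetic.transform(x_rad * SAT_HEIGHT, y_rad * SAT_HEIGHT)

print(fixed_grid_to_lonlat(0.0, 0.0))  # sub-satellite point, roughly (-75, 0)
```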
[0053] Extract Areas of Interest: The locations of gateways are Areas of Interest (Aol). In some embodiments, data for square areas centered on each Aol may be extracted from the original raw spatial and radar images per operation 110. The resolution of the extracted images depends on the size of a particular Aol. For example, a 32 pixel x 32 pixel image may cover an area of approximately 64 km x 64 km. Operation 110 stores the temporal Aol images. In the example of GOES-16 data and radar data, the temporal Aol images may include 16 plus 3 channels for each Aol (the location of an RF gateway or RFT).
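A minimal sketch of the extraction (assuming NumPy arrays in a harmonized CRS; the frame size, pixel coordinates and channel count are illustrative):

```python
import numpy as np

def extract_aoi(image: np.ndarray, row: int, col: int, size: int = 32) -> np.ndarray:
    """Crop a size x size window centered on a gateway's pixel; with ~2 km
    pixels a 32 x 32 crop covers roughly 64 km x 64 km."""
    half = size // 2
    return image[row - half:row + half, col - half:col + half]

frame = np.random.rand(1500, 2500, 19).astype(np.float32)  # 16 + 3 channels
aoi = extract_aoi(frame, row=700, col=1200)
assert aoi.shape == (32, 32, 19)
```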
[0054] Decomposing Weather Condition Channels: Data from different sources may code precipitation differently. The values of each pixel in the raw data may be decomposed and homogenized; for example, fade labels used by external data may be mapped to the fade labels used in the GW fade data 114. For example, some radar data uses values from 0 to 48, where 0 to 16 indicates an intensity of rain, 17 to 32 indicates an intensity of a mixture of snow and rain, and 33 to 48 indicates an intensity of snow. As such, each radar image may be decomposed into channels corresponding to rain, snow, and mix.
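For illustration, the decomposition of the 0 to 48 coding may be sketched as follows (Python/NumPy; mapping each class back to a 0 to 16 intensity by offset is an assumption about the coding, not a normative rule):

```python
import numpy as np

def decompose_radar(radar: np.ndarray) -> np.ndarray:
    """Split a 0-48 coded radar image into rain / mix / snow channels
    (0-16 rain, 17-32 rain+snow mix, 33-48 snow, per the coding above)."""
    rain = np.where(radar <= 16, radar, 0)
    mix = np.where((radar >= 17) & (radar <= 32), radar - 16, 0)
    snow = np.where(radar >= 33, radar - 32, 0)
    return np.stack([rain, mix, snow], axis=-1)

coded = np.random.randint(0, 49, size=(32, 32))
channels = decompose_radar(coded)  # shape (32, 32, 3)
```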
[0055] Homogenize Input: In some embodiments, the mean value of each channel may be subtracted from the pixels of that channel, and the result divided by the standard deviation of the channel. As such, the mean and the standard deviation of the input channels equal zero and one, respectively. Formally, if p_ij^c is a pixel of an image from channel c located at the i-th row and the j-th column, then the homogenized pixel will be p̂_ij^c = (p_ij^c - m_c) / s_c, where m_c and s_c are the sample mean and the sample standard deviation of channel c. In some embodiments, access to all the images may be needed to derive m_c and s_c. In some embodiments, a running approach, for example, Welford's online algorithm, may be used to calculate and update the mean and standard deviation values rather than accessing all the images.
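A minimal sketch of the running approach (Python/NumPy; the per-pixel update loop and channel count are illustrative, and the usage at the end assumes the statistics have already been accumulated over enough images):

```python
import numpy as np

class RunningStats:
    """Welford's online algorithm: per-channel mean/std without holding
    every image in memory."""
    def __init__(self, n_channels: int):
        self.n = 0
        self.mean = np.zeros(n_channels)
        self.m2 = np.zeros(n_channels)

    def update(self, image: np.ndarray):
        for x in image.reshape(-1, image.shape[-1]):  # one pixel at a time
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)

    @property
    def std(self):
        return np.sqrt(self.m2 / max(self.n - 1, 1))

stats = RunningStats(n_channels=19)
stats.update(np.random.rand(32, 32, 19))
homogenized = (np.random.rand(32, 32, 19) - stats.mean) / stats.std
```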
[0056] Ground Truth Extraction: For ground truth, beacon measurements taken at gateways and included in the GW fade data 114 may be compared to a rain fade threshold. The system may extract rain fade events per operation 116 and match their time and Aol samples per operation 118 with the temporal Aol images 112. During training, a beacon data sample with a sampling duration, for example, 1-minute sampling, may be used. For each time sample, a minimum beacon value within the past five minutes, a label for the past five minutes, and a label for the future five minutes may be derived. For past or future labels, a fixed label (for example, 1) may be used to indicate that the minimum beacon value of the past or future 5 minutes is less than the rain fade threshold. For instance, given three consecutive sampling time instances, namely t1, t2, and t3, the minimum beacon value between t1 and t2 may be used to define the past label at time instance t2, and the minimum beacon value between t2 and t3 may be used to define the future label at that time instance. The resulting sample and ground truth for the past 5 minutes are termed the "current beacon value" and "current rain fade status," and the resulting label for the future 5 minutes is termed the "target label." The current beacon value and current rain fade status may be used along with the spatial and radar data to improve the system's accuracy. The sampling rate of the spatial and radar data (for example, 5 minutes) may be less frequent than the sampling rate of the beacon data (for example, 1 minute). The most recent spatial or radar image is used by the model between two image sampling time steps. The sampling rate of image data for training may be different than the sampling rate of image data in practice.
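The past/future labeling may be sketched as follows (a simplified Python example over a 1-minute beacon series; the threshold value and the synthetic series are illustrative assumptions):

```python
import numpy as np

RAIN_FADE_THRESHOLD = -10.0  # dB relative to clear sky; illustrative value

def label_beacon_series(beacon: np.ndarray, window: int = 5):
    """Per time step: 1 if the minimum beacon value in the past (or next)
    `window` minutes falls below the rain fade threshold, else 0."""
    past, future = [], []
    for t in range(window, len(beacon) - window):
        past.append(int(beacon[t - window:t].min() < RAIN_FADE_THRESHOLD))
        future.append(int(beacon[t:t + window].min() < RAIN_FADE_THRESHOLD))
    return np.array(past), np.array(future)

beacon = np.random.normal(0.0, 4.0, size=600)  # ten hours of 1-minute samples
current_state, target_label = label_beacon_series(beacon)
```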
[0057] Balance Data: The current rain fade states are extremely imbalanced, as less than 1% of the samples may be labeled as rain fade due to the weather conditions at these locations. As such, using all of the samples would introduce a bias into the model and increase the number of false negative (FN) predictions. Under-sampling of the clear weather (no rain fade) samples and oversampling of the rain fade samples may be used to balance the number of samples for the true (rain fade) and false (clear) cases. To under-sample, some of the clear sky instances may be periodically dropped from the training. To oversample, multiple copies of the rain fade instances may be used for the training. Oversampling the rain fade images and under-sampling the clear sky images balances the ratio between the number of true and false samples.
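A minimal balancing sketch (the keep-every-k and copy-count parameters are illustrative and would be tuned to the actual class ratio):

```python
import numpy as np

def balance(indices_clear, indices_fade, keep_every=50, copies=5):
    """Under-sample clear-sky samples (keep every k-th) and oversample rain
    fade samples (repeat each several times), then shuffle."""
    kept_clear = indices_clear[::keep_every]
    repeated_fade = np.repeat(indices_fade, copies)
    balanced = np.concatenate([kept_clear, repeated_fade])
    np.random.shuffle(balanced)
    return balanced

clear = np.arange(100_000)          # ~99% of samples
fade = np.arange(100_000, 101_000)  # <1% of samples
train_indices = balance(clear, fade)
```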
[0058] In some embodiments, the balanced training data 126 may include spatial images (for example, GOES-16 images), radar images, beacon power levels at Aols around GWs, and rain fade states for each clear sky and rain event used for training.
[0059] When processing live raw data, after the matching operation 118, the live/real-time data 128 may be evaluated with a trained NN to forecast rain fade events. As such, the process 100 may preprocess live spatial image channels 102, live radar images 104 and live GW fade data 114, with the exception of the balancing operation 120, to forecast rain fade events in near real-time (within a few seconds). The forecasts may be used to manage gateway diversity, for example, as illustrated by the rain fade forecast method 300 of FIG. 3.
DEEP LEARNING ARCHITECTURE
[0060] FIG. 2 illustrates a deep learning system to forecast rain fade according to various embodiments.
[0061] A deep learning system 200 to forecast rain fade may include a hierarchy of neural network computation layers. In FIG. 2, column 1 identifies a NN type, column 2 lists exemplary parameters/environment for the NN, and column 3 lists the NN output and the format of the NN output. The system 200 includes the NNs identified in column 1. The system 200 may invoke the identified NNs in the sequence detailed in FIG. 2. The system 200 includes a pre-processor to harmonize and homogenize raw data from various sources and produce the balanced data as described above.
[0062] As shown in FIG. 2, the system 200 may use multiple layers (204, 208, 212, 216) of a 3D CNN to capture the spatio-temporal interdependencies of the spatial and radar images. In some embodiments, a Long Short-Term Memory (LSTM)-2D CNN may be used in the DL system 200. While a 2D CNN may extract spatial features from an input image, a 3D CNN (or an LSTM-CNN) block can learn the temporal relationship between the input images. In some embodiments, a 2D CNN may be used instead of a 3D CNN in the multiple layers (204, 208, 212, 216). For example, a first layer 204 may extract the interdependencies between the channels, and the second layer 208 or later layers 212, 216 may find the rainy weather forecasting features of the images. After every CNN layer, a pooling layer (206, 210, 214, 218) may be used to reduce the size of the input. The CNN may use non-linear rectifier (ReLU) activation as specified in the second column of FIG. 2.
[0063] One of the pooling layers of the CNN may include a flattening functionality to flatten a 3D (or 2D) input into a 1D output, for example, after the last CNN layer 216. In FIG. 2, pooling layer 218 may include the flattening functionality. One of the multiple layers of the CNN may include a dense layer for learning the relationship between the input images and the probability of rain fade, for example, after the last CNN layer 216. In FIG. 2, pooling layer 218 may include a dense layer. An activation function of the last layer, for example, layer 218, may map the output of the dense layer to a probability value between 0 and 1. The final layer's activation function may be chosen to be a softmax layer 220.
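For illustration only, the layer sequence described above may be sketched as follows (assuming TensorFlow/Keras; the filter counts, kernel sizes, pooling shapes and channel count are assumptions rather than the exemplary parameters of FIG. 2):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_rain_fade_model(n_past: int = 6, n_channels: int = 22):
    """3D-CNN sketch: Conv3D + pooling blocks, flatten, dense, softmax."""
    model = models.Sequential([
        layers.Input(shape=(n_past, 32, 32, n_channels)),
        # First block extracts interdependencies between channels.
        layers.Conv3D(32, (3, 3, 3), padding="same", activation="relu"),
        layers.MaxPooling3D(pool_size=(1, 2, 2)),
        # Later blocks extract rain-forecasting spatio-temporal features.
        layers.Conv3D(64, (3, 3, 3), padding="same", activation="relu"),
        layers.MaxPooling3D(pool_size=(1, 2, 2)),
        layers.Conv3D(64, (3, 3, 3), padding="same", activation="relu"),
        layers.MaxPooling3D(pool_size=(2, 2, 2)),
        layers.Conv3D(128, (3, 3, 3), padding="same", activation="relu"),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        # Two-way softmax maps the dense output to P(clear), P(rain fade).
        layers.Dense(2, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_rain_fade_model()
```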
PREPARATION OF THE INPUT DATA FOR THE DL MODEL
[0064] Although spatial and radar images are the main sources of input for training the DL model, ground information may be attached to them. The ground information may include GW locations, a current rain fade state of each GW for each input sample interval, and one or more current beacon measurements at each GW.
[0065] In some embodiments, the ground information may be integrated by adding a gateway channel and a beacon channel to the image data. The gateway channel may include the ground information for all the Aol or gateways of a rain zone. The beacon channel may include the ground information for all the beacons in the Aol or gateways of a rain zone. Coverage areas may be separated into rain zones per their expected rain patterns. The gateway channel or the beacon channel may use a matrix to convey the ground information. Thus, in this embodiment, letting n_p be the number of samples from the past, an input sample to the CNN may have an n_p x 32 x 32 x (n_GOES + n_radar + 1 + 1 + 1) shape.
[0066] In some embodiments, the ground information may be integrated by adding extra channels to the image data. For the GW locations, a one-hot encoding may be used for each GW (meaning that for n_g GWs, n_g extra channels are added). All the pixels of the n_g GW channels may have a zero value, except the pixels of the i-th channel, which have a value of one when the spatial and/or radar images are for the i-th GW. This input allows multiple gateways to share the same prediction model. n_g extra channels may also be added to indicate when the i-th GW is in rain fade, for example, by setting all pixels in the rain fade channels to +1s if the current state of the GW is rain fade and to -1s otherwise. These rain fade channels provide the ground truth about the rain fade of the given gateway in the recent past at the given time. In some embodiments, historical beacon data for each gateway may be used to bucketize the beacon measurements into n_b buckets such that each bucket has an approximately equal number of samples. For each bucket, the two values that define the two ends of the bucket may be used. Then, among the n_b extra channels, when the current beacon value falls into the i-th bucket, the i-th channel may be defined as 1s and the other channels as -1s (one-hot encoding).
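By way of illustration, the extra ground-information channels for one sample may be assembled as follows (Python/NumPy; the exact channel layout used here, namely one set of gateway-identity channels, a single rain fade channel, and n_b beacon-bucket channels, is one assumed reading of the variants described above):

```python
import numpy as np

def ground_info_channels(gw_index, n_gateways, in_rain_fade,
                         beacon_value, bucket_edges, size=32):
    """Extra 32 x 32 channels: one-hot gateway identity, +1/-1 current rain
    fade state, one-hot beacon bucket from equal-count historical edges."""
    gw = np.zeros((size, size, n_gateways), dtype=np.float32)
    gw[:, :, gw_index] = 1.0  # one-hot gateway identity

    fade = np.full((size, size, 1), 1.0 if in_rain_fade else -1.0,
                   dtype=np.float32)

    n_buckets = len(bucket_edges) - 1
    b = int(np.clip(np.searchsorted(bucket_edges, beacon_value) - 1,
                    0, n_buckets - 1))
    beacon = -np.ones((size, size, n_buckets), dtype=np.float32)
    beacon[:, :, b] = 1.0  # 1s in the bucket that holds the current value

    return np.concatenate([gw, fade, beacon], axis=-1)

edges = np.quantile(np.random.normal(size=10_000), np.linspace(0, 1, 9))
extra = ground_info_channels(2, n_gateways=4, in_rain_fade=False,
                             beacon_value=-0.3, bucket_edges=edges)
```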
[0067] Thus, considering n_GOES and n_radar to be the number of channels from the GOES-16 and radar sources, in some embodiments the system may at every time step have n_GOES + n_radar + n_g + n_b + 1 channels. In some embodiments, when only GOES-16 or only radar data is fed to the model, the number of channels is chosen to be n_GOES + n_g + n_b + 1 or n_radar + n_g + n_b + 1, respectively. The input data for each time step will then have a 32 x 32 x (n_GOES + n_radar + n_g + n_b + 1) size, where 32 is the number of pixels in each direction of the GOES-16 and radar images. In addition, images from multiple steps in the past may be fed to the 3D CNN to capture the temporal behavior of the input images. Thus, in this embodiment, letting n_p be the number of samples from the past, an input sample to the CNN may have an n_p x 32 x 32 x (n_GOES + n_radar + n_g + n_b + 1) shape.
[0068] To train the model, the data may be split into a training set and a test set. The training set may include the first 80% of the preprocessed data, while the remaining 20% may be kept for testing the model. The under-sampling and oversampling steps of the preprocessing (operation 120) are done only on the training set. The trained version of the system 200 may be used to evaluate future unseen samples, for example, in near real-time, by rain zone.
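Because the samples are time-ordered, the split is chronological rather than random; a trivial sketch:

```python
import numpy as np

def chronological_split(samples: np.ndarray, train_frac: float = 0.8):
    """First 80% of time-ordered samples for training, last 20% for testing;
    balancing (operation 120) is applied to the training portion only."""
    cut = int(len(samples) * train_frac)
    return samples[:cut], samples[cut:]

data = np.arange(10_000)  # stand-in for time-ordered sample indices
train, test = chronological_split(data)
```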
RAIN FADE FORECAST
[0069] FIG. 3 illustrates a rain fade forecast method according to various embodiments.
[0070] A rain fade forecast method 300 may include an operation 302 to divide a coverage area into rain zones per their expected rain patterns. For example, United States rainfall climatology may generally be described as having the following rain zones. The eastern part of the contiguous United States east of the 98th meridian, the mountains of the Pacific Northwest, the Willamette Valley, and the Sierra Nevada range are the wetter portions of the nation, with average rainfall exceeding 30 inches (760 mm) per year. The drier areas are the Desert Southwest, Great Basin, valleys of northeast Arizona, eastern Utah, and central Wyoming. Increased warming within urban heat islands leads to an increase in rainfall downwind of cities. The rain zones of the present teachings may be defined along climatology rainfall zones, may merge climatology rainfall zones, or may subdivide climatology rainfall zones. The defining of the rainfall zones may be done for logistical reasons by a network operator.
[0071] The rain fade forecast method 300 may include an operation 310 to provision a rain zone forecaster. The provisioning 310 may include an operation 312 to identify the Aol in the rain zone. The provisioning 310 may include an operation 314 to pre-process training data for the rain zone. Exemplary pre-processing of operation 314 may be performed per FIG. 1. The provisioning 310 may include an operation 316 to train a NN for the Aol in a rain zone. The NN may be the system 200 of FIG. 2. The provisioning 310 may include an operation 318 to generate a rain zone forecaster. The rain zone forecaster includes the NN after training. In the rain zone forecaster, further learning by the NN when evaluating live/real-time/non-training/test raw data may be disabled. The provisioning 310 may include an operation 320 to deploy a rain zone forecaster for each rain zone in a coverage area. The one or more rain zone forecasters may be deployed in a Network Operations Center.
[0072] The rain fade forecast method 300 may include an operation 330 to manage GW diversity. The managing operation 330 may include an operation 332 to collect evaluation raw data, for example, satellite images, radar images, and gateway and beacon measurements. The managing operation 330 may include an operation 334 to pre-process the evaluation raw data. The pre-processing may skip the balance training data operation, for example, operation 120 of FIG. 1. The managing operation 330 may include an operation 336 to forecast rain fade for all or some of the geolocations of beacons included in the evaluation data. When DL systems are deployed per rain zone, evaluation data may be used for forecasting by one or more DL systems. In some embodiments, particularized data streams/channels may be established for each rain zone. The managing operation 330 may include an operation 338 to notify a diversity controller of predicted rain fade. The notifications may be classified by the imminency of the expected rain fade, for example, within 1 minute, within 5 minutes, within 30 minutes, within an hour or the like. The managing operation 330 may include an operation 340 to replace, prior to the rain fade occurring, a primary GW with an available diversity GW not subject to rain fade. In some embodiments, the notifications may be used to schedule diversity GW usage, notify a Network Operations Center, notify subscribers and the like.
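A sketch of operations 338 and 340 (the controller interface, the forecast record, and the 0.5 decision threshold are hypothetical, for illustration only):

```python
from dataclasses import dataclass

FADE_PROBABILITY_THRESHOLD = 0.5  # illustrative decision threshold

@dataclass
class Forecast:
    gateway_id: str
    horizon_min: int          # imminency class, e.g. 1, 5, 30, 60
    fade_probability: float

class LoggingController:
    """Stand-in diversity controller for the sketch."""
    def notify(self, gw, imminency):
        print(f"notify {gw}: fade expected within {imminency} min")
    def switch_to_diversity(self, gw):
        print(f"switching {gw} to diversity antenna/gateway")

def notify_and_switch(forecasts, controller):
    """Classify forecasts by imminency, notify the diversity controller,
    and switch proactively before an imminent fade occurs."""
    for f in sorted(forecasts, key=lambda f: f.horizon_min):
        if f.fade_probability >= FADE_PROBABILITY_THRESHOLD:
            controller.notify(f.gateway_id, imminency=f.horizon_min)
            if f.horizon_min <= 5:  # imminent: switch now
                controller.switch_to_diversity(f.gateway_id)

notify_and_switch([Forecast("GW-7", 5, 0.82), Forecast("GW-7", 60, 0.41)],
                  LoggingController())
```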
EXPERIMENTAL RESULTS - Evaluation metrics

[0073] Four exemplary terms may be used to evaluate the performance of the model:
• True-positive (TP): A rain fade event correctly classified as rain fade.
• False-positive (FP): A clear sky event incorrectly classified as rain fade.
• True-negative (TN): A clear sky event correctly classified as clear sky.
• False-negative (FN): A rain fade event incorrectly classified as clear sky.

[0074] Exemplary evaluation metrics may be used. Closeness of predictions to their actual labels may be defined as accuracy = (TP + TN) / (TP + TN + FP + FN). The fraction of TP instances among the positive instances predicted by the model may be defined as precision = TP / (TP + FP). The fraction of TP instances among the actual (ground truth) positive instances may be defined as recall = TP / (TP + FN). A harmonic mean of precision and recall, which allows these two metrics to be combined, may be defined as F1 = 2 x (precision x recall) / (precision + recall). The F1-score evaluates the model during the training phase to find a model that has both good precision and recall rates.
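The four metrics follow directly from the four counts; a small sketch with illustrative counts:

```python
def evaluation_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Accuracy, precision, recall and F1 from TP/FP/TN/FN counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

print(evaluation_metrics(tp=450, fp=50, tn=460, fn=40))
```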
EXPERIMENTAL RESULTS - Dataset

[0075] To evaluate the model, the data is labeled by aggregating the beacon measurements of each gateway and using weighted averaging to derive the clear sky threshold for each time step (for example, clear sky threshold 404). The beacon measurements of each day are compared to this threshold.
[0076] Fig. 4 illustrates exemplary beacon measurements for a sample gateway according to various embodiments.
[0077] A beacon measurement chart 400 illustrates a current beacon value 402 (in decibels) and rain fade instances 406 recorded by a GW over time. A clear sky threshold 404 for adequate link performance is also illustrated. In some embodiments, the clear sky threshold 404 may vary. The illustrated beacon measurements, clear sky threshold, and rain fade cases are for a single gateway.
EXPERIMENTAL RESULTS - Experiments
[0078] A DL system was provided input imagery (radar and satellite) from the past 30 minutes. The DL system correctly predicted rain fade 60 minutes into the future. The DL system may predict a long-term rain fade event, for example, as far as 60 minutes in the future. A DL system may be trained for different target future time predictions, for example, from 5 minutes to 65 minutes into the future. The DL system was trained on three imagery input scenarios: a) satellite (GOES-16) only, b) radar only, and c) satellite and radar together. The input for all three scenarios also included beacon data information.
[0079] FIG. 5A, FIG. 5B, FIG. 5C and FIG. 5D illustrate accuracy, recall, precision and F1-score, respectively, comparing the three imagery input scenarios and some prior art models on test data of one RF gateway according to various embodiments.
[0080] FIG. 5A illustrates an accuracy plot 500 plotting prediction accuracy made using (a) radar and beacon information 502, (b) GOES-16 and beacon information 504, (c) radar, GOES-16 and beacon information 506, (d) the SVM model (prior art) 508 and (e) the MLP model (prior art) 509.
[0081] FIG. 5B illustrates a recall plot 510 plotting prediction recall made using (a) radar and beacon information 512, (b) GOES-16 and beacon information 514, (c) radar, GOES-16 and beacon information 516, (d) the SVM model (prior art) 518 and (e) the MLP model (prior art) 519.
[0082] FIG. 5C illustrates a precision plot 530 plotting prediction precision made using (a) radar and beacon information 532, (b) GOES-16 and beacon information 534, (c) radar, GOES-16 and beacon information 536, (d) the SVM model (prior art) 538 and (e) the MLP model (prior art) 539.
[0083] FIG. 5D illustrates an F1-score plot 540 plotting prediction F1-score made using (a) radar and beacon information 542, (b) GOES-16 and beacon information 544, (c) radar, GOES-16 and beacon information 546, (d) the SVM model (prior art) 548 and (e) the MLP model (prior art) 549. Per FIG. 5D, the F1-score of the present teachings outperforms the F1-scores of the prior art teachings. In particular, the DL system trained with only radar and beacon information 542 outperforms the other scenarios for short-term forecasting in terms of F1-score. The DL system trained only on GOES-16 and beacon information 544 outperforms the other scenarios in long-term forecasting. Without limitation, this may be because the GOES-16 images track the movements of the clouds while the radar images carry the weather condition records. Thus, for a short-term prediction the radar data is more effective, while for a long-term prediction the GOES-16 data is more effective.
[0084] The present teachings outperform the other state-of-the-art ML models, especially for long-term predictions. The prior art systems are ML-based rain fade prediction models that only use time series data. The beacon information was used as the time series input for the MLP model 549 (Multi-Layer Perceptron) and the SVM model 548 (Support Vector Machine).
[0085] FIG. 6 illustrates Receiver Operating Characteristic (ROC) curves for a long-term prediction scenario of the present teachings versus two ML-based models according to various embodiments.
[0086] A ROC curve depicts the trade-off between the TP rate (TPR) and the FP rate (FPR) by plotting TPR versus FPR at various thresholds. Lowering the classification threshold causes more observations to be classified as positive, increasing the TP rate. The ROC curve 602 of the DL system is closer to the top left of the graph and achieves a high TPR while maintaining a low FPR. The ROC curve 604 for an MLP classifier (in particular) and the ROC curve 606 for an SVM classifier illustrate that the two prior art classifiers cannot distinguish well between the two classes. A ROC curve that is closer to the diagonal, such as ROC curves 604 and 606, implies a lower TPR and a higher FPR. The Area Under the ROC Curve (AUC) measures performance across all possible classification thresholds. The ROC curve 602 of the present teachings has a higher AUC than the prior art ROC curves 604, 606. The AUC of the ROC curve 602 implies that the DL system of the present teachings better predicts the probability of rain fade than the probability of clear sky.
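An ROC/AUC computation of this kind may be sketched with scikit-learn (the labels and probabilities below are synthetic stand-ins, not the experimental data of FIG. 6):

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

y_true = np.random.randint(0, 2, size=1000)                        # ground truth
y_prob = np.clip(y_true * 0.6 + np.random.rand(1000) * 0.5, 0, 1)  # model output

fpr, tpr, thresholds = roc_curve(y_true, y_prob)
print(f"AUC = {auc(fpr, tpr):.3f}")  # closer to 1.0 = better class separation
```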
[0087] FIG. 7 illustrates a confusion matrix of the present teachings when predicting rain fade 60 minutes in the future in various embodiments.
[0088] According to FIG. 7, with a classification threshold of 0.5, the present teachings accurately predict rain fade and clear sky events almost 12 times more often than the false labels ((TP + TN) / (FN + FP) ≈ 12). This illustrates the effectiveness of the present teachings in terms of forecasting rain fade.
[0089] Having described preferred embodiments of a system and method (which are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings. It is therefore to be understood that changes may be made in the embodiments disclosed which are within the scope of the invention as outlined by the appended claims. Having thus described aspects of the invention, with the details and particularity required by the patent laws, what is claimed and desired protected by Letters Patent is set forth in the appended claims.

CLAIMS

We claim as our invention:
1. A method for predicting rain fade for a rain zone using a deep learning system comprising a computer processor, the method comprising:
training a Neural Network (NN) by importing into the NN a training set of image information and beacon information, wherein the image information comprises image datasets comprising a cloud view of an Area of Interest (Aol), a geolocation and a timestamp, and the beacon information comprises beacon datasets comprising a beacon strength, a current rain fade state, a geolocation and a timestamp;
pre-processing to homogenize and to extract spatially and temporally matching data for the Aol from a live image information and a live beacon information; and
forecasting a rain fade based on the data in a near-future,
wherein the geolocation of one or more of the beacon datasets is located within the Aol, a beacon periodicity of the live beacon information is less than or equal to five (5) minutes, and an image periodicity of the live image information is less than or equal to five (5) minutes.
2. The method of claim 1, wherein the near-future is less than or equal to sixty-five (65) minutes.
3. The method of claim 1, wherein the image periodicity is different than the beacon periodicity, and the method further comprises using a previous copy of the live beacon information or the live image information.
4. The method of claim 1, wherein the image periodicity is different than the beacon periodicity, and the method further comprises skipping a previous copy of the live beacon information or the live image information as necessary for the matching.
5. The method of claim 1, wherein the beacon information is collected at a satellite transceiver and a geolocation of the satellite transceiver is located within the Aol.
6. The method of claim 1, wherein the cloud view comprises a top-view from a satellite of the Aol or a bottom-view from a radar of the Aol or a combination thereof.
7. The method of claim 1, wherein the live image information comprises a radar image of the Aol and a ground truth for the Aol.
8. The method of claim 7, wherein the ground truth comprises a rain label and the pre-processing harmonizes the rain labels with current rain fade states of the beacon information.
9. The method of claim 1, wherein the live image information comprises an image of the Aol from a high-altitude platform or satellite and the image comprises images at various spectra.
10. The method of claim 1, wherein the pre-processing harmonizes the live image information to an image resolution.
11. The method of claim 1, wherein the pre-processing harmonizes a coordinate system of the live image information and the live beacon information.
12. The method of claim 1, wherein the training set balances a quantity of clear sky events as compared to a quantity of rain fade events.
13. The method of claim 1, wherein the NN processes the data using a 3D convolutional neural network.
14. The method of claim 1, wherein the NN successively processes the data using a 3D convolutional NN, a max pool, a flattening NN and a softmax NN.
15. The method of claim 1, wherein the Aol covers a ground area of at least 32 km x 32 km.
16. The method of claim 1, wherein the Aol is centered over the geolocation of one or more of the beacon datasets.
17. The method of claim 1, wherein the Aol comprises a plurality of Aol, the plurality of Aol are located within a rain zone and the evaluating predicts the rain fade for the plurality of Aol.
18. The method of claim 1, further comprising proactively managing gateway diversity based on the forecasting.
19. A method for predicting rain fade for a rain zone using a deep learning system comprising a computer processor, the method comprising:
training a Neural Network (NN) by importing into the NN a training set of image information and beacon information, wherein the image information comprises image datasets comprising a cloud view of an Area of Interest (Aol), a geolocation and a timestamp, and the beacon information comprises beacon datasets comprising a beacon strength, a current rain fade state, a geolocation and a timestamp;
pre-processing to homogenize and to extract spatially and temporally matching data for the Aol from a live image information and a live beacon information; and
forecasting a rain fade based on the data in a near-future,
wherein the geolocation of one or more of the beacon datasets is located within the Aol, the near-future is less than or equal to sixty-five (65) minutes, the beacon information is collected at a satellite transceiver and a geolocation of the satellite transceiver is located within the Aol, the live image information comprises an image of the Aol from a satellite, a radar image of the Aol and a ground truth for the Aol, and the NN processes the data using a 3D convolutional neural network.
20. The method of claim 19, wherein the Aol comprises a plurality of Aol, the plurality of Aol are located within a rain zone and the evaluating predicts the rain fade for the plurality of Aol.
PCT/US2022/073767 2021-07-19 2022-07-15 Deep learning for rain fade prediction in satellite communications WO2023004260A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA3225182A CA3225182A1 (en) 2021-07-19 2022-07-15 Deep learning for rain fade prediction in satellite communications

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163203351P 2021-07-19 2021-07-19
US63/203,351 2021-07-19
US17/453,258 US20230019771A1 (en) 2021-07-19 2021-11-02 Deep Learning for Rain Fade Prediction in Satellite Communications
US17/453,258 2021-11-02

Publications (1)

Publication Number Publication Date
WO2023004260A1 true WO2023004260A1 (en) 2023-01-26

Family

ID=82932531

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/073767 WO2023004260A1 (en) 2021-07-19 2022-07-15 Deep learning for rain fade prediction in satellite communications

Country Status (2)

Country Link
CA (1) CA3225182A1 (en)
WO (1) WO2023004260A1 (en)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210058293A1 (en) * 2019-08-20 2021-02-25 Hughes Network Systems, Llc Gateway diversity switching

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHEN HAONAN ET AL: "A Machine Learning System for Precipitation Estimation Using Satellite and Ground Radar Network Observations", IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, IEEE, USA, vol. 58, no. 2, 1 February 2020 (2020-02-01), pages 982 - 994, XP011767306, ISSN: 0196-2892, [retrieved on 20200120], DOI: 10.1109/TGRS.2019.2942280 *
MISHRA KUMAR VIJAY ET AL: "Deep Rainrate Estimation from Highly Attenuated Downlink Signals of Ground-Based Communications Satellite Terminals", ICASSP 2020 - 2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), IEEE, 4 May 2020 (2020-05-04), pages 9021 - 9025, XP033794385, DOI: 10.1109/ICASSP40776.2020.9054729 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116073893A (en) * 2023-04-06 2023-05-05 西安空间无线电技术研究所 Load system and method for calibrating atmospheric transmission characteristics of multi-band millimeter wave signals
CN116073893B (en) * 2023-04-06 2023-07-18 西安空间无线电技术研究所 Load system and method for calibrating atmospheric transmission characteristics of multi-band millimeter wave signals

Also Published As

Publication number Publication date
CA3225182A1 (en) 2023-01-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22755037; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 3225182; Country of ref document: CA)
REG Reference to national code (Ref country code: BR; Ref legal event code: B01A; Ref document number: 112024001052; Country of ref document: BR)
WWE Wipo information: entry into national phase (Ref document number: 2022755037; Country of ref document: EP)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2022755037; Country of ref document: EP; Effective date: 20240219)