WO2002067013A2 - Method and apparatus for short-term prediction of convective weather - Google Patents
- Publication number
- WO2002067013A2 (PCT/US2002/004512)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- meteorological
- filtering
- interest
- growth
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01W—METEOROLOGY
- G01W1/00—Meteorology
- G01W1/10—Devices for predicting weather conditions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/95—Radar or analogous systems specially adapted for specific applications for meteorological use
- G01S13/951—Radar or analogous systems specially adapted for specific applications for meteorological use ground based
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Definitions
- the invention relates generally to weather image processing and more specifically to image processors that generate short-term predictions of convective weather based on determination of the growth and decay of convective weather events.
- Short-term weather predictions (e.g., 10-120 minutes) of the location and severity of storms are extremely important to many sectors of the population. For example, aviation systems, traffic information systems, power companies and commuters realize important safety and economic benefits from accurate predictions of storms. Short-term forecasts are particularly important for convective storms, such as thunderstorms, in which individual cells can exhibit a lifecycle shorter than the short-term forecast period. The challenge for any short-term forecaster is generating a forecast that is both accurate and reliable. [0005] Some methods of generating short-term convective weather forecasts are partially automated, relying on operator intervention. These approaches can offer acceptable predictive performance; however, they can require significant operator interaction.
- the present invention relates to an automated weather forecaster processing meteorological images from remotely sensed weather indicators to estimate the short- term growth and decay of convective meteorological events, such as thunderstorms.
- the present invention overcomes many of the disadvantages of prior art systems by providing a fully- automated system that provides substantial improvements in accuracy and reliability.
- the invention relates to a process for a computer- assisted prediction of near-term development of convective meteorological events including the steps of determining a difference image by advecting a first meteorological image and combining the advected first meteorological image and a second meteorological image.
- the first and second meteorological images include data indicative of a first forecast parameter at a first time and a second time, respectively.
- the process further includes the steps of generating an interest image including a region of interest by filtering a third meteorological image and generating a growth image indicative of the occurrence of a convective meteorological event by combining the difference image and the interest image.
- growth indicates both positive growth and negative growth (i.e., decay).
- the step of determining the difference image includes subtracting the advected first meteorological image from the second meteorological image. In another embodiment the difference image is determined by averaging a plurality of preliminary difference images. In yet another embodiment the step of generating the interest image includes filtering the third meteorological image to generate a large-scale-feature image and a small-scale-feature image. [0010] In one embodiment the step of generating the large-scale-feature image comprises low-pass filtering the third meteorological image. In another embodiment the step of low-pass filtering comprises neighborhood-average filtering. In another embodiment the step of generating the small-scale-feature image comprises high-pass filtering the third meteorological image. In another embodiment the step of high-pass filtering comprises neighborhood-standard-deviation filtering.
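- The neighborhood-average (low-pass) and neighborhood-standard-deviation (high-pass) filtering steps above can be sketched as follows. This is a minimal illustration, not part of the patent disclosure; the function names, the window size, and the use of SciPy are assumptions:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def large_scale(image, size=11):
    """Low-pass filter: neighborhood average over a size x size window."""
    return uniform_filter(image.astype(float), size=size)

def small_scale(image, size=11):
    """High-pass proxy: neighborhood standard deviation, computed via
    Var[x] = E[x^2] - (E[x])^2 over the same window."""
    img = image.astype(float)
    mean = uniform_filter(img, size=size)
    mean_sq = uniform_filter(img * img, size=size)
    return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))
```

A uniform region yields zero neighborhood standard deviation, so only small-scale texture (e.g., individual cells) survives the high-pass step.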
- the step of generating the interest image further includes generating a peakiness image identifying cloud peaks indicative of convective weather.
- the step of generating the peakiness image includes the step of subtracting the large-scale, or average, meteorological image from the third meteorological image.
- generating a forecast includes the additional steps of combining the growth image and the first meteorological image to generate a forecast image identifying the likelihood of convective meteorological events at a third time and advecting the combined image to the third time.
- generating a forecast includes the additional step of classifying weather elements of the first meteorological image, the classifications can be one or more from the group including lines, stratiform regions, large cells, and small cells.
- the invention in another aspect, relates to a process for a computer-assisted prediction of near-term development of convective meteorological events including the steps of determining a difference image by advecting a first precipitation image, and combining the advected first precipitation image and a second precipitation image, the first and second precipitation images indicative of precipitation at a first time and a second time, respectively; generating an interest image including a region of interest by filtering the second precipitation image; and generating a growth image indicative of the occurrence of a convective meteorological event by combining the difference image with the interest image.
- the step of combining the advected first precipitation image and the second precipitation image includes subtracting the advected first precipitation image from the second precipitation image.
- the precipitation image includes data representative of vertically integrated liquid (VIL) water content.
- the step of generating the interest image includes filtering the first precipitation image to generate a large-scale-feature image; and filtering the first precipitation image to generate a small-scale-feature image.
- the step of generating the large-scale-feature image comprises low-pass filtering the first precipitation image.
- the step of generating the small-scale-feature image comprises high-pass filtering the first precipitation image.
- Another embodiment includes the additional steps of generating a forecast image identifying the likelihood of convective meteorological events at a third time by combining the growth image and a current precipitation image; and advecting the combined image to the third time.
- generating a forecast includes the additional steps of advecting a growth image according to a first advection field; advecting a current precipitation image according to a second advection field, and combining the advected growth image and advected current precipitation image to generate a forecast image.
- the invention relates to a process for a computer-assisted prediction of near-term development of convective meteorological events including the steps of determining a difference image by advecting a first infrared meteorological image and combining the advected first infrared image and a second infrared meteorological image, the first and second infrared meteorological images indicative of cloud temperature at a first time and a second time, respectively; generating an interest image including a region of interest by filtering a satellite visible meteorological image; and generating a growth image indicative of the occurrence of a convective meteorological event by combining the difference image and the interest image.
- the step of combining the advected first infrared meteorological image and the second infrared meteorological image includes subtracting the advected first infrared meteorological image from the second infrared meteorological image. In another embodiment the step of generating the interest image includes filtering the satellite visible meteorological image to generate a large-scale-feature image and filtering the satellite visible meteorological image to generate a small-scale-feature image.
- the step of generating the large-scale-feature image includes low-pass filtering the satellite visible meteorological image.
- the step of generating the small-scale-feature image includes high-pass filtering the satellite visible meteorological image.
- generating a forecast includes the additional step of filtering the satellite visible meteorological image to generate a peakiness image indicative of cumulus clouds.
- generating a forecast includes the additional step of generating a forecast image identifying the likelihood of convective meteorological events at a third time by combining the growth image and a current precipitation image and advecting the combined image to the third time.
- the invention relates to a system for predicting near-term development of convective meteorological events and includes means for advecting a first meteorological image and combining the advected first meteorological image and a second meteorological image to generate a difference image, the first and second meteorological images indicative of a first forecast parameter at a first time and a second time, respectively, filter means for generating an interest image including a region of interest by filtering a third meteorological image, and means for combining the difference image and the interest image to generate a growth image indicative of the occurrence of a convective meteorological event.
- the filter means includes a large-scale feature detector means for filtering the third meteorological image to generate a large-scale-feature image and a small-scale feature detector means for filtering the third meteorological image to generate a small-scale- feature image.
- the large-scale feature detector includes low-pass filter means for generating a low-pass filtered rendition of the third meteorological image and the small-scale feature detector comprises high-pass filter means for generating a high-pass filtered rendition of the third meteorological image.
- the filter means further comprises a peakiness feature-detector means for generating a peakiness image indicative of cumuliform features.
- FIG. 1 is a block diagram depicting a sensing and processing system comprising a short-term convective weather predictor according to the invention
- FIG. 2 is a block diagram depicting an embodiment of a short-term weather predictor
- FIG. 3 is a flow diagram generally illustrating an embodiment of a process for generating short-term convective weather forecasts
- FIG. 4 is a flow diagram illustrating in more detail the step of generating a difference image for the process shown in FIG. 3;
- FIG. 5 is a flow diagram illustrating in more detail the step of generating an interest image for the process shown in FIG. 3;
- FIG. 6 is a flow diagram illustrating in more detail the step of generating a short-term weather forecast for the process shown in FIG. 3;
- FIGS. 7 A through 7D are exemplary schematic diagrams illustrating the processing of weather images to generate a short-term convective weather prediction according to the invention.
- FIG. 1 depicts a system block diagram of a weather sensing and prediction system 100 including one embodiment of the invention for predicting the initiation, growth and decay of convective weather, such as cumulus cloud formations and thunderstorms.
- the system 100 includes a short-term storm predictor 102 receiving meteorological data from one or more external sources and a weather forecast display 110.
- the weather forecast display 110 provides an image to an observer or image data appropriate for processing by additional processes or components (not shown).
- the external sources can include weather sensing systems, such as ground-based weather sensors 104 (for example, weather radars, airborne weather sensors, and space-based sensors).
- the external sources can include other computer systems, such as computers forwarding weather image files from the external systems and computers that provide simulated weather image data.
- the satellite data is generally first received by a satellite earth station 108, which performs some pre-processing on the received satellite data and transmits the pre- processed satellite data to the short-term weather predictor 102.
- the received meteorological data is indicative of one or more weather parameters, such as precipitation rate, vertically- integrated-liquid water content (VIL), temperature (e.g., infrared temperature), albedo, lightning occurrences, moisture, and wind-speed.
- the meteorological data can also include numerical model data or computer generated data indicative of any of the above-mentioned weather parameters.
- the meteorological data can be transmitted from the external source 104, 106 in any number of formats, and can be transformed at the source 104, 106, or at an intermediate processing element (e.g., the satellite earth station 108) into other formats, such as a meteorological image.
- the meteorological image includes a multidimensional array, such as a two-dimensional array, of image elements.
- the meteorological image elements include pixel values which are quantitative measures of weather forecast parameters.
- each pixel value can be associated with a number, such as any binary number, representative of a value of a weather parameter (e.g., precipitation rate).
- Each pixel is generally associated with a predetermined geographical location, or geographical area, such that the forecast parameter represented by each pixel is indicative of one or more aspects of the weather at the associated geographical location, or is indicative of an average of spatially- varying weather for the associated geographical area.
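- The pixel encoding described above can be illustrated with a minimal sketch. The 8-bit encoding, the 0-70 mm/hr precipitation range, and the affine grid parameters below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Assumed linear encoding: 8-bit pixel values 0..255 span 0..70 mm/hr.
VMIN, VMAX = 0.0, 70.0

def pixel_to_rate(pixels):
    """Decode 8-bit pixel values into precipitation rates (mm/hr)."""
    return VMIN + (pixels.astype(float) / 255.0) * (VMAX - VMIN)

def pixel_to_latlon(row, col, lat0=45.0, lon0=-93.0, dlat=-0.01, dlon=0.01):
    """Affine mapping from array indices to an assumed geographic grid,
    associating each pixel with a fixed geographical location."""
    return lat0 + row * dlat, lon0 + col * dlon
```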
- the radar 104 can include a system such as the ASR-9, Terminal Doppler Weather Radar (TDWR) or the Next Generation Weather Radar (NEXRAD).
- the satellite 106 can include a satellite system such as the Geostationary Operational Environmental Satellite (GOES) or the Polar Operational Environmental Satellite (POES).
- the radar 104 and satellite 106 can transmit more than one form of weather-related data.
- the radar 104 can transmit a first channel of data relating to precipitation and a second channel of data relating to VIL.
- the satellite 106 can transmit a first channel of data relating to infrared radiation and a second channel of data relating to albedo.
- the short-term storm predictor 102 processes received data from one or more sources 104, 106 and/or ground feed and predicts the initiation, development and decay of convective weather by identifying areas of growth and decay. [0030] In one embodiment the short-term storm predictor 102 determines a short-term forecast in response to receiving radar data. In another embodiment, the short-term storm predictor 102 determines a short-term forecast in response to receiving satellite data. In another embodiment, the short-term storm predictor 102 determines a short-term forecast in response to receiving radar data and satellite data.
- the short-term storm predictor 102 determines a short-term forecast in response to receiving numerical model data.
- the short-term storm predictor 102 includes an image receiver processor 202 receiving meteorological images from one or more external sources 104, 106.
- the image receiver processor 202 includes output ports 203, 203', 203", each transmitting one or more processed meteorological images.
- An image difference processor 204 communicates with at least one of the image receiver processor output ports 203, 203', 203" and receives a first and a second processed meteorological image.
- the image difference processor 204 generates a difference image representing the difference of the first and second processed meteorological images and transmits the difference image to an image growth-and-decay processor 214.
- a number of image filters including a large-scale image filter 206, a small-scale image filter 208, and optionally a "peakiness" filter 210 and/or a classification filter 211 communicate with at least one of the image receiver processor output ports 203, 203', 203".
- Each of the image filters 206, 208, 210, 211 receives a processed meteorological image, such as one of the first and the second meteorological image, or alternatively, a third meteorological image.
- Each of the filters 206, 208, 210, 211 individually filters the received processed meteorological image to produce a respective filtered image that is provided to an image filter processor 212.
- the image filter processor 212 generates a composite filtered image, or interest image, based on the received filtered images and transmits the interest image to the image growth-and-decay processor 214.
- the interest image identifies various regions of the meteorological image likely to contain convective weather.
- the image growth-and-decay processor 214 generates a growth-and-decay image in response to the difference image and the interest image.
- the image forecast processor 216 communicates with the image growth-and-decay processor 214, the image receiver processor 202 and, optionally, the classification filter 211.
- the image forecast processor 216 receives a processed meteorological image from the image receiver processor 202, the growth-and-decay image from the growth-and-decay processor 214 and, optionally, classification data from the classification filter 211.
- the processed meteorological image can be a precipitation image including forecast parameters indicative of precipitation rates for an array of geographical locations.
- the processed image can include any parameter indicative of convective weather, such as any of the above mentioned weather parameters.
- the image forecast processor 216 generates a short-term convective weather image for transmission to a weather display unit or other weather processor modules.
- the short-term convective weather forecast image indicates the locations and likelihood of initiation, growth and/or decay of convective weather for a forecast time period that can be up to 120 minutes or more.
- the short-term storm predictor 102 as represented by the filters 206, 208, 210, 211 and processors 202, 204, 212, 216, is implemented in software.
- the implementing software can be a single integrated program or module.
- the implementing software can include separate programs or modules for one or more of the filters 206, 208, 210, 211 and processors 202, 204, 212, 214, 216.
- the short- term storm predictor 102 is implemented in hardware, such as electronic filters, or circuitry implementing digital signal processing.
- the short-term storm predictor 102 is implemented as a combination of software and hardware.
- the image receiver processor 202 receives from one or more external sources 104, 106 meteorological image files representing one or more weather parameters over a known geographical region.
- the received data is in the form of binary files.
- the binary files can be formatted according to standard graphical formats, such as JPEG, GIF, TIFF, bitmap, or, alternatively, the binary files can be formatted in a custom format.
- the image receiver processor 202 receives updated meteorological images from each external source 104, 106 forming a sequence of images representative of weather parameters at different times.
- the individual images represent weather parameter values over substantially the same geographical region, but differing from the previous image by a uniform time interval, e.g., several minutes or more.
- the image receiver processor 202 optionally reformats each received meteorological image from a native format (e.g., bitmap) to a common format suitable for further processing (e.g., TIFF). Alternatively, the image receiver processor 202 interpolates and/or extrapolates, as required, the meteorological image files received from one or more of the remote sources 104, 106, for example, to align the pixel values to a common geographical location, or area. Interpolated or extrapolated alignment can be necessary for system configurations in which meteorological images are received from different remote sources 104, 106.
- the image receiver processor 202 can include memory for temporarily storing one or more of the received meteorological and/or processed images, or portions of the same.
- the image difference processor 204 stores at least one of the processed meteorological images, such as the first processed meteorological image indicative of a weather parameter at a first time, as subsequent processed meteorological images are received from the image receiver processor 202.
- the image difference processor 204 calculates a difference image by subtracting a transformed version of the stored first processed meteorological image from a later (e.g., current) meteorological image.
- the image difference processor 204 calculates multiple preliminary difference images. The preliminary difference images are averaged to obtain the difference image.
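- The averaging of preliminary difference images can be sketched as a one-line NumPy reduction. This is an illustrative sketch only; the function name is an assumption:

```python
import numpy as np

def average_difference(preliminary_diffs):
    """Average a sequence of preliminary difference images to suppress
    frame-to-frame noise in the time-rate-of-change estimate."""
    stack = np.stack([np.asarray(d, dtype=float) for d in preliminary_diffs])
    return stack.mean(axis=0)
```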
- the difference image is generally representative of a time-rate-of-change in the processed meteorological image (e.g., a time derivative), which is representative of the time-rate-of-change in the corresponding weather parameter.
- the image difference processor 204 determines the difference image by subtracting the previous, stored processed meteorological image from the current processed meteorological image.
- transformation e.g., advection
- Execution of the transformation step prior to computing the difference image reduces and/or eliminates false indications of growth or decay that simple movement, or translation, of weather features would otherwise introduce.
- Advection generally refers to the process of translating portions, or sub-regions of the processing image, such as individual pixels, or groups of pixels, according to a transform quantity, such as a vector field indicative of the prevailing winds at different locations.
- the image difference processor 204, having advected the previous processing image, then subtracts the advected processing image from the current processing image. In some embodiments, the image difference processor 204 repeats the difference process as each new image is received in the time series of processing images. The subtraction process operates to identify and quantify areas of growth and/or decay of the weather parameter represented by the pixel values.
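- The advect-then-subtract step can be sketched as follows. For brevity this illustration uses a single whole-pixel shift (a uniform advection field) rather than the per-pixel vector field described above; function names and the zero-fill boundary handling are assumptions:

```python
import numpy as np

def advect(image, shift_rows, shift_cols):
    """Translate an image by whole-pixel offsets (a uniform advection
    field); cells vacated by the shift are zero-filled."""
    out = np.zeros_like(image, dtype=float)
    h, w = image.shape
    src = image[max(0, -shift_rows):h - max(0, shift_rows),
                max(0, -shift_cols):w - max(0, shift_cols)]
    out[max(0, shift_rows):h - max(0, -shift_rows),
        max(0, shift_cols):w - max(0, -shift_cols)] = src
    return out

def difference_image(previous, current, shift_rows, shift_cols):
    """Advect the previous image to the current time, then subtract,
    so pure translation of a storm does not register as growth."""
    prev = advect(np.asarray(previous, dtype=float), shift_rows, shift_cols)
    return np.asarray(current, dtype=float) - prev
```

A storm that merely moves with the advection field produces a near-zero difference; residual positive or negative values indicate growth or decay.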
- the filters 206, 208, 210, 211 receive a time series of processed meteorological images. In one embodiment, the filters 206, 208, 210, 211 receive the same time series of processed meteorological images as received by the image difference processor 204.
- the image difference processor and the filters 206, 208, 210, 211 each receive processed meteorological images relating to satellite infrared images.
- the image difference processor 204 receives a first meteorological image originating from a first external source, such as the satellite 106 as described above, and the filters 206, 208, 210, 211 receive a second meteorological image originating from a second, or alternative source, such as the radar 104.
- the first and second meteorological images represent weather within the same general geographic region.
- Each of the filters 206, 208, 210, 211 receives the processed meteorological image and generates a filtered image.
- the filtering process can include various filtering methods, such as standard image filtering techniques or functional template correlations, or electrical (e.g., video) filtering of the spectral components, temporal components, or amplitude components of the received image.
- the large-scale filter 206 enhances large-scale features of the processed meteorological image.
- large scale features can be indicative of weather fronts or organized storms.
- the large-scale image features can be enhanced, for example, by attenuating small-scale features.
- the large-scale filter 206 is a low-pass spatial filter, passing image features having low spatial frequency components and attenuating, or eliminating, image features having high-spatial-frequency components.
- the small-scale filter 208 enhances small-scale features, or details, of the received image.
- Small scale features can be indicative, for example, of single storm cells, or cumulus formations of limited geographic extent.
- the small-scale features can be enhanced, for example, by attenuating large-scale features.
- the small-scale filter 208 is a high-pass spatial filter for passing image features having high-spatial-frequency components and attenuating, or eliminating, low-spatial-frequency image features.
- the peakiness filter 210 enhances image features indicative of local maxima within a sub image.
- the peakiness image reflects structural details of the received weather image indicating regions likely to contain cumulus formations.
- the peakiness filter 210 receives a weather image representing albedo.
- the peakiness filter 210 generates a peakiness image by subtracting an average image from the received weather image.
- the large- scale features, or biases, are thus removed leaving the peakiness image.
- the peakiness filter 210 can generate the average image locally, or can use the already-generated average image from the large-scale filter 206.
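- The peakiness computation above, subtracting the large-scale average from the received image, can be sketched as follows (an illustrative sketch; the window size and use of SciPy are assumptions):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def peakiness(image, size=11):
    """Peakiness image: subtract the neighborhood average (the
    large-scale bias), leaving local maxima such as cloud peaks."""
    img = np.asarray(image, dtype=float)
    return img - uniform_filter(img, size=size)
```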
- the classification filter 211 identifies weather patterns, or details, of the received image. For example, image features referred to as small or large cells can be indicative of single storm cells, or cumulus formations of limited geographic extent. Image features can be further differentiated into line image features and stratiform image features. Line features can be indicative of organized storms, such as those occurring along a weather front, and stratiform features can be indicative of large areas of cloud cover, not necessarily associated with convective weather. [0041] The image filter processor 212 generates a composite, filtered image based on filtered images provided by the filters 206, 208, 210, 211. Generally, the composite, filtered image emphasizes geographical areas indicative of the initiation, growth and/or decay of convective weather.
- the composite, filtered image de-emphasizes geographic areas not associated with the initiation, growth and/or decay of convective weather.
- the de-emphasis process includes identifying areas that can include convective weather within an organized storm that does not exhibit growth or decay.
- the composite, filtered image includes an array of numeric, or scaling, values. For example, pixel values in emphasized areas can be increased and pixel values not in the emphasized areas can be decreased.
- the composite, filtered image can include values of unity for areas of emphasis and values of zero for areas of de-emphasis, effectively forming a mask image, or convective- weather template.
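- The unity/zero mask embodiment can be sketched as a threshold test on the large-scale and small-scale filtered images. The threshold values and the AND-combination rule below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def interest_mask(large_scale_img, small_scale_img,
                  large_thresh=10.0, small_thresh=2.0):
    """Binary interest image: unity where both the large-scale and
    small-scale filtered images exceed assumed thresholds, zero
    elsewhere, forming a convective-weather template."""
    emphasis = ((np.asarray(large_scale_img) >= large_thresh) &
                (np.asarray(small_scale_img) >= small_thresh))
    return emphasis.astype(float)
```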
- the image growth-and-decay processor 214 generates a growth-and-decay image based on the difference image and the composite-filtered, or interest image.
- the growth-and-decay image is indicative of sub-regions likely to experience growth and decay within a forecast time.
- because the difference image identifies all areas where the monitored weather parameter experienced growth or decay, it can over-predict the initiation, growth and/or decay of convective weather.
- the image growth-and-decay processor 214 applies the emphasis and de-emphasis of the interest image to the difference image to more accurately identify the initiation, growth and/or decay of convective weather.
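- Applying the interest image's emphasis to the difference image reduces, in the mask embodiment, to an elementwise product. This is an illustrative sketch; the function name is an assumption:

```python
import numpy as np

def growth_and_decay(difference_img, interest_img):
    """Apply the interest image's emphasis to the difference image:
    growth/decay signals outside regions of interest are suppressed,
    curbing the difference image's tendency to over-predict."""
    return (np.asarray(difference_img, dtype=float) *
            np.asarray(interest_img, dtype=float))
```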
- the image forecast processor 216 generates a short-term forecast image from the processed meteorological image, the growth-and-decay image, and, optionally, the feature classification image.
- the image forecast processor 216 identifies areas within the processed meteorological image likely to experience initiation, growth and/or decay in response to the growth-and-decay image. The identified areas of growth and/or decay can then be predicted using weather models to identify a future weather parameter value within the meteorological image. This process is repeated for each region of the image and the resulting image is transformed through advection to a representative forecast image at the desired forecast time.
- the local image feature speed and direction can be applied to pixels or sub- regions of the processed meteorological image to translate (i.e., vector) its pixels or subregions through a distance, proportional to the forecast time, in the corresponding direction.
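- Translating pixels through a distance proportional to the forecast time can be sketched as follows. For brevity this uses a single uniform velocity and `np.roll` (which wraps at the boundaries; a real system would zero-fill); the function names and units are assumptions:

```python
import numpy as np

def forecast_shift(u_px_per_min, v_px_per_min, forecast_minutes):
    """Convert a feature velocity (pixels/minute) and lead time into a
    whole-pixel displacement proportional to the forecast time."""
    return (int(round(v_px_per_min * forecast_minutes)),
            int(round(u_px_per_min * forecast_minutes)))

def advect_forecast(image, u, v, forecast_minutes):
    """Translate the combined image to the forecast time under an
    assumed uniform advection field (np.roll wraps at edges)."""
    dr, dc = forecast_shift(u, v, forecast_minutes)
    return np.roll(np.roll(np.asarray(image, float), dr, axis=0), dc, axis=1)
```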
- the process operates upon meteorological images received from one or more external sources 104, 106 (Step 300).
- the images identify weather forecast parameters indicative of convective weather.
- the meteorological images are typically comprised of pixels, each pixel including a color and/or intensity indicative of the value, or range, of the corresponding forecast parameter.
- a meteorological image indicative of infrared temperature can be comprised of a two-dimensional array of pixels. Each image pixel is assigned a color value from a predetermined range of colors. Each color represents a predetermined infrared temperature, or sub-range of infrared temperatures. The lowest and highest color values would, for example, correspond to the lowest and highest anticipated infrared temperatures, respectively.
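- The color-value assignment described above amounts to linear quantization of the temperature range onto the color index range. A minimal sketch follows; the temperature range and the 256-level palette are illustrative assumptions:

```python
import numpy as np

def temp_to_color(temps_k, t_min=190.0, t_max=300.0, levels=256):
    """Quantize infrared temperatures (Kelvin) onto color indices:
    the lowest and highest indices correspond to the lowest and
    highest anticipated temperatures (range values assumed)."""
    t = np.clip(np.asarray(temps_k, dtype=float), t_min, t_max)
    return np.round((t - t_min) / (t_max - t_min) * (levels - 1)).astype(int)
```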
- the image difference processor 204 generates a difference image using a first and second received meteorological image (Step 305).
- the difference image is indicative of a time rate of change in the weather parameter of the received meteorological image.
- the meteorological image represents infrared temperature (e.g., cloud temperatures)
- the difference image generated from the infrared image indicates an increase or decrease in infrared temperature (e.g., a rise or drop in infrared temperature between the two images).
- the difference image is similar in form to the first and second meteorological images (e.g., an array of pixels), but the pixel-value scale of the difference image can be different.
- a growth/decay image is generated from the received difference image and a received interest image (Step 310). The growth/decay image identifies areas of the received meteorological images that are likely to be experiencing growth, i.e., a situation in which the portrayed weather parameter is indicative of the growth or initiation of convective weather.
- the term "growth" can include both positive growth (e.g., cumulus cloud formations increasing in altitude) and negative growth, or decay (e.g., the dissipation of storm cells or cloud formations). Both positive and negative growth are important indicators of forecasted weather.
- a frontal storm can exhibit growth along its leading edge as new storm cells form and at the same time exhibit decay along its trailing edge as old storm cells dissipate.
- at Step 312, features in the meteorological image are classified into one of a number of predefined categories.
- Examples of weather classifications include lines, stratiform regions, large cells, and small cells.
- the classification filter 211 identifies regions in the meteorological image according to the predefined weather categories.
- a short-term weather forecast is generated using the current meteorological image, the difference image, the generated growth/decay image and, optionally, the weather classification image (Step 315).
- the forecast image generally indicates regions likely to experience, at the forecast time, weather within a predetermined range of severity.
- the image forecast processor 216 transmits an indication of severe weather within a predetermined geographical region.
- the transmitted indication can result in an operator alert of the forecasted weather, such as an audible or visual alarm. For example, when the forecast indicates that, within a sector of airspace being controlled by an air traffic controller, there is a substantial likelihood of severe weather occurring at the forecast time, an alarm can be activated to alert the operator as to the situation.
- the difference image can be generated by advecting a first received meteorological image and subtracting the advected image from a second meteorological image (Step 400).
- the first and second meteorological images are representative of a weather parameter at a first time and a second time, respectively, for a common geographical region.
- the first meteorological image can be representative of VIL for a predetermined geographical region at a first (reference) time; whereas, the second meteorological image can be representative of VIL for the same geographical region at a second (later) time.
- the step of advecting the first image includes translating sub-regions of the first image according to an advection field.
- the first image is advected to represent an estimate of the first meteorological image at a second time.
- the advection field includes an array of vector elements overlaying the geographical area of the first image. Each vector element of the advection field is indicative of a velocity (direction and speed) of the forecasted parameter at the location of the vector element.
- Generation of the advected image can then be accomplished by translating sub-regions of the first meteorological image from their sensed locations at the first time to estimated locations at the second time according to the advection field vector elements.
- the direction of each sub-region translation is determined from the direction represented by the advection vector element associated with the sub-region.
- the distance of the translation of each sub-region is determined from the magnitude (i.e., speed) represented by the advection vector element, multiplied by the time difference obtained by subtracting the first time from the second time.
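The translation described in the steps above can be sketched in numpy. This is a minimal illustration, not the patent's implementation: it assumes a single, uniform advection vector rather than the per-sub-region field the text describes, and rounds the displacement to whole pixels.

```python
import numpy as np

def advect(image, u, v, dt):
    """Translate `image` by a single advection vector (u, v), in
    pixels per unit time, over the interval dt (second time minus
    first time).  A real advection field holds one vector per
    sub-region; a uniform vector is used here for brevity."""
    dy, dx = int(round(v * dt)), int(round(u * dt))
    h, w = image.shape
    out = np.zeros_like(image)
    # Copy each pixel to its translated position, discarding pixels
    # that move out of the frame.
    out[max(0, dy):min(h, h + dy), max(0, dx):min(w, w + dx)] = \
        image[max(0, -dy):min(h, h - dy), max(0, -dx):min(w, w - dx)]
    return out
```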
- the advection field is generated by tracking the movement of identifiable features over successive meteorological images.
- the advection field is updated with the reception of each new updated meteorological image.
- a second meteorological image is received at a second time (step 405).
- a difference image is generated by subtracting the advected image, representative of the first meteorological image at the second time, from the received second image (step 415). For instances in which there is little or no change in the weather parameter, the resulting difference image exhibits little or no change. For example, when a region of the advected first image is substantially equivalent to the corresponding region of the second meteorological image, the pixel values for those regions in the difference image are approximately zero. Conversely, when new storm cells are initiated, or the extent of already-identified storm cells increases or decreases, the difference image yields pixel values corresponding to the magnitude of the change.
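The advect-and-subtract procedure above can be illustrated with a toy 1-D example; the values and the one-pixel shift are hypothetical.

```python
import numpy as np

# Hypothetical 1-D cross-sections of a weather parameter (e.g., VIL)
# at two times: the storm cell has moved one pixel right and grown.
first  = np.array([0.0, 3.0, 3.0, 0.0, 0.0])
second = np.array([0.0, 0.0, 4.0, 4.0, 0.0])

# Advect the first image to the second time (here, a one-pixel shift)
# and subtract it from the second image.
advected_first = np.roll(first, 1)
diff = second - advected_first
# diff is ~0 where nothing changed and positive where the cell grew
```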
- the interest image is generated by first identifying large-scale image features such as those associated with a line, or frontal storm.
- the large-scale image features can be identified using standard image-processing techniques, such as low-pass spatial filtering of the received image.
- a low-pass filter can be implemented by calculating an average at each pixel of values of the surrounding pixels within a predetermined area and replacing the value of the pixel with the value of the calculated average.
- the process is repeated at each pixel in the image.
- the predetermined area can be identified by a kernel identifying the set of surrounding pixels that will be averaged.
- the kernel can be any shape, such as a rectangle, an ellipse, a square, or a circle. Generally, some care is required in selecting the size of the kernel, such that the low-pass filter distinguishes image features considered large (e.g., larger than a storm cell).
- a scoring function is also applied in combination with the kernel. For example, an average value can be determined through application of the image kernel to average surrounding pixels within the kernel. The scoring function generates an output value for the processed image based on the average value. The scoring function can be used to de-emphasize low average-value pixels and/or emphasize high average-value pixels.
- scoring functions are predetermined one-to-one mappings of output pixel values for a range of input pixel values. Scoring functions can be initially estimated and later refined based on empirical results to improve the overall forecast accuracy. The scoring functions can be defined for any of the processed image features.
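A minimal sketch of the kernel averaging and scoring function described above, assuming a square kernel, edge padding, and illustrative scoring breakpoints (0.2 and 0.8) that are not values from the patent:

```python
import numpy as np

def low_pass(image, k):
    """Neighborhood average with a k-by-k square kernel (edge padding)."""
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    h, w = image.shape
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            # Replace each pixel with the mean of its k-by-k neighborhood.
            out[y, x] = padded[y:y + k, x:x + k].mean()
    return out

def score(avg, lo=0.2, hi=0.8):
    """Piecewise-linear scoring function (illustrative breakpoints):
    de-emphasizes low average values and emphasizes high ones."""
    return np.clip((avg - lo) / (hi - lo), 0.0, 1.0)
```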
- at Step 505, determination of the interest image is also based on identifying small-scale image features.
- the small-scale image features are identified using standard image-processing techniques, such as high-pass spatial filtering of the received image.
- a high-pass filter can be implemented by calculating for each pixel a standard deviation based on the pixel values of predetermined surrounding pixels and replacing the value of the subject pixel with the calculated standard deviation value.
- the predetermined surrounding pixels can be identified using a kernel having a shape that can be the same as the low-pass filter kernel. Alternatively, a kernel having a shape different from that of the low-pass filter kernel can be used.
- a scoring function can be applied to emphasize small-scale features and/or de-emphasize large-scale features.
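The standard-deviation filter described above can be sketched in the same style; the square kernel and edge padding are implementation assumptions made for this example.

```python
import numpy as np

def high_pass(image, k):
    """Local standard deviation over a k-by-k neighborhood, used as a
    high-pass filter: flat regions map to 0, fine structure to larger
    values (edge padding; illustrative implementation)."""
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    h, w = image.shape
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            # Replace each pixel with the std. dev. of its neighborhood.
            out[y, x] = padded[y:y + k, x:x + k].std()
    return out
```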
- the interest image can also be further refined by identifying other image details, such as edges, or structure (Step 510).
- a peakiness measure, indicative of image features having fine detail such as those associated with cumulus formations, is calculated.
- the peakiness image features are identified through standard image-processing techniques, such as convolution filtering of the received image.
- a convolution filter can be implemented by calculating an autocorrelation at each pixel of values of the surrounding pixels within a predetermined area and replacing the value of the pixel with the value of the calculated autocorrelation. Care is also required in selecting the size of the kernel, such that the peakiness filter distinguishes image feature detail consistent with cumulus formation structure.
- the interest image can optionally be based on classifying image details into one of a number of predetermined weather categories. Some examples of weather categories include lines, stratiform regions, large cells, and small cells.
- the image features can be classified through standard image-processing techniques, such as pattern recognition. For example, a number of different kernels can be used to process the image in which each kernel is indicative of at least one of the storm classifications being determined.
- the interest image is generated from the filtered images produced by the various spatial filters (Step 515). The interest image identifies areas of the received meteorological image that are likely to contain features indicative of a convective weather event.
- a short-term convective weather forecast is generated by first identifying a first forecast time (Step 600).
- the forecast time is selected as a time ranging from several minutes to several hours.
- the forecast time is generally measured from the time of the latest received meteorological image.
- a probability of convective weather of a predetermined category, or range of categories is generated at a first forecast time (Step 605).
- the generated probability-of-convective-weather image is then advected, according to an advection field, to the first forecast time (step 610), representing the forecast of convective weather.
- image filtering can be applied to the advected forecast image to smooth edges and fill in any discontinuities in the image (e.g., speckling, or holes). This last stage of image filtering is driven not by the forecast but by the physical realities of the weather: weather is not prone to abrupt changes in location, but rather exhibits some degree of smoothness.
- convective weather forecasts are generated at multiple "look ahead" times. For example, forecasts at 30 minutes, 60 minutes, 90 minutes, and 120 minutes can be generated from the same received weather images. To accomplish this, a new forecast time is identified (step 615) and the process repeats from step 605. The results of previous forecasts derived from previously received weather images can be stored and compared to the received weather images to determine the accuracy of the forecasts (i.e., scoring) (step 620).
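The multiple "look ahead" loop can be illustrated as follows; for the example only, a uniform advection speed is assumed and a wrap-around shift stands in for true advection.

```python
import numpy as np

# Hypothetical 1-D probability field at the latest image time, and an
# assumed uniform advection speed of 1 pixel per 30 minutes.
prob = np.zeros(8)
prob[2] = 1.0
speed = 1.0 / 30.0            # pixels per minute (assumption)

forecasts = {}
for lead in (30, 60, 90, 120):        # "look ahead" times in minutes
    # Advect the probability field forward by the lead time.
    shift = int(round(speed * lead))
    forecasts[lead] = np.roll(prob, shift)
```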
- the short-term predictor 102 generates a short-term storm forecast responsive to receiving satellite meteorological images.
- the image receive processor 202 receives a visible satellite image representative of albedo.
- the image is pre-processed by the receive processor 202, for example, to remove pixels for which the albedo value is below a predetermined threshold, such as 0.18, indicating a lack of significant cloud formations. This preprocessing can simplify subsequent processing by removing or ignoring pixels that are not indicative of convective weather.
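The albedo thresholding can be sketched in a few lines; the pixel values are hypothetical, while the 0.18 threshold comes from the text.

```python
import numpy as np

# Hypothetical albedo pixels; values below the 0.18 threshold named in
# the text are zeroed out as lacking significant cloud formations.
albedo = np.array([0.05, 0.30, 0.10, 0.45])
THRESHOLD = 0.18
masked = np.where(albedo >= THRESHOLD, albedo, 0.0)
```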
- the large-scale filter 206 receives the preprocessed image and generates a visible large-scale image by performing a spatial, or neighborhood, averaging of the received image.
- the large-scale filter uses a 15 pixel-by-15 pixel kernel.
- the large-scale filter 206 centers the kernel on a pixel of the received (or pre-processed) meteorological image and averages all pixels of the received image within the boundaries of the kernel. The resulting averaged value replaces the pixel value in the received image. The kernel is subsequently moved to another pixel in the image, and this process is repeated until averages have been computed for substantially all pixels.
- the small-scale filter 208 receives the pre-processed image and generates a small-scale image by performing a spatial standard deviation of the received image pixels. A 15 pixel-by-15 pixel kernel is used to determine the set of pixels for calculation of the standard deviation.
- the peakiness filter 210 receives the pre-processed image and generates a peakiness image by filtering the image to accentuate cloud peaks of the received image. In one embodiment, the peakiness image is computed by subtracting the large-scale image from the visible image.
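The visible-minus-large-scale computation for the peakiness image can be illustrated with a 1-D toy example; a 3-point moving average stands in for the 15-by-15 large-scale kernel for brevity.

```python
import numpy as np

# Hypothetical 1-D albedo cross-section with one sharp cumulus peak.
visible = np.array([0.2, 0.2, 0.9, 0.2, 0.2])

# A 3-point moving average as a stand-in for the large-scale filter.
large_scale = np.convolve(visible, np.ones(3) / 3, mode="same")

# Subtracting the smoothed image accentuates the sharp peak.
peakiness = visible - large_scale
```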
- the image filter processor 212 receives the large-scale image, the small scale image and the peakiness image and generates an interest image.
- the interest image is generated by assigning an interest value to pixels or regions of the processed images for which the standard deviation is high and the large-scale filtered value is low. This typically includes cumulus existing in a region outside of an organized storm region.
- the resulting interest image is further processed to fill in holes, or gaps, and generally, to smooth the appearance of the image.
- the image filter processor 212 filters the image using the image-processing concepts of "dilate" and "erode."
- a kernel such as a 5 pixel-by-5 pixel kernel is applied to each pixel of the interest image.
- a dilate image is computed by replacing the value of a pixel with the maximum pixel value of a group of pixels identified by the kernel.
- the replacement process can be repeated one or more times.
- a kernel is applied to each pixel of the interest image and an erode image is generated by replacing the value of the center pixel with the minimum value of the group of pixels identified by the kernel.
- the erode process can be repeated a second time.
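The dilate/erode filtering described above can be sketched as follows. The 3-by-3 kernel and edge padding are illustrative choices for the example (the text uses a 5-by-5 kernel), and a single dilate-then-erode pass is shown.

```python
import numpy as np

def _morph(image, k, op):
    """Apply max (dilate) or min (erode) over a k-by-k neighborhood."""
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    h, w = image.shape
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            out[y, x] = op(padded[y:y + k, x:x + k])
    return out

def dilate(image, k=5):
    # Replace each pixel with the maximum of its neighborhood.
    return _morph(image, k, np.max)

def erode(image, k=5):
    # Replace each pixel with the minimum of its neighborhood.
    return _morph(image, k, np.min)

# Dilating then eroding (a morphological "close") fills small holes
# in the interest image without enlarging its overall extent.
interest = np.ones((7, 7))
interest[3, 3] = 0.0           # a one-pixel hole
closed = erode(dilate(interest, 3), 3)
```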
- the image filter processor 212 then transmits the resulting interest image to the image growth-and-decay processor 214.
- the growth-and-decay processor 214 also receives the difference image from the image difference processor 204 indicative of the growth and/or decay of cumulus elements.
- the image growth-and-decay processor 214 then generates the growth/decay image by identifying weather severity levels based upon the received images. For all other image regions containing data, the weather level is set to level 2. A zero value is assigned to all other regions of the image.
- the image receive processor 202 receives a radar data image representative of precipitation (e.g., VIL).
- the image difference processor 204 computes a precipitation difference image indicating areas of increasing and/or decreasing precipitation.
- the small-scale filter 208 generates a small-scale image by taking the spatial standard deviation of the received image.
- the image growth-and-decay processor removes pixels from the difference image if the precipitation is below a predetermined level, or masks regions of the difference image for which the difference values are below a predetermined value.
- FIG. 7A represents a simplified first meteorological image, such as a radar image including a first weather element 702 indicative of a forecast parameter, such as VIL.
- the first weather element 702 is shown optionally in relation to a graticule 700 (shown in phantom).
- the graticule 700 assists in identifying relative movement and location of the first weather element 702.
- FIG. 7B represents a simplified second meteorological image, such as a second radar image obtained from the same radar as the first image, but at a later time.
- the weather image includes a later representation of the first weather element 702'.
- a comparison of weather element 702' to weather element 702 indicates that the weather element 702 has moved to a new location and increased in size (e.g., northeastward in this example, with north being represented by the twelve o'clock position of the graticule 700).
- the second meteorological image also includes additional weather elements 703, 704, 705 appearing for the first time.
- in FIG. 7C, a difference image is shown representing the results of subtracting the advected first meteorological image, illustrated in FIG. 7A, from the second meteorological image, illustrated in FIG. 7B.
- a first difference weather element 702" results from the increase in storm size.
- the new weather elements 703, 704, 705 appear substantially unchanged because they were not present in the first meteorological image.
- applying the large-scale spatial filter to FIG. 7B will result in a large-scale image (not shown) that includes the first weather element 702', but not the new weather elements 703, 704, 705.
- applying a small-scale, or standard-deviation, filter to FIG. 7B results in a small-scale image, illustrated in FIG. 7D, that includes the new weather elements 703, 704, 705.
Abstract
The invention relates to a method and apparatus for predicting the likely occurrence of convective weather events, such as thunderstorms. An image filter is used to identify areas of interest within a meteorological image that are likely to contain convective weather. The image filter and an image difference processor identify areas of the meteorological image likely to experience growth and/or decay of weather events. The meteorological image, the interest image, and the growth/decay image are processed to produce a short-term forecast.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US26995501P | 2001-02-20 | 2001-02-20 | |
US60/269,955 | 2001-02-20 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2002067013A2 true WO2002067013A2 (fr) | 2002-08-29 |
WO2002067013A3 WO2002067013A3 (fr) | 2003-04-10 |
Family
ID=23029302
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2002/004512 WO2002067013A2 (fr) | 2001-02-20 | 2002-02-19 | Procede et appareil de prediction a court terme de temps de convection |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2002067013A2 (fr) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5959567A (en) * | 1997-12-08 | 1999-09-28 | Massachusetts Institute Of Technology | Method and apparatus for tracking of organized storms |
JP2000009857A (ja) * | 1998-06-26 | 2000-01-14 | Mitsubishi Electric Corp | 気象レーダ装置 |
US6128578A (en) * | 1996-12-26 | 2000-10-03 | Nippon Telegraph And Telephone Corporation | Meteorological radar precipitation pattern prediction method and apparatus |
US6340946B1 (en) * | 2000-08-03 | 2002-01-22 | Massachusetts Institue Of Technology | Method for determining storm predictability |
- 2002-02-19: WO PCT/US2002/004512 patent/WO2002067013A2/fr, not_active, Application Discontinuation
Non-Patent Citations (1)
Title |
---|
PATENT ABSTRACTS OF JAPAN vol. 2000, no. 04, 31 August 2000 (2000-08-31) & JP 2000 009857 A (MITSUBISHI ELECTRIC CORP), 14 January 2000 (2000-01-14) * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10313653A1 (de) * | 2003-03-26 | 2004-10-14 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Verfahren und System zur regionalen Niederschlagsvorhersage für drahtlose Endgeräte |
DE10313653B4 (de) * | 2003-03-26 | 2005-07-07 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Verfahren und System zur regionalen Niederschlagsvorhersage für drahtlose Endgeräte |
CN108764550A (zh) * | 2018-05-18 | 2018-11-06 | 云南电网有限责任公司电力科学研究院 | 基于输电线路信息数据的雷电预警方法及系统 |
CN113075632A (zh) * | 2021-03-15 | 2021-07-06 | 国网河南省电力公司电力科学研究院 | 一种夏季飑线风自动识别和预警方法 |
CN114994794A (zh) * | 2022-06-24 | 2022-09-02 | 昆明学院 | 一种云团无探测数据区的云粒子相态生长方法 |
CN114994794B (zh) * | 2022-06-24 | 2023-05-09 | 昆明学院 | 一种云团无探测数据区的云粒子相态生长方法 |
Also Published As
Publication number | Publication date |
---|---|
WO2002067013A3 (fr) | 2003-04-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6920233B2 (en) | Method and apparatus for short-term prediction of convective weather | |
CN114365153B (zh) | 预测天气雷达图像 | |
US7391358B2 (en) | Weather radar echo tops forecast generation | |
US9869766B1 (en) | Enhancement of airborne weather radar performance using external weather data | |
US20140362088A1 (en) | Graphical display of radar and radar-like meteorological data | |
JP6949332B2 (ja) | 雷危険度判定装置 | |
Gravelle et al. | Demonstration of a GOES-R satellite convective toolkit to “bridge the gap” between severe weather watches and warnings: An example from the 20 May 2013 Moore, Oklahoma, tornado outbreak | |
US20150309208A1 (en) | Systems and methods for localized hail activity alerts | |
JP2010032383A (ja) | 予報装置、その方法及びプログラム | |
Wong et al. | Towards the blending of NWP with nowcast—Operation experience in B08FDP | |
CN109215275A (zh) | 一种基于温度数据在电网运行中的火灾监测预警方法 | |
KR101263121B1 (ko) | 위성 적외영상 자료를 이용한 현업용 기상레이더 반사도 합성자료의 채프에코 제거 방법 | |
Mohapatra et al. | Status and plans for operational tropical cyclone forecasting and warning systems in the North Indian Ocean region | |
JP4145259B2 (ja) | プログラム、人工衛星調整装置、制御方法 | |
WO2002067013A2 (fr) | Procede et appareil de prediction a court terme de temps de convection | |
Chow et al. | Development of a recurrent Sigma-Pi neural network rainfall forecasting system in Hong Kong | |
Sharma et al. | A review on physical and data-driven based nowcasting methods using sky images | |
Behrangi et al. | Summertime evaluation of REFAME over the Unites States for near real-time high resolution precipitation estimation | |
CN113111528A (zh) | 一种初生对流信息确定方法、装置、设备和存储介质 | |
CN110727719A (zh) | 一种基于动力松弛逼近的闪电定位资料同化方法 | |
Bolla et al. | The tracking and prediction of high intensity rainstorms | |
Kim et al. | Short-term forecasting of typhoon rainfall with a deep-learning-based disaster monitoring model | |
Amjad et al. | Thunderstorms Prediction Using Satellite Images | |
Bradtke et al. | Spatial characteristics of frazil streaks in the Terra Nova Bay Polynya from high-resolution visible satellite imagery | |
Chkeir et al. | A New Extreme Weather Nowcasting Product Supporting Aviation Management at Local Scale |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): CA JP |
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
122 | Ep: pct application non-entry in european phase | ||
NENP | Non-entry into the national phase |
Ref country code: JP |
WWW | Wipo information: withdrawn in national office |
Country of ref document: JP |