CN117950088B - Multi-mode-based precipitation prediction data fusion correction method - Google Patents

Multi-mode-based precipitation prediction data fusion correction method Download PDF

Info

Publication number
CN117950088B
CN117950088B CN202410345983.6A CN202410345983A CN117950088B CN 117950088 B CN117950088 B CN 117950088B CN 202410345983 A CN202410345983 A CN 202410345983A CN 117950088 B CN117950088 B CN 117950088B
Authority
CN
China
Prior art keywords
weather
precipitation
analyzed
target
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410345983.6A
Other languages
Chinese (zh)
Other versions
CN117950088A (en
Inventor
叶昆
闵锦忠
张晓源
李灵杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Manxing Data Technology Co ltd
Original Assignee
Nanjing Manxing Data Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Manxing Data Technology Co ltd filed Critical Nanjing Manxing Data Technology Co ltd
Priority to CN202410345983.6A priority Critical patent/CN117950088B/en
Publication of CN117950088A publication Critical patent/CN117950088A/en
Application granted granted Critical
Publication of CN117950088B publication Critical patent/CN117950088B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to the technical field of precipitation prediction, in particular to a multimode-based precipitation prediction data fusion correction method. Acquiring weather patterns of a meteorological observation area under different numerical modes, acquiring radar weather patterns as references, and reducing the system error of the numerical modes; dividing the regions of the weather map under each numerical mode, and further extracting the precipitation characteristics under different numerical modes and multi-mode fusion according to the gray texture characteristics in the weather map, thereby acquiring the aggregate precipitation correction forecast by combining the precipitation characteristics of the radar weather map. According to the method, the texture characteristics of the precipitation area in the weather map are analyzed, the radar weather map is used as the verification data for the prediction model training, the accuracy of feature extraction is improved, and the systematic error of a numerical mode is reduced; and the precipitation forecast under different numerical modes and the precipitation forecast integrated by the multi-mode fusion are forecast, so that the accuracy and the integrity of the precipitation forecast are improved.

Description

Multi-mode-based precipitation prediction data fusion correction method
Technical Field
The invention relates to the technical field of precipitation prediction, in particular to a multimode-based precipitation prediction data fusion correction method.
Background
The rainfall forecast is one of important forecast in the weather science, and has important guiding significance and application value for life of various industries and people. The numerical mode can provide more accurate rainfall prediction results by simulating the movement and evolution of the weather system. At present, each country and region has own numerical modes, such as EC in European center, CMA-SH in China east China, and the like. For forecasting weather elements such as precipitation in the same area, different numerical modes may give different results. In practical application, CMA-SH with higher regional applicability is generally used for observing and forecasting precipitation in a middle area; however, the forecast in the single mode is easy to deviate, and the forecast is generally combined with other numerical modes to carry out the collective forecast in the use process.
Because the forecast is often carried out by adopting a neural network model, the meteorological observation data in a numerical mode are converted into an meteorological image, and then the characteristics are extracted to be input into a trained meteorological forecast model for forecast; however, whether the parameters of the numerical mode are set or the file format is converted to cause the problems of precipitation characteristic distortion or difficult extraction of the acquired weather map, or the influence of factors such as atmospheric chaos effect, numerical mode error and the like, unavoidable systematic deviation exists in different numerical mode predictions, and finally the extracted precipitation characteristic is inaccurate, so that the accuracy of precipitation prediction is influenced.
Disclosure of Invention
In order to solve the technical problem that the accuracy of precipitation prediction is low due to inaccurate extraction of precipitation characteristics at present, the invention aims to provide a multimode-based precipitation prediction data fusion correction method, which adopts the following technical scheme:
the invention provides a rainfall forecast data fusion correction method based on multiple modes, which comprises the following steps:
Acquiring a target weather image in a target mode, a reference weather image in a reference mode and a radar weather image of an observation station of a weather observation area according to a preset weather observation time interval in a preset historical time period of a moment to be forecasted;
Taking any one of the target gas image and the reference gas image as a gas image to be analyzed, and the other gas image as a comparison gas image, and carrying out region division on the gas image to be analyzed; taking any region in the gas image to be analyzed as a target region, and acquiring a reference region of the target region in the comparison gas image and a reference weight of the reference region according to region similar characteristics of each region in the target region and the comparison gas image; acquiring a precipitation coefficient of the target area according to the gray information of the target area, the gray information of the reference area and the reference weight; carrying out gray enhancement on the weather image to be analyzed according to the precipitation coefficient; acquiring a fusion gas image of the enhanced target gas image and the enhanced reference gas image at the same moment; acquiring a characteristic map of a dewatering area of all the gas image maps based on a neural network model;
and acquiring an aggregate precipitation correction forecast in a preset future time period at a moment to be forecasted based on a weather forecast model by combining the target weather map, the reference weather map and the characteristic map of the precipitation zone corresponding to the fusion weather map with the characteristic map of the precipitation zone of the radar weather map.
Further, the region dividing method includes:
In the to-be-analyzed weather map, acquiring an edge coefficient of each pixel according to the gray level difference between the pixel and other pixels in the neighborhood, and dividing each pixel into edge pixels or non-edge pixels according to the edge coefficient; acquiring all closed edges formed by edge pixel points in the gas image to be analyzed; and taking all the areas corresponding to the non-edge pixel points in each closed edge as one area in the to-be-analyzed weather map.
Further, the calculation formula of the edge coefficient includes:
; wherein/> For/>, within the meteorological graph to be analyzedEdge coefficients of the individual pixel points; /(I)For/>, within the meteorological graph to be analyzedGray values of the individual pixels; /(I)For/>, within the meteorological graph to be analyzedEighth-neighborhood of pixel points/>Gray values of the individual pixels; /(I)A preset gray threshold value is set; /(I)Is Ai Fosen brackets.
Further, the method for acquiring the reference area and the corresponding reference weight comprises the following steps:
acquiring the reference weight of each region of the target region in the contrast weather map according to a calculation formula of the reference weight; taking the region with the largest reference weight as a reference region of the target region; the calculation formula of the reference weight is as follows:
; wherein/> For comparison of the first/>, in the aerial imageThe first/second of the regions in the weather map to be analyzedReference weights for the individual target areas; /(I)For/>, in the weather map to be analyzedThe area occupation ratio of each target area in the gas image to be analyzed; /(I)For centroid and/>, in the weather map to be analyzedThe number of the region closest to the centroid of the individual target region; /(I)For/>, in the weather map to be analyzedThe area ratio of each area in the gas image to be analyzed; /(I)For comparison of the first/>, in the aerial imageThe area ratio of each area in the contrast weather map; /(I)To compare centroid with the/>The number of the region whose centroid is closest to the center of mass of the individual region; /(I)For comparison of the first/>, in the aerial imageThe area ratio of the individual areas in the comparative weather map.
Further, the formula for calculating the precipitation coefficient includes:
; wherein/> For/>, in the weather map to be analyzedPrecipitation coefficients for the individual target areas; /(I)For/>, in the weather map to be analyzedGray standard deviation of all pixel points in each target area; /(I)For/>, in the weather map to be analyzedThe number of all pixel points in the target areas; /(I)For/>, in the weather map to be analyzedSequence numbers of pixel points in the target areas; /(I)For/>, in the weather map to be analyzedFirst/>, in the target areaGray values of the individual pixels; /(I)For/>, in the weather map to be analyzedThe reference areas of the target areas correspond to the reference weights; /(I)For/>, in the weather map to be analyzedGray standard deviation of all pixel points in a reference area of each target area; /(I)The number of all pixel points in the reference area; /(I)The serial numbers of pixel points in a reference area in the contrast weather map; /(I)For/>, in the weather map to be analyzedFirst/>, in reference area in comparative weather map of each target areaGray values of individual pixels.
Further, the method for gray scale enhancement of the to-be-analyzed aerial image comprises the following steps:
Acquiring an enhanced gray value of a corresponding pixel point according to the precipitation coefficient and gray information of the pixel point in each region in the gas image to be analyzed; and adjusting the gray value of the pixel point in each region in the gas image to be analyzed to correspond to the enhanced gray value, so as to obtain a gray enhanced image of the gas image to be analyzed.
Further, the calculation formula of the enhanced gray value includes:
; wherein/> To be analyzed in the weather mapFirst/>, in the target areaEnhanced gray values of the individual pixels; /(I)For/>, in the weather map to be analyzedFirst/>, in the target areaGray values of the individual pixels; /(I)For/>, in the weather map to be analyzedPrecipitation coefficients for the individual target areas; /(I)For/>, in the weather map to be analyzedPrecipitation coefficients for individual zones; /(I)The total number of all areas in the weather map to be analyzed; /(I)Is an inverse hyperbolic tangent function; /(I)To be with natural constant/>Is an exponential function of the base.
Further, the method for acquiring the fusion gas image comprises the following steps:
And at each meteorological observation time, fusing the target aerial image after gray level enhancement with the reference aerial image after gray level enhancement through a weighted fusion algorithm to obtain a fused aerial image.
Further, the neural network model for acquiring the characteristic map of the dewatering area is a convolutional neural network.
Further, the method for acquiring the aggregate precipitation correction forecast comprises the following steps:
Respectively acquiring a precipitation time sequence frame image sequence of a target gas image, a reference gas image, a fusion gas image and a radar gas image corresponding to a precipitation area characteristic image in a preset historical time period of a moment to be forecasted;
Constructing training data of a first weather forecast model according to a precipitation time sequence frame diagram sequence of a target weather diagram and a radar weather diagram, wherein each training sample in the training data is a subsequence with the same time sequence length as a preset future time period in the precipitation time sequence frame diagram sequence of the target weather diagram, and the label of each training sample is a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequence of the target weather diagram and a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequence of the radar weather diagram;
constructing training data of a second weather prediction model according to the precipitation time sequence frame diagram sequences of the fusion weather diagram and the radar weather diagram, wherein each training sample in the training data is a subsequence with the same time sequence length as a preset future time period in the precipitation time sequence frame diagram sequences of the fusion weather diagram, and the label of each training sample is a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the fusion weather diagram and a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the radar weather diagram;
Constructing training data of a third weather prediction model according to the precipitation time sequence frame diagram sequences of the reference weather diagram and the radar weather diagram, wherein each training sample in the training data is a subsequence with the same time sequence length as a preset future time period in the precipitation time sequence frame diagram sequences of the reference weather diagram, and the label of each training sample is a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the reference weather diagram and a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the radar weather diagram;
In the training data of each weather prediction model, the time corresponding to the tail frame of the sub-sequence corresponding to the tag is different from the time corresponding to the tail frame of the sub-sequence corresponding to the training sample by the time sequence length of a preset future time period in the positive direction of the time sequence; training the first weather prediction model by using training data of the first weather prediction model, training the second weather prediction model by using training data of the second weather prediction model, and training the third weather prediction model by using training data of the third weather prediction model, thereby obtaining a trained first weather prediction model, a trained second weather prediction model and a trained third weather prediction model;
taking the time to be predicted as the tail frame time, and respectively acquiring subsequences with the same time sequence length as the preset future time period from precipitation time sequence frame image sequences of the target gas image, the reference gas image and the fusion gas image as target subsequences to be predicted, reference subsequences to be predicted and fusion subsequences to be predicted; inputting a target subsequence to be predicted into a trained first meteorological prediction network model, and obtaining target precipitation prediction of a moment to be predicted in a future preset time period; inputting the fusion subsequence to be predicted into a trained second weather prediction network model, and obtaining fusion precipitation prediction of the moment to be predicted in a future preset time period; taking the target precipitation forecast and the fusion precipitation forecast as a collection precipitation correction forecast; when the target subsequence to be predicted is missing, inputting the reference subsequence to be predicted into a trained third weather prediction network model, obtaining a reference precipitation forecast of the moment to be predicted in a future preset time period, and taking the reference precipitation forecast as an aggregate precipitation correction forecast.
The invention has the following beneficial effects:
Firstly, acquiring weather patterns of a meteorological observation area under different numerical modes, and acquiring radar weather patterns of the meteorological observation area by an observation station for subsequent reference verification, so as to reduce systematic errors of meteorological observation under the numerical modes; dividing the gas image map under different numerical modes into areas so as to facilitate the subsequent judgment of a precipitation area, and then carrying out gray enhancement on the gas image map by analyzing the gray texture characteristics of a precipitation area and a non-precipitation area so as to facilitate the extraction of precipitation characteristics, and simultaneously acquiring a fusion gas image map under different numerical modes and corresponding precipitation characteristics so as to facilitate the subsequent auxiliary weather staff to analyze, judge and correct; and finally, according to the target gas image, the reference gas image and the characteristic image of the precipitation zone corresponding to the fusion gas image, combining the characteristic image of the precipitation zone of the radar gas image, and obtaining the accurate and comprehensive integrated precipitation correction forecast. According to the invention, the texture characteristics of the precipitation area in the image and the radar weather map acquired by the weather observation station are combined, so that the feature extraction accuracy is accurately improved, the radar weather map is used as the verification data of model training, the system error of a numerical mode is reduced, the precipitation forecast in different numerical modes and the precipitation forecast set forecast integrated by multiple modes are mutually supplemented and corrected, and the precipitation forecast accuracy is improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for correcting a fusion of rainfall forecast data based on multiple modes according to an embodiment of the present invention.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve the preset aim, the following is a detailed description of specific implementation, structure, characteristics and effects thereof based on the multi-mode precipitation forecast data fusion correction method provided by the invention with reference to the accompanying drawings and the preferred embodiment. In the following description, different "one embodiment" or "another embodiment" means that the embodiments are not necessarily the same. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The invention provides a specific scheme of a rainfall forecast data fusion correction method based on multiple modes, which is specifically described below with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of a method for correcting fusion of rainfall forecast data based on multiple modes according to an embodiment of the invention is shown, and the method includes the following steps:
The embodiment of the invention aims at correcting and forecasting precipitation, so that the weather image of a meteorological observation area in different numerical modes is firstly obtained, and the radar weather image of the meteorological observation area is obtained through an observation station; and extracting the precipitation characteristics of different numerical modes and fusion conditions according to the gray texture characteristics in the weather map under the different numerical modes, thereby obtaining the integrated precipitation correction forecast.
Step S1, acquiring a target gas image in a target mode, a reference gas image in a reference mode and a radar gas image of an observation station of a meteorological observation area respectively according to a preset meteorological observation time interval in a preset historical time period of a moment to be forecasted.
In order to forecast the rainfall condition, in one month before the moment to be forecasted, the embodiment of the invention firstly acquires output data of a meteorological observation area in a target forecasting mode and a reference mode respectively at a meteorological observation time interval of 1 hour, acquires a radar weather map as partial verification data in the subsequent forecasting training through an observation station of the meteorological observation area, and performs corresponding preprocessing such as graying, noise reduction and the like on the radar weather map after acquiring the radar weather map; because the meteorological observation area in the embodiment of the invention is a China area, the target mode is a China meteorological office-Shanghai numerical mode (China Meteorological Administration, CMA-SH) with stronger area adaptability, and the reference mode is a European center Fine grid numerical mode (European Centre for Medium-RANGE WEATHER Forecasts, EC-Fine); after obtaining output data in CMA-SH and EC-Fine modes, certain pretreatment is needed to obtain corresponding weather patterns.
In a preferred embodiment of the present invention, output data in GRIB format in CMA-SH mode and output data in GRIB format in EC-Fine mode obtained at each observation time are converted into corresponding output files in NetCDF format, and then output files in NetCDF format in two numerical modes are respectively converted into grid images in TIFF format to obtain a target gas image in CMA-SH mode and a reference gas image in EC-Fine mode, wherein resolution parameters are required to be set to 5km in the process of converting into grid images, that is, the corresponding size of each pixel is 5km in meteorological observation area, and the numerical model application, the acquisition processing of radar gas image and the file format output conversion are all well known in the prior art by those skilled in the art, and are not described herein.
It should be noted that, in other embodiments of the present invention, an operator may set a preset history period and an observation time interval of other durations, and because the embodiment of the present invention aims at a time-by-time forecast of a meteorological observation area, the corresponding observation time interval is set to be per hour, and the observation time interval may also be set to be 3 hours or be analyzed by using other types of numerical models according to actual requirements, such as a 3-hour forecast.
Step S2, taking any one of the target gas image and the reference gas image as a gas image to be analyzed, and the other gas image as a comparison gas image, and carrying out region division on the gas image to be analyzed; taking any region in the gas image to be analyzed as a target region, and acquiring a reference region of the target region in the comparison gas image and a reference weight of the reference region according to region similar characteristics of each region in the target region and the comparison gas image; acquiring a precipitation coefficient of the target area according to the gray information of the target area, the gray information of the reference area and the reference weight; carrying out gray enhancement on the weather image to be analyzed according to the precipitation coefficient; acquiring a fusion gas image of the enhanced target gas image and the enhanced reference gas image at the same moment; and acquiring the characteristic diagrams of the dewatering areas of all the meteorological diagrams based on the neural network model.
After the gas image maps under different numerical modes are obtained in the step S1, the gas image maps can be correspondingly processed so as to facilitate the extraction of the subsequent precipitation characteristics. In one embodiment of the invention, in order to facilitate the analysis and processing of the weather image, firstly, the weather image in the TIFF format is subjected to corresponding gray scale processing in a normalization mode, namely, the file data value in the TIFF format is mapped to the range of 0-255, and the weather gray scale image in the JPG format is obtained. The normalization mode specifically adopts a maximum value and minimum value normalization method, performs maximum value and minimum value normalization on the data value corresponding to each grid, multiplies 255, and rounds up the obtained product to obtain a corresponding gray value; other gray scale processing methods can be selected by the implementer; it should be noted that, the maximum and minimum normalization and the file format conversion are all well known techniques for those skilled in the art, and are not described herein.
After the gray level processing is carried out on the aerial image, the aerial image can be divided into areas by analyzing texture features. Because the region division modes of the target gas image in the target mode and the reference gas image in the reference mode are the same, in the embodiment of the invention, any one gas image in the target gas image and the reference gas image is taken as the gas image to be analyzed, the other gas image is taken as the comparison gas image, the region division is carried out on the gas image to be analyzed, the gas image to be analyzed is changed, and the region division is carried out on the gas image in the target mode and the reference mode respectively.
Preferably, in one embodiment of the present invention, it is considered that there is a significant gray scale difference between a precipitation area and a non-precipitation area in the weather map, and the larger the gray scale value is, the larger the precipitation amount of the region corresponding to the pixel point is; the region dividing method includes: in the weather map to be analyzed, acquiring an edge coefficient of each pixel according to the gray difference between the pixel and other pixels in the neighborhood, and dividing each pixel into edge pixels or non-edge pixels according to the edge coefficient; acquiring all closed edges formed by edge pixel points in an meteorological graph to be analyzed; and taking the corresponding area of all the non-edge pixel points in each closed edge as one area in the weather chart to be analyzed. Judging the possibility that the pixel points are edges according to the gray level difference between the pixel points and the pixel points in the neighborhood, defining the edge coefficient of each pixel point, judging the edge pixel points and the non-edge pixel points, and then obtaining the closed edges formed by all the edge pixel points through an area growing algorithm so as to obtain a final area dividing result. It should be noted that, the eight neighborhood and region growing algorithm are all well known in the art, and are not described herein.
In one embodiment of the present invention, the calculation formula of the edge coefficient includes:
Wherein, For/>, within the meteorological graph to be analyzedEdge coefficients of the individual pixel points; /(I)For/>, within the meteorological graph to be analyzedGray values of the individual pixels; /(I)For/>, within the meteorological graph to be analyzedEighth-neighborhood of pixel points/>Gray values of the individual pixels; /(I)A preset gray threshold value is set; /(I)Is Ai Fosen brackets, the condition in the brackets is satisfied, the corresponding value is 1, otherwise, the value is 0; in the embodiment of the invention, a gray threshold value/>And 2, the implementer can set other values according to implementation conditions.
In the calculation formula of the edge coefficient, specifically, the gray level difference between each pixel point and the gray level average value of all the pixel points in the eight adjacent areas of the pixel points is obtained, and a condition judgment result is obtained through Ai Fosen brackets; if the gray level difference is smaller than or equal to the preset threshold value, the gray level difference between the pixel point and the pixel point in the neighborhood is smaller, the possibility that the pixel point is a more obvious area boundary is lower, the corresponding edge coefficient is 1, and the pixel point is a non-edge pixel point; otherwise, the probability of the boundary point of the region is high, the corresponding edge coefficient is 0, and the boundary point is the edge pixel point.
Considering that the weather patterns acquired in different numerical modes may have differences due to factors such as mode parameters, the whole weather features should be similar, namely the corresponding precipitation areas in the same weather observation local area under different numerical modes should have similar shapes or similar precipitation levels, and the gray level information and the local area proportion of the areas in the gray level images of the weather patterns should be similar; meanwhile, regional weather features of the same local region in different numerical modes should be in a relationship of correction references; therefore, in the embodiment of the invention, any region in the gas image to be analyzed is taken as a target region, and the reference weight of each region in the comparison gas image is obtained according to the region similar characteristics of each region in the target region and each region in the comparison gas image, so that the regions suspected to be the same local region in the gas image under different numerical modes can be subjected to reference matching, and further the reference regions are screened out to mutually correct and reference, thereby facilitating subsequent image enhancement to extract precipitation characteristics.
Preferably, in one embodiment of the present invention, considering that the area ratio of a certain area in the comparison gas image in the whole comparison gas image is similar to the area ratio of the target area in the whole gas image to be analyzed, the more likely the area is a reference area of the target area; however, the misjudgment possibility of the reference area of the target area is judged only by means of the similar condition of the area occupation ratio of the single area in the whole aerial image, so that the adjacent area closest to the mass center of the target area is obtained simultaneously, the similar condition of the area occupation ratio of the adjacent two areas in the aerial image to be analyzed and the comparison aerial image in the corresponding whole aerial image is analyzed simultaneously, and the reference weight is obtained; the calculation formula of the reference weight comprises:
Wherein, For comparison of the first/>, in the aerial imageThe first/second of the regions in the weather map to be analyzedReference weights for the individual target areas; /(I)For/>, in the weather map to be analyzedThe area occupation ratio of each target area in the gas image to be analyzed; /(I)For centroid and/>, in the weather map to be analyzedThe number of the region closest to the centroid of the individual target region; /(I)For/>, in the weather map to be analyzedThe area ratio of each area in the gas image to be analyzed; /(I)For comparison of the first/>, in the aerial imageThe area ratio of each area in the contrast weather map; To compare centroid with the/> The number of the region whose centroid is closest to the center of mass of the individual region; /(I)For comparison of the first/>, in the aerial imageThe area ratio of the individual areas in the comparative weather map.
In the calculation formula of the reference weight,Representing the ratio of the area ratio of a target area and the area adjacent to the target area in the whole gas image to be analyzedThe ratio of the area ratio of the suspected reference area of the target area in the comparison gas image and the area adjacent to the suspected reference area in the whole comparison gas image is expressed, and the closer the ratio of the area ratio is, the description of the/>, in the comparison gas imageThe more likely the individual regions are to be in contact with the first image of the gas to be analyzedThe target areas are local areas corresponding to the same geographic position of the meteorological observation area; the logical relationship is adjusted by subtracting the ratio difference value from 1, so that the/>, in the contrast weather mapThe greater the reference weight of the individual regions.
After the reference weight of each region in the comparison gas image relative to the target region in the gas image to be analyzed is obtained, the region with the largest reference weight in the comparison gas image is taken as the reference region of the target region; and simultaneously, changing the target area to obtain a reference area of each area in the comparison gas image in the gas image to be analyzed.
Considering that regional weather features of the same local region in different modes can be regarded as mutually correcting reference relations, and the larger the gray value of the region is, the greater the possibility of representing precipitation weather features is; therefore, according to the embodiment of the invention, the precipitation coefficient of the target area is obtained according to the gray information of the target area, the reference weight of the reference area and the gray information; the precipitation coefficient reflects the likelihood that the area is characterized by precipitation weather and the precipitation level, and the more obvious the precipitation characteristic is and the greater the precipitation is, the greater the corresponding precipitation coefficient is.
Preferably, in one embodiment of the present invention, it is considered that the more stable the gray distribution of the pixel points in the same area in the weather chart to be analyzed is and the larger the gray value is, the greater the precipitation possibility is; and if the reference area of the area in another numerical mode also shows similar gray scale characteristics, the more obvious the precipitation characteristics of the area are; based on this, the calculation formula of the precipitation coefficient includes:
Wherein, For/>, in the weather map to be analyzedPrecipitation coefficients for the individual target areas; /(I)To be analyzed in the weather mapGray standard deviation of all pixel points in each target area; /(I)For/>, in the weather map to be analyzedThe number of all pixel points in the target areas; /(I)For/>, in the weather map to be analyzedSequence numbers of pixel points in the target areas; /(I)To be analyzed in the weather mapFirst/>, in the target areaGray values of the individual pixels; /(I)For/>, in the weather map to be analyzedThe reference areas of the target areas correspond to the reference weights; /(I)For/>, in the weather map to be analyzedGray standard deviation of all pixel points in a reference area of each target area; /(I)The number of all pixel points in the reference area; /(I)The serial numbers of pixel points in a reference area in the contrast weather map; /(I)For/>, in the weather map to be analyzedFirst/>, in reference area in comparative weather map of each target areaGray values of individual pixels.
In a calculation formula of the precipitation coefficient, specifically, the precipitation coefficient of the target area is represented by superposing the gray level information of the target area in the gas image to be analyzed and the weighted gray level information of the reference area corresponding to the target area in the comparison gas image; the practitioner can also combine the gray information of the target area and the corresponding reference area through multiplication or exponential function form to comprehensively evaluate the precipitation coefficient of the target area.
And changing the target area and the gas image to be analyzed, and acquiring the precipitation coefficient of each area in the gas image corresponding to the target mode and the reference mode, wherein the precipitation coefficient of the area is the precipitation coefficient of each pixel point in the area, and the precipitation characteristics corresponding to each pixel point are reflected. And gray scale enhancement is carried out on the corresponding gas image to be analyzed under the target mode and the reference mode respectively through the precipitation coefficient, so that the precipitation characteristics in the gas image are more obvious, and the subsequent extraction is facilitated.
Preferably, in one embodiment of the present invention, the method for gray scale enhancement of an aerial image to be analyzed includes: acquiring an enhanced gray value of a corresponding pixel point according to the precipitation coefficient and gray information of the pixel point in each region in the weather map to be analyzed; and adjusting the gray value of the pixel point in each region in the gas image to be analyzed to be a corresponding enhancement gray value, so as to obtain a gray enhancement image of the gas image to be analyzed. Wherein, the calculation formula of the enhanced gray value comprises:
Wherein, For/>, in the weather map to be analyzedFirst/>, in the target areaEnhanced gray values of the individual pixels; /(I)For/>, in the weather map to be analyzedFirst/>, in the target areaGray values of the individual pixels; /(I)For/>, in the weather map to be analyzedPrecipitation coefficients for the individual target areas; /(I)For/>, in the weather map to be analyzedPrecipitation coefficients for individual zones; /(I)The total number of all areas in the weather map to be analyzed; /(I)Is an inverse hyperbolic tangent function; /(I)To be with natural constant/>Is an exponential function of the base.
The calculation formula of the enhanced gray value,The larger the precipitation coefficient is, the more likely the precipitation coefficient is to be a precipitation region, the larger the corresponding gray level enhancement degree is, and the magnitude level of the precipitation coefficient is amplified by an anti-hyperbolic tangent function method; and then mapping the negative correlation of the gray scale enhancement method into an exponential function for normalization, and multiplying the normalized value by the gray scale value of a corresponding pixel point in the target area to enhance the original gray scale, so that the gray scale characteristics of the precipitation area are more obvious compared with other areas.
In addition, whether the target mode or the reference mode is adopted, weather observation results obtained in the corresponding single numerical mode are possibly deviated due to influence of random errors and other factors, so that accuracy of subsequent forecasting results is affected; therefore, after the enhanced gas image diagrams in the single numerical mode of the target mode and the reference mode are respectively acquired, the fused gas image diagram of the enhanced target gas image diagram and the enhanced reference gas image diagram obtained at the same observation time is acquired at the same time and is used as a fused reference correction weather information, so that the reference correction of the forecast result in the target mode by the staff of the subsequent weather station is facilitated.
Considering that Data Fusion (Data Fusion) intelligently synthesizes multi-source image Data of the same region, and more accurate, complete and reliable estimation and judgment can be realized relative to a single information source, a weighted Fusion method is specifically used in one embodiment of the invention to fuse a target gas image and a reference gas image at the same observation time to obtain a fused gas image. In other embodiments of the present invention, the practitioner may also acquire the fused weather map by using other fusion methods such as K-T transformation and HIS transformation according to the specific implementation situation; it should be noted that the data fusion operation is a well-known prior art for those skilled in the art, and will not be described herein.
After the enhanced target gas image, the reference gas image and the fusion gas image thereof are obtained, the dewatering areas of all the gas images can be extracted through a neural network model. In one embodiment of the invention, a convolutional neural network (Convolutional Neural Network, CNN) is specifically adopted to extract the precipitation characteristic areas in all the weather patterns; firstly, preparing a part of target gas image, a reference gas image and a fusion gas image thereof in advance, and mixing four gas image mixtures of radar gas image acquired by a weather observation station to respectively form a training set, a verification set and a test set of a model, wherein the convolution kernel size of a network model convolution layer is set as followsDuring training, a cross entropy loss function is selected to guide model training, an Adam algorithm is utilized to conduct model training optimization, and training is stopped when model loss converges in the iterative optimization process; and inputting all the enhanced target gas image, the reference gas image and the fusion gas image thereof which are obtained in a preset historical time period of the moment to be forecasted and the radar gas image obtained by the weather observation station into a trained convolutional neural network model, and obtaining the rainfall area characteristic images of all the gas image. It should be noted that, the training and the application of the convolutional neural network are well known in the art, and are not described herein.
And step S3, acquiring an aggregate precipitation correction forecast in a preset future time period at a moment to be forecasted based on a weather forecast model by combining the target gas image, the reference gas image and the characteristic images of the precipitation areas corresponding to the fusion gas image with the characteristic images of the precipitation areas of the radar gas image.
After the rainfall area characteristic diagrams of various weather diagrams are obtained, the target weather diagram, the reference weather diagram and the corresponding fusion weather diagram in the historical period are considered to have time sequence, and the corresponding rainfall area characteristic diagram also has time sequence, so that the future rainfall characteristic diagram can be predicted according to the time sequence, and meanwhile, the radar weather diagram obtained by the observation station can be used as the reference check weather information of each observation time so as to improve the accuracy of weather prediction.
Preferably, in one embodiment of the present invention, the method for acquiring the aggregate precipitation correction forecast includes:
Respectively acquiring a precipitation time sequence frame image sequence of a target gas image, a reference gas image, a fusion gas image and a radar gas image corresponding to a precipitation area characteristic image in a preset historical time period of a moment to be forecasted;
Constructing training data of a first weather forecast model according to a precipitation time sequence frame diagram sequence of a target weather diagram and a radar weather diagram, wherein each training sample in the training data is a subsequence with the same time sequence length as a preset future time period in the precipitation time sequence frame diagram sequence of the target weather diagram, and the label of each training sample is a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequence of the target weather diagram and a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequence of the radar weather diagram;
constructing training data of a second weather prediction model according to the precipitation time sequence frame diagram sequences of the fusion weather diagram and the radar weather diagram, wherein each training sample in the training data is a subsequence with the same time sequence length as a preset future time period in the precipitation time sequence frame diagram sequences of the fusion weather diagram, and the label of each training sample is a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the fusion weather diagram and a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the radar weather diagram;
Constructing training data of a third weather prediction model according to the precipitation time sequence frame diagram sequences of the reference weather diagram and the radar weather diagram, wherein each training sample in the training data is a subsequence with the same time sequence length as a preset future time period in the precipitation time sequence frame diagram sequences of the reference weather diagram, and the label of each training sample is a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the reference weather diagram and a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the radar weather diagram;
In the training data of each weather prediction model, the time corresponding to the tail frame of the sub-sequence corresponding to the tag is different from the time corresponding to the tail frame of the sub-sequence corresponding to the training sample by the time sequence length of a preset future time period in the positive direction of the time sequence; training the first weather prediction model by using training data of the first weather prediction model, training the second weather prediction model by using training data of the second weather prediction model, and training the third weather prediction model by using training data of the third weather prediction model, thereby obtaining a trained first weather prediction model, a trained second weather prediction model and a trained third weather prediction model;
Taking the time to be predicted as the tail frame time, and respectively acquiring subsequences with the same time sequence length as the preset future time period from precipitation time sequence frame image sequences of the target gas image, the reference gas image and the fusion gas image as target subsequences to be predicted, reference subsequences to be predicted and fusion subsequences to be predicted; inputting a target subsequence to be predicted into a trained first meteorological prediction network model, and obtaining target precipitation prediction of a moment to be predicted in a future preset time period; inputting the fusion subsequence to be predicted into a trained second weather prediction network model, and obtaining fusion precipitation prediction of the moment to be predicted in a future preset time period; taking the target precipitation forecast and the fusion precipitation forecast as the aggregate precipitation correction forecast; when the target subsequence to be predicted is missing, inputting the reference subsequence to be predicted into a trained third weather prediction network model, obtaining a reference precipitation forecast of the moment to be predicted in a future preset time period, and taking the reference precipitation forecast as an aggregate precipitation correction forecast.
It should be noted that, the embodiment of the present invention aims at acquiring the weather information of the correction and forecast of the precipitation from hour to hour within the future 0 to 72 hours, so that the cyclic neural network (Recurrent Neural Network, RNN) is used as the network model of the weather forecast for training, and since the training process and application of the RNN are both the prior art well known to those skilled in the art, and since the training process of each weather forecast model is consistent, the acquiring steps of the training data set are also consistent, so that only the training process of the first weather forecast model in the above preferred embodiment of the present invention for acquiring the correction and forecast of the aggregate precipitation is briefly described herein;
the training process of the first meteorological prediction model is as follows:
In a precipitation time sequence frame diagram sequence of a characteristic weather map of a precipitation area corresponding to a target weather map, starting from the current observation time, acquiring subsequences in a historical 72-hour time period corresponding to the current observation time along the time sequence reverse direction, taking the subsequence of each observation time as a training sample of a first weather prediction model, taking the subsequence of the precipitation frame diagram corresponding to the time period 72 hours after the observation time as a label of the training sample, and taking the subsequence of the precipitation frame diagram corresponding to the time period 72 hours after the observation time as a label of the training sample in the precipitation time sequence frame diagram sequence of a radar weather map.
Training the training sample as input data of a first weather forecast model, wherein the first weather forecast model gradually adjusts model parameters of the first weather forecast model according to corresponding labels so as to expect more accurate output results, and thus a trained first weather forecast model is obtained. In the process of training the first weather forecast model, the first weather forecast model can learn the precipitation characteristic information of the target weather map and the precipitation characteristic information of the radar weather map at the same time, the precipitation characteristic information of the target weather map and the radar weather map are observed on the same weather observation area at the same time, and the precipitation characteristics of the target weather map and the radar weather map are similar, so that the first weather forecast model obtained after training can forecast the precipitation characteristic of the target weather map similar to the precipitation characteristic of the radar weather map, and the system error of fusion of a single numerical model or a multi-mode numerical model is reduced by combining the radar of the weather station, and the forecast accuracy of the weather forecast model is effectively improved.
Because the training process of the second weather prediction model and the third weather prediction model is consistent with the training process of the first weather prediction model, the obtaining steps and ideas of the training data set are also consistent, and therefore the training process of the second weather prediction model and the third weather prediction model is not repeated here.
Respectively acquiring subsequences within 72 hours of history with the moment to be predicted from precipitation time sequence frame sequences of a target gas image, a reference gas image and a fusion gas image as target subsequences to be predicted, reference subsequences to be predicted and fusion subsequences to be predicted; and inputting the target subsequence to be predicted into a trained first meteorological prediction model, wherein the observation time interval is set to be observed once per hour, the final output is also the predicted target precipitation prediction per hour within 72 hours in the future, the target precipitation prediction is also a precipitation time sequence frame image subsequence, and each frame image represents a predicted precipitation area characteristic image corresponding to each hour in the future. Similarly, inputting the fusion subsequence to be predicted into a trained second weather prediction model to obtain a time-by-time fusion precipitation forecast of the moment to be predicted within 72 hours in the future; and taking the target precipitation forecast and the fusion precipitation forecast as a set precipitation correction forecast to assist weather staff in outputting the final precipitation forecast. And meanwhile, a replacement scheme is needed, and when the target subsequence to be predicted is lost due to random factors so that the prediction cannot be input at the time of prediction, the reference subsequence to be predicted is used as a replacement to be input into a trained third weather prediction model for prediction so as to output the prediction.
The collected rainfall correction forecast at the moment to be predicted is obtained, and the weather station staff can be further assisted to perform corresponding weather analysis and forecast.
In summary, the method comprises the steps of firstly obtaining the weather patterns of the weather observation area in different numerical modes, obtaining the radar weather patterns of the weather observation area through the observation station as a reference verification basis for subsequent model training, and reducing the error of a weather observation system in the numerical mode; dividing the weather patterns under different numerical modes into areas so as to facilitate the subsequent judgment of precipitation areas; and then extracting precipitation characteristics under different numerical modes and fusion conditions according to gray texture characteristics in the gas image, and further obtaining accurate and comprehensive integrated precipitation correction forecast according to the target gas image, the reference gas image and the precipitation area characteristic images corresponding to the fusion gas image by combining the precipitation area characteristic images of the radar gas image as a verification basis. According to the invention, the texture characteristics of the precipitation area in the image and the radar weather map acquired by the weather observation station are combined, so that the feature extraction accuracy is accurately improved, the system error of a numerical mode is reduced through the verification data of the radar weather map as model training, and the precipitation forecast in different numerical modes and the precipitation forecast set forecast integrated in a multi-mode are mutually supplemented and corrected, so that the precipitation forecast accuracy is improved.
It should be noted that: the sequence of the embodiments of the present invention is only for description, and does not represent the advantages and disadvantages of the embodiments. The processes depicted in the accompanying drawings do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments.

Claims (9)

1. A method for correcting precipitation forecast data fusion based on multiple modes, which is characterized by comprising the following steps:
Acquiring a target weather image in a target mode, a reference weather image in a reference mode and a radar weather image of an observation station of a weather observation area according to a preset weather observation time interval in a preset historical time period of a moment to be forecasted;
Taking any one of the target gas image and the reference gas image as a gas image to be analyzed, and the other gas image as a comparison gas image, and carrying out region division on the gas image to be analyzed; taking any region in the gas image to be analyzed as a target region, and acquiring a reference region of the target region in the comparison gas image and a reference weight of the reference region according to region similar characteristics of each region in the target region and the comparison gas image; acquiring a precipitation coefficient of the target area according to the gray information of the target area, the gray information of the reference area and the reference weight; carrying out gray enhancement on the weather image to be analyzed according to the precipitation coefficient; acquiring a fusion gas image of the enhanced target gas image and the enhanced reference gas image at the same moment; acquiring a characteristic map of a dewatering area of all the gas image maps based on a neural network model;
acquiring an aggregate precipitation correction forecast in a preset future time period at a moment to be forecasted based on a weather forecast model by combining the target weather map, the reference weather map and a characteristic map of a precipitation zone of a corresponding fusion weather map with the characteristic map of the precipitation zone of the radar weather map;
the acquisition method of the aggregate precipitation correction forecast comprises the following steps:
Respectively acquiring a precipitation time sequence frame image sequence of a target gas image, a reference gas image, a fusion gas image and a radar gas image corresponding to a precipitation area characteristic image in a preset historical time period of a moment to be forecasted;
Constructing training data of a first weather forecast model according to a precipitation time sequence frame diagram sequence of a target weather diagram and a radar weather diagram, wherein each training sample in the training data is a subsequence with the same time sequence length as a preset future time period in the precipitation time sequence frame diagram sequence of the target weather diagram, and the label of each training sample is a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequence of the target weather diagram and a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequence of the radar weather diagram;
constructing training data of a second weather prediction model according to the precipitation time sequence frame diagram sequences of the fusion weather diagram and the radar weather diagram, wherein each training sample in the training data is a subsequence with the same time sequence length as a preset future time period in the precipitation time sequence frame diagram sequences of the fusion weather diagram, and the label of each training sample is a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the fusion weather diagram and a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the radar weather diagram;
Constructing training data of a third weather prediction model according to the precipitation time sequence frame diagram sequences of the reference weather diagram and the radar weather diagram, wherein each training sample in the training data is a subsequence with the same time sequence length as a preset future time period in the precipitation time sequence frame diagram sequences of the reference weather diagram, and the label of each training sample is a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the reference weather diagram and a subsequence with the same time sequence length as the training sample in the precipitation time sequence frame diagram sequences of the radar weather diagram;
In the training data of each weather prediction model, the time corresponding to the tail frame of the sub-sequence corresponding to the tag is different from the time corresponding to the tail frame of the sub-sequence corresponding to the training sample by the time sequence length of a preset future time period in the positive direction of the time sequence; training the first weather prediction model by using training data of the first weather prediction model, training the second weather prediction model by using training data of the second weather prediction model, and training the third weather prediction model by using training data of the third weather prediction model, thereby obtaining a trained first weather prediction model, a trained second weather prediction model and a trained third weather prediction model;
taking the time to be predicted as the tail frame time, and respectively acquiring subsequences with the same time sequence length as the preset future time period from precipitation time sequence frame image sequences of the target gas image, the reference gas image and the fusion gas image as target subsequences to be predicted, reference subsequences to be predicted and fusion subsequences to be predicted; inputting a target subsequence to be predicted into a trained first meteorological prediction network model, and obtaining target precipitation prediction of a moment to be predicted in a future preset time period; inputting the fusion subsequence to be predicted into a trained second weather prediction network model, and obtaining fusion precipitation prediction of the moment to be predicted in a future preset time period; taking the target precipitation forecast and the fusion precipitation forecast as a collection precipitation correction forecast; when the target subsequence to be predicted is missing, inputting the reference subsequence to be predicted into a trained third weather prediction network model, obtaining a reference precipitation forecast of the moment to be predicted in a future preset time period, and taking the reference precipitation forecast as an aggregate precipitation correction forecast.
2. The method for correcting the fusion of rainfall forecast data based on multiple modes according to claim 1, wherein the area dividing method comprises the following steps:
In the to-be-analyzed weather map, acquiring an edge coefficient of each pixel according to the gray level difference between the pixel and other pixels in the neighborhood, and dividing each pixel into edge pixels or non-edge pixels according to the edge coefficient; acquiring all closed edges formed by edge pixel points in the gas image to be analyzed; and taking all the areas corresponding to the non-edge pixel points in each closed edge as one area in the to-be-analyzed weather map.
3. The method for correcting the fusion of rainfall forecast data based on the multiple modes according to claim 2, wherein the calculation formula of the edge coefficient comprises:
; wherein/> For/>, within the meteorological graph to be analyzedEdge coefficients of the individual pixel points; /(I)For/>, within the meteorological graph to be analyzedGray values of the individual pixels; /(I)For/>, within the meteorological graph to be analyzedEighth-neighborhood of pixel points/>Gray values of the individual pixels; /(I)A preset gray threshold value is set; /(I)Is Ai Fosen brackets.
4. The method for correcting the fusion of rainfall forecast data based on the multiple modes according to claim 1, wherein the method for acquiring the reference areas and the corresponding reference weights comprises the following steps:
acquiring the reference weight of each region of the target region in the contrast weather map according to a calculation formula of the reference weight; taking the region with the largest reference weight as a reference region of the target region; the calculation formula of the reference weight is as follows:
; wherein/> For comparison of the first/>, in the aerial imageThe first/second of the regions in the weather map to be analyzedReference weights for the individual target areas; /(I)For/>, in the weather map to be analyzedThe area occupation ratio of each target area in the gas image to be analyzed; /(I)For centroid and/>, in the weather map to be analyzedThe number of the region closest to the centroid of the individual target region; /(I)For/>, in the weather map to be analyzedThe area ratio of each area in the gas image to be analyzed; /(I)For comparison of the first/>, in the aerial imageThe area ratio of each area in the contrast weather map; /(I)To compare centroid with the/>The number of the region whose centroid is closest to the center of mass of the individual region; /(I)For comparison of the first/>, in the aerial imageThe area ratio of the individual areas in the comparative weather map.
5. The method for correcting the fusion of rainfall forecast data based on the multiple modes according to claim 1, wherein the calculation formula of the rainfall coefficient comprises:
; wherein/> For/>, in the weather map to be analyzedPrecipitation coefficients for the individual target areas; /(I)For/>, in the weather map to be analyzedGray standard deviation of all pixel points in each target area; /(I)For/>, in the weather map to be analyzedThe number of all pixel points in the target areas; /(I)For/>, in the weather map to be analyzedSequence numbers of pixel points in the target areas; /(I)For/>, in the weather map to be analyzedFirst/>, in the target areaGray values of the individual pixels; /(I)For/>, in the weather map to be analyzedThe reference areas of the target areas correspond to the reference weights; /(I)For/>, in the weather map to be analyzedGray standard deviation of all pixel points in a reference area of each target area; /(I)The number of all pixel points in the reference area; /(I)The serial numbers of pixel points in a reference area in the contrast weather map; /(I)For/>, in the weather map to be analyzedFirst/>, in reference area in comparative weather map of each target areaGray values of individual pixels.
6. The method for correcting the fusion of rainfall forecast data based on the multiple modes according to claim 1, wherein the method for gray scale enhancement of the weather map to be analyzed comprises the following steps:
Acquiring an enhanced gray value of a corresponding pixel point according to the precipitation coefficient and gray information of the pixel point in each region in the gas image to be analyzed; and adjusting the gray value of the pixel point in each region in the gas image to be analyzed to correspond to the enhanced gray value, so as to obtain a gray enhanced image of the gas image to be analyzed.
7. The method for correcting the fusion of rainfall forecast data based on the multiple modes according to claim 6, wherein the calculation formula of the enhanced gray value comprises:
; wherein/> For/>, in the weather map to be analyzedFirst/>, in the target areaEnhanced gray values of the individual pixels; /(I)For/>, in the weather map to be analyzedFirst/>, in the target areaGray values of the individual pixels; /(I)For/>, in the weather map to be analyzedPrecipitation coefficients for the individual target areas; /(I)For/>, in the weather map to be analyzedPrecipitation coefficients for individual zones; /(I)The total number of all areas in the weather map to be analyzed; /(I)Is an inverse hyperbolic tangent function; /(I)To be with natural constant/>Is an exponential function of the base.
8. The method for fusion correction of rainfall forecast data based on multiple modes according to claim 1, wherein the method for obtaining the fusion weather map comprises the following steps:
And at each meteorological observation time, fusing the target aerial image after gray level enhancement with the reference aerial image after gray level enhancement through a weighted fusion algorithm to obtain a fused aerial image.
9. The method for correcting the fusion of rainfall forecast data based on the multiple modes according to claim 1, wherein the neural network model for acquiring the characteristic map of the rainfall area is a convolutional neural network.
CN202410345983.6A 2024-03-26 2024-03-26 Multi-mode-based precipitation prediction data fusion correction method Active CN117950088B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410345983.6A CN117950088B (en) 2024-03-26 2024-03-26 Multi-mode-based precipitation prediction data fusion correction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410345983.6A CN117950088B (en) 2024-03-26 2024-03-26 Multi-mode-based precipitation prediction data fusion correction method

Publications (2)

Publication Number Publication Date
CN117950088A CN117950088A (en) 2024-04-30
CN117950088B true CN117950088B (en) 2024-06-04

Family

ID=90803267

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410345983.6A Active CN117950088B (en) 2024-03-26 2024-03-26 Multi-mode-based precipitation prediction data fusion correction method

Country Status (1)

Country Link
CN (1) CN117950088B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10227872A (en) * 1996-12-13 1998-08-25 Nippon Telegr & Teleph Corp <Ntt> Meteorological image forecast method, equipment and record medium recording meteorological image forecast program
WO2020103677A1 (en) * 2018-11-21 2020-05-28 国网青海省电力公司 Method and device for processing meteorological element data of numerical weather prediction
KR20220095624A (en) * 2020-12-30 2022-07-07 주식회사 스튜디오엑스코 A system for providing rainfall probability information using meteorological image information provided by the Meteorological Agency based on machine learning and a method using the same
KR102434578B1 (en) * 2021-07-09 2022-08-22 한국과학기술정보연구원 Method for predicting rainfall based on artificial intelligence and apparatus implementing the same method
CN115113301A (en) * 2022-08-23 2022-09-27 南京信息工程大学 Emergency short-term forecasting method and system based on multi-source data fusion
CN115544889A (en) * 2022-10-18 2022-12-30 南京信息工程大学 Numerical mode precipitation deviation correction method based on deep learning
CN116679355A (en) * 2023-05-09 2023-09-01 中国科学院重庆绿色智能技术研究院 Precipitation prediction correction method based on cascade Attention-U-Net
CN116931129A (en) * 2023-07-25 2023-10-24 南京信息工程大学 Short-term precipitation prediction method, device, equipment and medium based on multi-mode set
CN117593657A (en) * 2023-11-15 2024-02-23 上海数喆数据科技有限公司 Method and system for processing refined weather forecast data and readable storage medium

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10227872A (en) * 1996-12-13 1998-08-25 Nippon Telegr & Teleph Corp <Ntt> Meteorological image forecast method, equipment and record medium recording meteorological image forecast program
WO2020103677A1 (en) * 2018-11-21 2020-05-28 国网青海省电力公司 Method and device for processing meteorological element data of numerical weather prediction
KR20220095624A (en) * 2020-12-30 2022-07-07 주식회사 스튜디오엑스코 A system for providing rainfall probability information using meteorological image information provided by the Meteorological Agency based on machine learning and a method using the same
KR102434578B1 (en) * 2021-07-09 2022-08-22 한국과학기술정보연구원 Method for predicting rainfall based on artificial intelligence and apparatus implementing the same method
CN115113301A (en) * 2022-08-23 2022-09-27 南京信息工程大学 Emergency short-term forecasting method and system based on multi-source data fusion
CN115544889A (en) * 2022-10-18 2022-12-30 南京信息工程大学 Numerical mode precipitation deviation correction method based on deep learning
CN116679355A (en) * 2023-05-09 2023-09-01 中国科学院重庆绿色智能技术研究院 Precipitation prediction correction method based on cascade Attention-U-Net
CN116931129A (en) * 2023-07-25 2023-10-24 南京信息工程大学 Short-term precipitation prediction method, device, equipment and medium based on multi-mode set
CN117593657A (en) * 2023-11-15 2024-02-23 上海数喆数据科技有限公司 Method and system for processing refined weather forecast data and readable storage medium

Also Published As

Publication number Publication date
CN117950088A (en) 2024-04-30

Similar Documents

Publication Publication Date Title
US20220043182A1 (en) Spatial autocorrelation machine learning-based downscaling method and system of satellite precipitation data
CN113449594B (en) Multilayer network combined remote sensing image ground semantic segmentation and area calculation method
CN109946762B (en) Method and system for short-time rainfall forecast based on probability distribution
CN116758059B (en) Visual nondestructive testing method for roadbed and pavement
CN111983732B (en) Rainfall intensity estimation method based on deep learning
CN110555841B (en) SAR image change detection method based on self-attention image fusion and DEC
CN111638565A (en) Method for monitoring rainstorm in mountainous area
CN110826693A (en) Three-dimensional atmospheric temperature profile inversion method and system based on DenseNet convolutional neural network
CN114445634A (en) Sea wave height prediction method and system based on deep learning model
CN110826509A (en) Grassland fence information extraction system and method based on high-resolution remote sensing image
CN116308963B (en) Government affair data analysis method and system
CN116484189A (en) ERA5 precipitation product downscaling method based on deep learning
CN114418211A (en) Method and device for correcting precipitation
CN114648705A (en) Carbon sink monitoring system and method based on satellite remote sensing
CN115237896A (en) Data preprocessing method and system for forecasting air quality based on deep learning
CN111242028A (en) Remote sensing image ground object segmentation method based on U-Net
CN117950088B (en) Multi-mode-based precipitation prediction data fusion correction method
CN117391139A (en) Weather phenomenon prediction correction method based on improved UNet neural network
CN112116569A (en) Photovoltaic power station power generation power prediction method based on shadow recognition
CN114723330B (en) Vegetation change influence factor evaluation method based on structural equation model
CN114419443A (en) Automatic remote-sensing image cultivated land block extraction method and system
CN113344290A (en) Correcting method for sub-season rainfall weather forecast based on U-Net network
CN112149536A (en) Squall line wind speed prediction method
CN117671181B (en) Method and system for constructing smart city contracture map based on big data
CN117079197B (en) Intelligent building site management method and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant