CN117853949B - Deep learning method and system for identifying cold front by using satellite cloud image - Google Patents

Deep learning method and system for identifying cold front by using satellite cloud image

Info

Publication number
CN117853949B
CN117853949B (application CN202410256898.2A)
Authority
CN
China
Prior art keywords
cloud
cold front
image
850hpa
satellite
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410256898.2A
Other languages
Chinese (zh)
Other versions
CN117853949A (en)
Inventor
秦育婧
刘倩
卢楚翰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Information Science and Technology
Original Assignee
Nanjing University of Information Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Information Science and Technology filed Critical Nanjing University of Information Science and Technology
Priority to CN202410256898.2A priority Critical patent/CN117853949B/en
Publication of CN117853949A publication Critical patent/CN117853949A/en
Application granted granted Critical
Publication of CN117853949B publication Critical patent/CN117853949B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A deep learning method and system for identifying cold fronts from satellite cloud images includes: obtaining meteorological data, calculating the 850 hPa temperature advection, and drawing the satellite cloud image and meteorological element maps; preprocessing the satellite cloud image; composing RGB images for cold front identification from the preprocessed cloud image, the sea level pressure map and the 850 hPa temperature advection map; constructing a cold front label set from the meteorological data, the 850 hPa temperature advection and the preprocessed cloud image; and inputting the RGB image data set and the cold front label set into a DETR model for training and testing to obtain cold front identification results. The invention achieves good automatic identification results and identifies cold fronts directly from images. It supports the automation of cloud-image-based cold front identification in modern weather forecasting services, improves the accuracy of cold front position and shape identification, and provides a reference for forecasting services.

Description

Deep learning method and system for identifying cold front by using satellite cloud image
Technical Field
The invention relates to the technical field of automatic weather system identification, in particular to a deep learning method and system for identifying a cold front by using a satellite cloud image.
Background
Traditional front identification mainly relies on the analysis of various meteorological elements, and unconventional data such as satellite and radar observations have given forecasters new insight into frontal structure, frontal weather, and frontal analysis and forecasting. Meteorological satellite cloud imagery is an important tool for identifying fronts: cloud band boundaries and the density of cloud systems help locate a front accurately. Yet although satellite imagery has great advantages in temporal and spatial continuity, cloud images are still used relatively little in front identification.
In operational weather forecasting, front analysis combined with cloud imagery still depends largely on manual work and mostly focuses on individual typical processes. Current automatic front identification methods fall into two main categories. The first comprises objective identification algorithms based on various meteorological elements, including the mainstream thermodynamic method, the wind-vector method, and so on. The second uses deep learning models; however, most existing front identification models use CNN networks, some probability-based prediction methods may reduce the accuracy of frontal shape and position, and schemes that identify fronts directly from pictures need to be developed; for example, a Mask R-CNN based cold front identification scheme for the Eurasian continent has been proposed. None of these solutions, however, considers cloud elements or produces a long-term satellite cloud-image cold front data set. If a deep learning model can solve the problem of front identification from satellite cloud images, cloud elements can be added to automatic front identification schemes and a long-term data set of cloud-image cold fronts can be obtained, which benefits both cold front forecasting services and climatological research on cold fronts.
In addition, when identifying fronts from satellite cloud images, the cloud areas must first be segmented from the surface background to remove interference from land or water. Current satellite cloud image segmentation and classification methods fall into two main categories: the first includes thresholding, edge detection, region segmentation and similar techniques; the second is based on machine learning or deep learning models. In cold front identification from cloud images, however, cloud image preprocessing is only a preliminary step for producing the RGB images, so the primary goal of this step is to accomplish it quickly and efficiently. A faster and more efficient cloud image preprocessing scheme is therefore also needed.
Disclosure of Invention
The technical problem the invention aims to solve is to provide a deep learning method and system for identifying cold fronts from satellite cloud images: FY-2 infrared cloud images are preprocessed, other meteorological elements are superimposed to generate an RGB image data set, and the RGB image data set and a cold front label set are input into a deep learning model for training and testing. This realizes a deep learning scheme for identifying cold fronts from satellite cloud images, yields a long-term satellite cloud-image cold front data set, improves the accuracy of cold front position and shape identification, and provides a reference for cold front forecasting services.
The invention adopts the following technical scheme for solving the technical problems:
the invention provides a deep learning method for identifying cold front by using satellite cloud images, which comprises the following steps:
S1, acquiring meteorological data, calculating the 850 hPa temperature advection, and drawing the Fengyun-2 (FY-2) meteorological satellite infrared cloud image and meteorological element maps.
S2, preprocessing the FY-2 meteorological satellite infrared cloud image.
S3, directly reading the preprocessed FY-2 infrared cloud image, the sea level pressure map and the 850 hPa temperature advection map as gray-scale images with Python's OpenCV library, placing them in the R, G and B channels of an RGB image respectively, and stacking them into an RGB image data set.
S4, constructing a cold front label set from the meteorological data and 850 hPa temperature advection of step S1 and the preprocessed FY-2 infrared cloud images of step S2.
S5, generating a cold front identification data set from the RGB image data set of step S3 and the cold front label set of step S4, and inputting it into a DETR model for training and testing to obtain cold front identification results.
Further, in step S1, the drawing of the image includes the sub-steps of:
S101, selecting the sea level pressure field, the 10 m surface wind field, and the 850 hPa temperature and wind fields, setting the spatial resolution, downloading NetCDF-format ERA-5 data and HDF-format FY-2 full-disk nominal image files, selecting the region 40°-160°E, 10°-66°N, and calculating the 850 hPa temperature advection with the formula:
-V·∇T = -(u·∂T/∂x + v·∂T/∂y)
where u denotes the 850 hPa zonal wind, v denotes the 850 hPa meridional wind, V is the resultant wind of u and v, x denotes the zonal direction, y denotes the meridional direction, T denotes the smoothed 850 hPa temperature, and ∇T denotes the gradient of T.
S102, drawing the FY-2 infrared cloud image, the sea level pressure map and the 850 hPa temperature advection map with a single colormap.
Further, in step S2, the preprocessing includes the following sub-steps:
S201, dividing the FY-2 infrared cloud images from step S102 into a training set and a validation set at a set ratio, producing cloud-region labels with Python's LabelMe tool, generating the corresponding annotation files and a file describing the number of object classes in the label set, setting the number of classes to one, obtaining an infrared cloud-image cloud-region data set, and storing it in VOC format.
S202, training and testing a U-Net model to perform preliminary segmentation of the cloud image. Specifically: set the data set classes, loss function, batch size and learning rate, train the U-Net model from pre-trained weights until the loss function curve converges, save the weight file, and input the infrared cloud-image cloud-region data set of step S201 into the trained U-Net model to obtain a preliminary segmentation of the cloud image.
S203, reading the preliminary segmentation map and labelling each independent cloud region in it. An independent cloud region is determined as follows: traverse each grid point in the preliminary segmentation map; if a neighbouring grid point also has a non-zero gray value, treat the two points as connected and mark them with the same value; keep expanding until all non-zero grid points around them are marked. The connected region whose interior grid points all carry the same value is taken as one independent cloud region.
S204, reading the gray-scale satellite cloud image and the cloud-region block file, selecting a time, reading the position of each independent cloud region, mapping it to the grid points of the gray-scale cloud image, extracting the maximum gray value and the mean gray value of each independent cloud region as the region's representative gray values, and obtaining the screening weight from the numerical range of the gray value and a seasonal reference value.
S205, processing the images according to the screening weight and re-assembling the cloud regions to obtain the cloud image preprocessing result. Specifically: for a selected time, reset to 0 the grid points in each independent cloud region whose gray value is below the screening weight, keep the original pixel values of the remaining grid points, and place all independent cloud regions of that time back into one image at their original positions to obtain the preprocessed infrared cloud image.
Further, in step S4, the making of the cold front label set includes the following sub-steps:
S401, using the ERA-5 sea level pressure field, 10 m surface wind field, and 850 hPa temperature and wind fields, drawing with Python's Matplotlib library a cold front analysis element map for each RGB image time, containing the 850 hPa temperature advection, the 850 hPa temperature, the mean sea level pressure and the 10 m surface wind field.
S402, drawing thick cold front lines directly on the cold front analysis element map, using both the analysis element map and the preprocessed cloud image. Specifically: in the cold front analysis element map and the preprocessed cloud image, find the grid points that correspond to a cold front cloud system aloft, lie ahead of the 850 hPa cold advection area, coincide with the maximum cyclonic curvature of the isobars on the surface map, show the 10 m surface wind turning from northerly to southerly, and run roughly parallel to the 850 hPa isotherms; mark them as cold front grid points and connect them into an independent, complete thick cold front line.
S403, extracting the thick cold front lines, setting the grid points identified as cold front to white and all other grid points to black, to obtain a black-and-white binary image.
S404, reading the black-and-white binary image for each selected time, traversing all grid points, treating each connected region as one independent front line, extracting the position of each independent thick cold front line, splitting each independent front line into its own picture at the corresponding position, processing the pictures into black-and-white binary images, and naming the cold front label pictures according to the COCO format.
Further, in step S5, the obtaining the cold front identification result includes the following sub-steps:
S501, constructing a cold front identification data set in COCO format. Specifically: randomly split the RGB image data set of step S3 into a training set and a validation set at a set ratio, generate the corresponding annotation JSON files from the cold front label set, and combine the training set, validation set and annotation files into the COCO-format cold front identification data set.
S502, inputting the COCO-format cold front identification data set into the DETR model, setting the data set classes, loss function, batch size, learning rate and weight decay, training on a GPU from pre-trained weights, generating prediction boxes where a cold front may exist, and completing object detection; then setting frozen weights, adding a mask head on top of the detection output, and continuing training to complete the segmentation.
S503, testing the RGB images generated for selected times to obtain the cold front identification results. Specifically: train the DETR (DEtection TRansformer) model until the loss function curve converges, stop, save the weights, set the parameter weights, and input the RGB images generated for the selected times into the trained DETR model to obtain the cold front identification results.
Furthermore, the invention also provides a deep learning system for identifying cold fronts by using satellite cloud images, which comprises:
A data acquisition module, used for acquiring meteorological data, calculating the 850 hPa temperature advection, and drawing the FY-2 meteorological satellite infrared cloud image and meteorological element maps.
An image preprocessing module, used for preprocessing the FY-2 infrared cloud image from the data acquisition module.
An RGB image data set acquisition module, used for reading the preprocessed FY-2 infrared cloud image, the meteorological element map and the 850 hPa temperature advection map as gray-scale images, placing them in the R, G and B channels of an RGB image respectively, and stacking them into an RGB image data set.
A cold front label set construction module, used for constructing the cold front label set from the meteorological data and 850 hPa temperature advection in the data acquisition module and the preprocessed FY-2 infrared cloud image in the image preprocessing module.
A cold front identification module, used for inputting the RGB image data set from the RGB image data set acquisition module and the cold front label set from the cold front label set construction module into the DETR model for training and testing, to obtain the cold front identification results.
Furthermore, the invention also provides an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the steps of the deep learning method for identifying cold front by using satellite cloud images when executing the computer program.
Further, the invention also provides a computer readable storage medium, wherein the computer readable storage medium stores a computer program, and the computer program is executed by a processor to execute the deep learning method for identifying cold front by using satellite cloud images.
Compared with the prior art, the invention, by adopting the above technical scheme, has the following remarkable technical effects:
The invention realizes deep-learning-based cold front identification from satellite cloud images on the basis of cloud image preprocessing: the satellite cloud image is first preprocessed and placed into an RGB image, and the deep learning model DETR is then used for cold front segmentation. This yields good automatic identification results, supports the automation of cloud-image-based cold front identification in modern weather forecasting services, and provides a reference for forecasting services.
Drawings
FIG. 1 is a flow chart of an overall implementation of the present invention.
FIG. 2 shows the FY-2 meteorological satellite infrared cloud image, the sea level pressure map and the 850 hPa temperature advection map obtained in the embodiment of the invention.
FIG. 3 is a schematic flow chart of the preprocessing of satellite cloud image according to the present invention.
Fig. 4 is a preliminary segmentation map, a segmentation marker schematic and a preprocessing result obtained in the embodiment of the present invention.
Fig. 5 is an RGB image produced by the three-channel placement method in an embodiment of the invention.
FIG. 6 shows the cold front analysis element map, the drawn thick cold front lines and the extracted thick cold front lines obtained in the embodiment of the invention.
Fig. 7 is a diagram of a cold front label set and corresponding picture naming obtained in an embodiment of the present invention.
FIG. 8 is a graph comparing cold front test results obtained in the examples of the present invention with a cold front label set.
FIG. 9 shows the correspondence between the cold front test results obtained in the embodiment of the invention and the FY-2 infrared cloud image and the 850 hPa and surface element fields.
Detailed Description
In order to describe the technical content, constructional features, achieved objects and effects of the present invention in detail, the present invention is described in detail below with reference to the accompanying drawings and examples.
In order to achieve the above objective, the present invention provides a deep learning method for identifying cold fronts using satellite cloud images, as shown in FIG. 1, comprising the following steps:
S1, acquiring meteorological data, calculating the 850 hPa temperature advection, and drawing the satellite cloud image and meteorological element maps, as shown in FIG. 2. The specific contents are as follows:
S101, downloading ERA-5 reanalysis data from the European Centre for Medium-Range Weather Forecasts (ECMWF), selecting the sea level pressure field, the 10 m surface wind field, and the 850 hPa temperature and wind fields, setting the spatial resolution to 0.25° × 0.25°, and storing them in NetCDF format; downloading the HDF-format Fengyun-2 (FY-2) full-disk nominal image files provided by the Fengyun Satellite Remote Sensing Data Service Network of the National Satellite Meteorological Centre (http://satellite.nsmc.org.cn); the times are 00:00 UTC and 12:00 UTC daily from 2005 to 2022. Selecting the region 40°-160°E, 10°-66°N, and calculating the 850 hPa temperature advection with the formula:
-V·∇T = -(u·∂T/∂x + v·∂T/∂y)
where u denotes the 850 hPa zonal wind, v denotes the 850 hPa meridional wind, V is the resultant wind of u and v, x denotes the zonal direction, y denotes the meridional direction, T denotes the 850 hPa temperature after 100 passes of five-point smoothing, and ∇T denotes the gradient of T.
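As an illustration, a minimal Python sketch of this temperature-advection calculation is given below. The function and variable names (five_point_smooth, temperature_advection, u850, v850, t850) are assumptions of this sketch, and the finite-difference layout assumes a regular latitude-longitude grid; only the formula -V·∇T and the 100 passes of five-point smoothing come from the text above.

import numpy as np

R_EARTH = 6.371e6  # mean Earth radius in metres

def five_point_smooth(field, passes=100, s=0.5):
    # repeated five-point smoothing of a 2-D lat/lon field, as described above
    f = field.astype(float)
    for _ in range(passes):
        f[1:-1, 1:-1] += s / 4.0 * (
            f[:-2, 1:-1] + f[2:, 1:-1] + f[1:-1, :-2] + f[1:-1, 2:] - 4.0 * f[1:-1, 1:-1]
        )
    return f

def temperature_advection(u850, v850, t850, lat, lon):
    # returns -V·∇T (K per second) on a regular lat/lon grid; arrays are shaped [nlat, nlon]
    t = five_point_smooth(t850)
    dlat = np.deg2rad(np.gradient(lat))                               # radians per index step
    dlon = np.deg2rad(np.gradient(lon))
    dy = R_EARTH * dlat                                               # metres per index step, south-north
    dx = R_EARTH * np.cos(np.deg2rad(lat))[:, None] * dlon[None, :]   # shrinks with latitude
    dT_dlat_idx, dT_dlon_idx = np.gradient(t)                         # derivatives per index step
    dT_dx = dT_dlon_idx / dx                                          # zonal temperature gradient
    dT_dy = dT_dlat_idx / dy[:, None]                                 # meridional temperature gradient
    return -(u850 * dT_dx + v850 * dT_dy)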
S102, drawing the FY-2 infrared cloud image, the sea level pressure map and the 850 hPa temperature advection map with a single colormap.
This yields the FY-2 infrared cloud image shown in FIG. 2(a), the sea level pressure map shown in FIG. 2(b) and the 850 hPa temperature advection map shown in FIG. 2(c). The picture size is [224, 480], the pictures are stored in the three folders CloudImage, slp and Advt respectively, the colormap is gist_yarg_r, and the pictures are saved in JPG format.
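As an illustration, one way to render a single field as a 224 × 480, single-colormap JPG with Matplotlib is sketched below. The folder and colormap names follow the embodiment; the figsize/dpi combination, the use of imshow and the function name save_gray_field are assumptions of this sketch (any rendering that produces a 224 × 480 single-colormap picture serves the same purpose).

import os
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

def save_gray_field(field2d, out_dir, time_tag, width=480, height=224, dpi=100):
    os.makedirs(out_dir, exist_ok=True)
    fig = plt.figure(figsize=(width / dpi, height / dpi), dpi=dpi)
    ax = fig.add_axes([0, 0, 1, 1])            # axes fill the whole canvas, no margins
    ax.axis("off")
    ax.imshow(field2d, cmap="gist_yarg_r", aspect="auto")
    fig.savefig(os.path.join(out_dir, f"{time_tag}.jpg"))
    plt.close(fig)

# e.g. save_gray_field(advt850, "Advt", "2022051312") writes Advt/2022051312.jpg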
S2, preprocessing the FY-2 infrared cloud image, with the workflow shown in FIG. 3. The specific contents are as follows:
S201, the FY-2 infrared cloud images from step S102 are split into a training set and a validation set at a ratio of 8:2, and a train.txt file storing the training-set file names and a test.txt file storing the validation-set file names are generated. Cloud-region labels are produced with Python's LabelMe tool and stored in VOC format. The corresponding annotation files are generated and stored in JSON format, and a class .txt file is generated to describe the object classes in the label set (here a single cloud-region class). The JSON annotation files are converted to the corresponding PNG pictures and stored in the SegmentationClass folder; with the number of classes set to one, the infrared cloud-image cloud-region data set is obtained.
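As an illustration, the 8:2 split and the train.txt / test.txt index files described above could be produced as in the following Python sketch; the directory-name arguments and the shuffling with a fixed seed are assumptions of this sketch.

import os
import random

def split_voc_index(image_dir, out_dir, train_ratio=0.8, seed=0):
    names = sorted(os.path.splitext(f)[0] for f in os.listdir(image_dir) if f.endswith(".jpg"))
    random.Random(seed).shuffle(names)
    n_train = int(len(names) * train_ratio)              # 8:2 split of the cloud images
    os.makedirs(out_dir, exist_ok=True)
    with open(os.path.join(out_dir, "train.txt"), "w") as f:
        f.write("\n".join(names[:n_train]))               # file names of the training set
    with open(os.path.join(out_dir, "test.txt"), "w") as f:
        f.write("\n".join(names[n_train:]))               # file names of the validation set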
S202, setting the following training parameters: num_classes=2, input_shape=[224, 480], Unfreeze_batch_size=2, Init_lr=1e-4, Min_lr=Init_lr*0.01, num_workers=4; training a U-Net model (http://lmb.informatik.uni-freiburg.de/people/ronneber/u-net) on a GPU with 12 GB of memory until the loss function curve converges, then stopping and saving the weight file; inputting the infrared cloud-image cloud-region data set of step S201 into the trained U-Net model to obtain 2174 preliminary segmentation images for 2017-2019, stored in JPG format. FIG. 4(a) shows the preliminary segmentation image for 12:00 UTC on 13 May 2022.
S203, reading the preliminary segmentation map and labelling each independent cloud region in it: traverse each grid point in the preliminary segmentation map; if a neighbouring grid point also has a non-zero gray value, treat the two points as connected and mark them with the same value; keep expanding until all non-zero grid points around them are marked; the connected region whose interior grid points all carry the same value is one independent cloud region. A total of 2174 cloud-region block files are obtained for 2017-2019 and stored in GRD format. As shown in FIG. 4(b), the preliminary segmentation map for 12:00 UTC on 13 May 2022 is divided into 23 independent regions, with different colors representing different cloud segmentation results.
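As an illustration, the connected-region marking of S203 can be reproduced with scipy.ndimage.label, which performs exactly this kind of flood-fill grouping of non-zero neighbours; the choice of 8-connectivity (diagonal neighbours counted as adjacent) is an assumption of this sketch, since the text does not specify it.

import numpy as np
from scipy import ndimage

def label_cloud_regions(seg_map):
    # seg_map: 2-D U-Net output, 0 = clear sky, non-zero = cloud
    structure = np.ones((3, 3), dtype=int)              # 8-connectivity
    labels, n_regions = ndimage.label(seg_map > 0, structure=structure)
    return labels, n_regions                            # labels[i, j] = region id, 0 = background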
S204, reading the gray-scale satellite cloud image and the cloud-region block file, selecting a time, reading the position of each independent cloud region, mapping it to the grid points of the gray-scale cloud image, extracting the maximum gray value and the mean gray value of each independent cloud region as the region's representative gray value (G), and obtaining the screening weight (W) from the numerical range of the gray value and its seasonal reference (SS). The seasonal reference (SS) is 0.8 for December-February, 0.7 for March-April and November, 0.6 for May-June and September-October, and 0.5 for July-August. The screening weight is calculated as W = G·(SS+0.1) when G falls within (230, 255) or [0, 130], W = G·(SS+0.05) when G falls within (205, 230) or (130, 155), and W = G·SS when G falls within (155, 205). Specific values for 12:00 UTC on 13 May 2022 are shown in Table 1.
Table 1 Region representative gray values and screening weights
S205, selecting a time and processing the image according to the screening weight: reset to 0 the grid points in each independent cloud region whose gray value is below the screening weight, keep the original pixel values of the remaining grid points, and splice all independent cloud regions of that time back into one image at their original positions to obtain the preprocessed FY-2 infrared cloud image. A total of 2174 preprocessed cloud images are obtained for 2017-2019 and stored in JPG format. FIG. 4(c) shows the preprocessing result for 12:00 UTC on 13 May 2022 as an example.
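As an illustration, the seasonal reference, screening weight and grid-point re-assignment of S204-S205 are sketched below in Python. Whether the maximum or the mean gray value of a region enters the weight formula is not fully fixed by the text; the maximum is assumed here, and the function names are assumptions of this sketch.

import numpy as np

def seasonal_reference(month):
    if month in (12, 1, 2):
        return 0.8
    if month in (3, 4, 11):
        return 0.7
    if month in (5, 6, 9, 10):
        return 0.6
    return 0.5                                          # July-August

def screening_weight(g, ss):
    if g > 230 or g <= 130:
        return g * (ss + 0.1)
    if g > 205 or g <= 155:
        return g * (ss + 0.05)
    return g * ss                                       # 155 < g <= 205

def preprocess_cloud_image(gray, labels, month):
    # gray: gray-scale cloud image; labels: output of the region labelling in S203
    ss = seasonal_reference(month)
    out = np.zeros_like(gray)
    for region_id in range(1, labels.max() + 1):
        mask = labels == region_id
        g = float(gray[mask].max())                     # representative gray value (assumed: maximum)
        w = screening_weight(g, ss)
        keep = mask & (gray >= w)                       # points below the screening weight become 0
        out[keep] = gray[keep]                          # the rest keep their original pixel values
    return out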
S3, as shown in FIG. 5, Python's OpenCV library is used to read the preprocessed cloud image, the meteorological element map (the sea level pressure map) and the 850 hPa temperature advection map as gray-scale images; these are placed in the R, G and B channels of an RGB image respectively and stacked into an RGB image data set with size [224, 480], stored in JPG format. Each RGB image is named by its date and hour (for example, the image for 12:00 UTC on 13 May 2022 is named "2022051312.jpg"). A total of 2174 RGB images are obtained for 2017-2019.
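As an illustration, the channel stacking of S3 is sketched below with OpenCV. The folder holding the preprocessed cloud images ("CloudImagePre") and the output folder ("RGB") are assumptions of this sketch; note that OpenCV stores images in B, G, R order, so the advection map is listed first in the merge call to end up in the blue channel.

import os
import cv2

def make_rgb(time_tag):
    cloud = cv2.imread(f"CloudImagePre/{time_tag}.jpg", cv2.IMREAD_GRAYSCALE)   # -> R channel
    slp = cv2.imread(f"slp/{time_tag}.jpg", cv2.IMREAD_GRAYSCALE)               # -> G channel
    advt = cv2.imread(f"Advt/{time_tag}.jpg", cv2.IMREAD_GRAYSCALE)             # -> B channel
    os.makedirs("RGB", exist_ok=True)
    cv2.imwrite(f"RGB/{time_tag}.jpg", cv2.merge([advt, slp, cloud]))           # BGR order for imwrite

# e.g. make_rgb("2022051312") produces RGB/2022051312.jpg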
S4, constructing the cold front label set from the meteorological data and 850 hPa temperature advection of step S1 and the preprocessed cloud image of step S2, as shown in FIGS. 6 and 7. The specific contents are as follows:
S401, using the ERA-5 sea level pressure field, 10 m surface wind field, and 850 hPa temperature and wind fields, Python's Matplotlib library is used to draw the 2174 cold front analysis element maps for 2017-2019 corresponding to the RGB images; each map contains the 850 hPa temperature advection, the 850 hPa temperature, the mean sea level pressure and the 10 m surface wind field. In the example for 12:00 UTC on 11 April 2018 shown in FIG. 6(a), the shading indicates 850 hPa cold advection, the arrows indicate the 10 m surface wind field, the solid contours indicate the sea level pressure field, and the dashed contours indicate the 850 hPa temperature field.
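As an illustration, a cold front analysis element map of this kind can be drawn with Matplotlib as sketched below; the array names, contour intervals, arrow-thinning step and the assumption that sea level pressure is supplied in Pa are all choices of this sketch rather than values from the text.

import numpy as np
import matplotlib.pyplot as plt

def plot_analysis_map(lon, lat, advt850, u10, v10, slp, t850, out_path):
    fig, ax = plt.subplots(figsize=(9.6, 4.48))
    cold = np.where(advt850 < 0, advt850, np.nan)                                 # shade cold advection only
    ax.contourf(lon, lat, cold, cmap="Blues_r")
    ax.contour(lon, lat, slp / 100.0, colors="k", linewidths=1.0)                 # sea level pressure (hPa), solid
    ax.contour(lon, lat, t850, colors="k", linestyles="dashed", linewidths=0.8)   # 850 hPa temperature, dashed
    step = 8                                                                      # thin the 10 m wind arrows
    ax.quiver(lon[::step], lat[::step], u10[::step, ::step], v10[::step, ::step])
    fig.savefig(out_path, dpi=100, bbox_inches="tight")
    plt.close(fig)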
S402, in the cold front analysis element map and the preprocessed cloud image, find the grid points that correspond to a cold front cloud system aloft, lie ahead of the 850 hPa cold advection area, coincide with the maximum cyclonic curvature of the isobars on the surface map, show the 10 m surface wind turning from northerly to southerly, and run roughly parallel to the 850 hPa isotherms; mark them as cold front grid points and connect them into independent, complete thick cold front lines in the color "#00FF00". A total of 2174 figures with drawn thick cold front lines are obtained for 2017-2019. In the example for 12:00 UTC on 11 April 2018 shown in FIG. 6(b), the black bold lines are the four thick cold fronts drawn.
S403, using Python's OpenCV library, read the R, G and B channels of the pictures, locate the grid points whose three channel values are 0, 255 and 0, place them in a picture of size [224, 480], set the cold front grid points to 255 and the blank part to 0, and compose one image. A total of 2174 black-and-white binary images of thick cold front lines are obtained for 2017-2019. FIG. 6(c) shows, for 12:00 UTC on 11 April 2018, the black-and-white binary image containing the four thick cold fronts at this time.
S404, read the black-and-white binary images, traverse all grid points, treat each connected region as one independent front line, extract the position of each independent thick cold front line, and store each independent front line separately in a picture of size [224, 480] at its corresponding position, with cold front grid points set to 255 and all other grid points set to 0; the same placement is used for every front line. Each independent cold front line picture at a given time is named in the form "year-month-day-hour_type_serial number", with the serial number starting from 0, and saved in PNG format. This yields 6671 black-and-white binary images of independent thick cold front lines for 2017-2019, i.e. the cold front label set. As shown in FIG. 7, for 12:00 UTC on 11 April 2018 there are four black-and-white binary images of independent thick cold fronts, and the four PNG pictures are named "2018041112_coldfront_0.png", "2018041112_coldfront_1.png", "2018041112_coldfront_2.png" and "2018041112_coldfront_3.png" respectively.
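As an illustration, the extraction of the green "#00FF00" strokes and their splitting into one PNG per front (S403-S404) could look like the Python sketch below; the exact equality test on the green pixels, the folder names and the use of scipy.ndimage.label are assumptions of this sketch.

import os
import cv2
import numpy as np
from scipy import ndimage

def split_front_labels(drawn_png, time_tag, out_dir="labels"):
    os.makedirs(out_dir, exist_ok=True)
    img = cv2.imread(drawn_png)                                   # OpenCV reads as B, G, R
    b, g, r = cv2.split(img)
    mask = (r == 0) & (g == 255) & (b == 0)                       # pure green front strokes
    labels, n = ndimage.label(mask, structure=np.ones((3, 3)))    # one label per connected front line
    for k in range(1, n + 1):
        single = np.where(labels == k, 255, 0).astype(np.uint8)   # one independent front per picture
        cv2.imwrite(f"{out_dir}/{time_tag}_coldfront_{k - 1}.png", single)

# e.g. split_front_labels("drawn/2018041112.jpg", "2018041112") writes
# labels/2018041112_coldfront_0.png ... labels/2018041112_coldfront_3.png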
S5, generating a cold front identification data set by utilizing the RGB image data set in the step S3 and the cold front label set in the step S4, inputting the data set into a DETR model for training and testing, and obtaining a cold front identification result, wherein the specific contents are as follows:
S501, the 2174 JPG-format RGB images for 2017-2019 are placed in a "train2017" folder, and 20% of the pictures are randomly drawn from "train2017" with Python's random library and placed in a "val2017" folder, generating the training and validation sets of the COCO-format data set. The corresponding JSON data description files "instances_train2017.json" and "instances_val2017.json" are generated from the pictures in the training and validation sets with Python's pycococreatortools library and placed in an "annotations" folder, with the data set class set to the single class "coldfront". The "train2017", "val2017" and "annotations" folders are combined into the COCO-format cold front data set.
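As an illustration, the random 20% draw from "train2017" into "val2017" could be done as in the sketch below; the fixed seed and the use of shutil.move are assumptions of this sketch, and building "instances_train2017.json" / "instances_val2017.json" with pycococreatortools then follows that library's documented usage.

import os
import random
import shutil

def split_coco_images(root="coco", val_frac=0.2, seed=0):
    train_dir = os.path.join(root, "train2017")
    val_dir = os.path.join(root, "val2017")
    os.makedirs(val_dir, exist_ok=True)
    names = sorted(f for f in os.listdir(train_dir) if f.endswith(".jpg"))
    val_names = random.Random(seed).sample(names, int(len(names) * val_frac))   # 20% to validation
    for name in val_names:
        shutil.move(os.path.join(train_dir, name), os.path.join(val_dir, name))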
S502, distributed training of the DETR model (https://github.com/facebookresearch/detr) is performed on three GPUs with 12 GB of memory each, using ResNet pre-trained weights based on a COCO-format data set, with the following training parameters: the object detection stage sets num_classes=2, batch_size=2, lr_drop=5, lr=1e-4, weight_decay=1e-4, dropout=0.1 and generates prediction boxes where a cold front may exist; the segmentation stage sets the frozen weights --frozen_weights weights/checkpoint0039.pth and adds a mask head to perform the segmentation task.
S503, the RGB images are drawn, the transforms are set to T.Resize(800) and T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]), the model is built with pretrained=False, num_classes=2, return_postprocessor=True, and the 190th-epoch weights checkpoint0189.pth are selected for testing, giving the 2005-2022 satellite cloud-image cold front data set.
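As an illustration, a minimal inference sketch for the testing step is given below. It assumes a DETR model object built from the cloned repository with the settings above and loaded from checkpoint0189.pth; the 0.7 confidence threshold and the helper name detect_cold_front are assumptions of this sketch, not values from the text.

import torch
import torchvision.transforms as T
from PIL import Image

transform = T.Compose([
    T.Resize(800),
    T.ToTensor(),
    T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def detect_cold_front(model, image_path, device="cuda", threshold=0.7):
    model.eval().to(device)
    img = Image.open(image_path).convert("RGB")
    x = transform(img).unsqueeze(0).to(device)
    with torch.no_grad():
        out = model(x)                                   # DETR returns pred_logits and pred_boxes
    probs = out["pred_logits"].softmax(-1)[0, :, :-1]    # drop the "no object" column
    keep = probs.max(-1).values > threshold              # queries likely to contain a cold front
    return out["pred_boxes"][0, keep].cpu(), probs[keep].cpu()

# state = torch.load("weights/checkpoint0189.pth", map_location="cpu")
# model.load_state_dict(state["model"])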
FIG. 8 compares the cold front identified at 12:00 UTC on 18 September 2017 (black line) with the cold front label set (gray line); the cold front identified by the invention corresponds to the labelled cold front in both line position and morphology.
FIG. 9 shows the cold front test results (black lines) for 00:00 UTC on 6 May 2022 and 00:00 UTC on 18 April 2021. FIGS. 9(a) and 9(b) are superimposed on the FY-2 infrared cloud images; in FIGS. 9(c) and 9(d) the shading represents 850 hPa cold advection, the arrows represent the 10 m surface wind field, the solid contours represent the mean sea level pressure field, and the dashed contours represent the 850 hPa temperature field. The cold fronts identified by the invention all correspond to the cloud systems, lie ahead of the 850 hPa cold advection areas and are roughly parallel to the 850 hPa isotherms; the maximum cyclonic curvature of the isobars appears on the surface map, and across the fronts the 10 m surface wind turns from northerly to southerly. FIGS. 8 and 9 show that the deep learning method for identifying cold fronts from satellite cloud images provided by the invention can accurately identify the position and shape of cold fronts.
The embodiment of the invention also provides a deep learning system for identifying cold fronts from satellite cloud images, which comprises a data acquisition module, an image preprocessing module, an RGB image data set acquisition module, a cold front label set construction module, a cold front identification module, and a computer program capable of running on a processor. Each module of the system corresponds to a specific step of the method provided by the embodiment of the invention and carries the corresponding functions and beneficial effects of executing that method. Technical details not described in this embodiment may be found in the method provided by the embodiments of the invention.
The embodiment of the invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor. When the processor executes the computer program, it carries out the specific steps of the method provided by the embodiment of the invention, with the corresponding functional modules and beneficial effects. Technical details not described in this embodiment may be found in the method provided by the embodiments of the invention.
The embodiment of the invention also provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, it carries out the specific steps of the method provided by the embodiment of the invention, with the corresponding functional modules and beneficial effects. Technical details not described in this embodiment may be found in the method provided by the embodiments of the invention.
The foregoing describes in detail preferred embodiments of the present invention. It should be understood that numerous modifications and variations can be made in accordance with the concepts of the invention by one of ordinary skill in the art without undue burden. Therefore, all technical solutions which can be obtained by logic analysis, reasoning or limited experiments based on the prior art by the person skilled in the art according to the inventive concept shall be within the scope of protection defined by the claims.

Claims (5)

1. A deep learning method for identifying cold front using satellite cloud images, comprising:
S1, acquiring meteorological data, calculating the 850 hPa temperature advection, and drawing a satellite infrared cloud image and meteorological element maps, specifically:
S101, selecting the sea level pressure field, the 10 m surface wind field, and the 850 hPa temperature and wind fields, setting the spatial resolution, downloading ERA-5 data and satellite full-disk nominal image files, selecting the region 40°-160°E, 10°-66°N, and calculating the 850 hPa temperature advection with the formula:
-V·∇T = -(u·∂T/∂x + v·∂T/∂y)
where u denotes the 850 hPa zonal wind, v denotes the 850 hPa meridional wind, V is the resultant wind of u and v, x denotes the zonal direction, y denotes the meridional direction, T denotes the smoothed 850 hPa temperature, and ∇T denotes the gradient of T;
S102, drawing a satellite infrared cloud image, a sea level pressure map and an 850 hPa temperature advection map with a single colormap;
S2, preprocessing the satellite infrared cloud image, specifically:
S201, dividing the satellite infrared cloud images into a training set and a validation set at a set ratio, producing cloud-region labels, generating the corresponding annotation files and a file describing the number of object classes in the label set, setting the number of classes to one, and obtaining an infrared cloud-image cloud-region data set;
S202, setting the data set classes, loss function, batch size and learning rate, training a U-Net model from pre-trained weights until the loss function curve converges, then stopping and saving the weight file, and inputting the infrared cloud-image cloud-region data set of step S201 into the trained U-Net model to obtain a preliminary segmentation of the cloud image;
S203, reading the preliminary segmentation map and labelling each independent cloud region in it; an independent cloud region is determined as follows:
traversing each grid point in the preliminary segmentation map; if a neighbouring grid point also has a non-zero gray value, judging the two points to be connected and marking them with the same value; continuing to expand until all non-zero grid points around them are marked; the connected region whose interior grid points all carry the same value is an independent cloud region;
S204, reading the gray-scale satellite cloud image and the cloud-region block file, selecting a time, reading the position of each independent cloud region, mapping it to the grid points of the gray-scale cloud image, extracting the maximum gray value and the mean gray value of each independent cloud region as the region's representative gray values, and obtaining the screening weight from the numerical range of the gray value and its seasonal reference;
S205, selecting a time, processing the image according to the screening weight, resetting to 0 the grid points in each independent cloud region whose gray value is below the screening weight, keeping the original pixel values of the remaining grid points, and placing all independent cloud regions of that time back into one image at their original positions to obtain the preprocessed FY-2 satellite infrared cloud image;
S3, reading the preprocessed cloud image, the meteorological element map and the 850 hPa temperature advection map as gray-scale images, placing them in the R, G and B channels of an RGB image respectively, and stacking them into an RGB image data set;
S4, constructing a cold front label set from the meteorological data and 850 hPa temperature advection of step S1 and the preprocessed cloud image of step S2, specifically:
S401, drawing, from the ERA-5 sea level pressure field, 10 m surface wind field, and 850 hPa temperature and wind field data, a cold front analysis element map for each RGB image time, containing the 850 hPa temperature advection, the 850 hPa temperature, the mean sea level pressure and the 10 m surface wind field;
S402, in the cold front analysis element map and the preprocessed cloud image, finding the grid points that correspond to the cold front cloud system, lie ahead of the 850 hPa cold advection area, coincide with the maximum cyclonic curvature of the isobars on the surface map, show the 10 m surface wind turning from northerly to southerly, and are approximately parallel to the 850 hPa isotherms, marking them as cold front grid points, and connecting them into an independent, complete thick cold front line;
S403, extracting the thick cold front lines, setting the grid points identified as cold front to white and the other grid points to black, and obtaining a black-and-white binary image;
S404, reading the black-and-white binary images, traversing all grid points, treating each connected region as one independent front line, extracting the positions of the independent thick cold front lines, storing each independent front line separately in a picture at its corresponding position, processing the pictures into black-and-white binary images, and obtaining the cold front label pictures, thereby obtaining the cold front label set;
S5, generating a cold front identification data set by utilizing the RGB image data set in the step S3 and the cold front label set in the step S4, inputting the data set into the DETR model for training and testing, and obtaining a cold front identification result.
2. The deep learning method for cold front identification using satellite cloud image as claimed in claim 1, wherein in step S5, obtaining cold front identification result includes the following sub-steps:
S501, randomly splitting the RGB image data set of step S3 into a training set and a validation set at a set ratio, generating corresponding annotation JSON files from the cold front label set, and combining the training set, validation set and annotation files to obtain a cold front identification data set;
S502, inputting the cold front identification data set into the DETR model, setting the data set classes, loss function, batch size, learning rate and weight decay, training on a GPU from pre-trained weights, generating prediction boxes where a cold front may exist, and completing object detection; setting frozen weights, adding a mask head on top of the detection output, and continuing training to complete the segmentation;
S503, training the DETR model until the loss function curve converges, stopping, saving the weights, setting the parameter weights, and inputting the RGB images generated for the selected times into the trained DETR model to obtain the cold front identification results.
3. A system for applying the deep learning method for identifying cold fronts using satellite cloud images as set forth in claim 1, comprising:
a data acquisition module, used for acquiring meteorological data, calculating the 850 hPa temperature advection, and drawing satellite infrared cloud images and meteorological element maps;
an image preprocessing module, used for preprocessing the satellite infrared cloud images from the data acquisition module;
an RGB image data set acquisition module, used for reading the preprocessed satellite infrared cloud image, the meteorological element map and the 850 hPa temperature advection map as gray-scale images, placing them in the R, G and B channels of an RGB image respectively, and stacking them into an RGB image data set;
a cold front label set construction module, used for constructing the cold front label set from the meteorological data and 850 hPa temperature advection in the data acquisition module and the preprocessed satellite infrared cloud image in the image preprocessing module;
a cold front identification module, used for inputting the RGB image data set from the RGB image data set acquisition module and the cold front label set from the cold front label set construction module into the DETR model for training and testing, to obtain the cold front identification results.
4. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method of any one of claims 1 to 2 when the computer program is executed by the processor.
5. A computer-readable storage medium, having stored thereon a computer program, characterized in that the computer program, when executed by a processor, performs the method of any of claims 1 to 2.
CN202410256898.2A 2024-03-07 2024-03-07 Deep learning method and system for identifying cold front by using satellite cloud image Active CN117853949B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410256898.2A CN117853949B (en) 2024-03-07 2024-03-07 Deep learning method and system for identifying cold front by using satellite cloud image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410256898.2A CN117853949B (en) 2024-03-07 2024-03-07 Deep learning method and system for identifying cold front by using satellite cloud image

Publications (2)

Publication Number Publication Date
CN117853949A CN117853949A (en) 2024-04-09
CN117853949B (en) 2024-05-14

Family

ID=90548226

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410256898.2A Active CN117853949B (en) 2024-03-07 2024-03-07 Deep learning method and system for identifying cold front by using satellite cloud image

Country Status (1)

Country Link
CN (1) CN117853949B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106950612A (en) * 2017-03-14 2017-07-14 天津大学 It is a kind of to be used for automatic identification and the method for drawing cold front in meteorology
CN111414991A (en) * 2020-02-21 2020-07-14 中国人民解放军国防科技大学 Meteorological frontal surface automatic identification method based on multivariate regression
CN112765832A (en) * 2021-02-02 2021-05-07 南京信息工程大学 Automatic identification and correction method for continental europe
CN114565056A (en) * 2022-03-15 2022-05-31 中科三清科技有限公司 Machine learning-based cold-front identification method and device, storage medium and terminal
CN115082791A (en) * 2022-06-22 2022-09-20 中国人民解放军国防科技大学 Meteorological frontal surface automatic identification method based on depth separable convolutional network
CN115878731A (en) * 2022-11-17 2023-03-31 南京信息工程大学 Automatic warm-spike identification method
CN116030401A (en) * 2023-03-28 2023-04-28 南京信息工程大学 Deep learning-based European and Asian region cold front automatic identification method
CN116577844A (en) * 2023-03-28 2023-08-11 南京信息工程大学 Automatic east Asia cold front precipitation identification method and system
CN116681959A (en) * 2023-06-09 2023-09-01 中科三清科技有限公司 Machine learning-based frontal line identification method and device, storage medium and terminal
WO2024021225A1 (en) * 2022-07-29 2024-02-01 知天(珠海横琴)气象科技有限公司 High-resolution true-color visible light model generation method, high-resolution true-color visible light model inversion method, and system


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A Mediterranean cold front identification scheme combining wind and thermal criteria;Evangelia Bitsa 等;《International Journal of Climatology》;20210517;6497-6510 *
A method for identifying warm fronts in Eurasia;Yujing Qin 等;《Research Square》;20231004;1-33 *
Identification of four types of cloud systems in GMS cloud imagery based on an artificial neural network; Bai Huiqing et al.; Journal of Applied Meteorological Science; 1998-11-30; Vol. 9, No. 4; 402-409 *
Characteristics of cold-front activity over the Eurasian continent during the winter half-year in the past three decades based on objective identification; Feng Mengru; China Masters' Theses Full-text Database, Basic Sciences; 2022-01-15; A009-140 *

Also Published As

Publication number Publication date
CN117853949A (en) 2024-04-09

Similar Documents

Publication Publication Date Title
CN109598241B (en) Satellite image marine ship identification method based on Faster R-CNN
CN111985376A (en) Remote sensing image ship contour extraction method based on deep learning
CN106909902B (en) Remote sensing target detection method based on improved hierarchical significant model
CN112084869B (en) Compact quadrilateral representation-based building target detection method
Huang et al. Mapping urban areas in China using multisource data with a novel ensemble SVM method
Hormese et al. Automated road extraction from high resolution satellite images
CN111241970A (en) SAR image sea surface ship detection method based on yolov3 algorithm and sliding window strategy
CN111368766A (en) Cattle face detection and identification method based on deep learning
CN107679476A (en) A kind of Sea Ice Types Classification in Remote Sensing Image method
CN110569797A (en) earth stationary orbit satellite image forest fire detection method, system and storage medium thereof
CN112700489B (en) Ship-based video image sea ice thickness measurement method and system based on deep learning
CN114022408A (en) Remote sensing image cloud detection method based on multi-scale convolution neural network
CN111738113A (en) Road extraction method of high-resolution remote sensing image based on double-attention machine system and semantic constraint
CN114898097B (en) Image recognition method and system
CN116416626B (en) Method, device, equipment and storage medium for acquiring circular seal data
CN114140665A (en) Dense small target detection method based on improved YOLOv5
CN113486819A (en) Ship target detection method based on YOLOv4 algorithm
CN114676773A (en) Arctic sea ice classification method based on SAR data
CN106803078A (en) A kind of SAR image Ship Target dividing method
CN114519819B (en) Remote sensing image target detection method based on global context awareness
CN112084860A (en) Target object detection method and device and thermal power plant detection method and device
CN111723814A (en) Cross-image association based weak supervision image semantic segmentation method, system and device
CN109657728B (en) Sample production method and model training method
CN117853949B (en) Deep learning method and system for identifying cold front by using satellite cloud image
CN113298042A (en) Method and device for processing remote sensing image data, storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant