CN116703126A - Drainage prediction method and system based on natural drainage partition - Google Patents

Drainage prediction method and system based on natural drainage partition

Info

Publication number
CN116703126A
Authority
CN
China
Prior art keywords: drainage, sub, output, prediction, time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310954996.9A
Other languages
Chinese (zh)
Other versions
CN116703126B (en)
Inventor
刘韶鹏
吴连奎
肖捷
王浩帅
Current Assignee
Beijing Yunlu Technology Co Ltd
Original Assignee
Beijing Yunlu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Yunlu Technology Co Ltd filed Critical Beijing Yunlu Technology Co Ltd
Priority to CN202310954996.9A priority Critical patent/CN116703126B/en
Publication of CN116703126A publication Critical patent/CN116703126A/en
Application granted granted Critical
Publication of CN116703126B publication Critical patent/CN116703126B/en
Legal status: Active

Classifications

    • G06Q10/06315: Needs-based resource requirements planning or analysis (under G06Q10/0631, resource planning, allocation, distributing or scheduling for enterprises or organisations)
    • G06Q10/04: Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q50/26: Government or public services
    • Y02A10/40: Controlling or monitoring, e.g. of flood or hurricane; forecasting, e.g. risk assessment or mapping (technologies for adaptation to climate change at coastal zones and river basins)

Landscapes

  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Marketing (AREA)
  • General Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Game Theory and Decision Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application relates to a drainage prediction method and system based on natural drainage partitioning, wherein the drainage prediction method comprises the following steps: based on the distribution of the natural drainage partitions, recursively dividing the target area step by step into M sub-drainage partitions, such that no water is transferred between any two different sub-drainage partitions; acquiring basic data for the M sub-drainage partitions and generating M corresponding input files for the SWMM model; using an MPI parallel application to configure a plurality of processor cores according to the number of input files, so that each input file is bound to a corresponding processor core; and, based on the M input files, invoking the corresponding processor cores via the MPI parallel application to perform the parallel drainage prediction analysis of the SWMM model, wherein the running time of each processor core on its input file is less than a threshold time, to obtain a drainage prediction result. The drainage prediction method shortens the running time of drainage prediction and can also be used for real-time drainage prediction analysis.

Description

Drainage prediction method and system based on natural drainage partition
Technical Field
The application relates to the technical field of drainage prediction and analysis, and in particular to a drainage prediction method and system based on natural drainage partitioning.
Background
SWMM (Storm Water Management Model) is widely used worldwide for planning, analysis, and design related to stormwater runoff, combined and sanitary sewer networks, and other drainage systems. It can be used to evaluate grey-infrastructure stormwater control strategies, such as pipes and gutters, and is a useful tool for creating cost-effective hybrid green/grey stormwater control solutions. SWMM is intended to help support local, state, and national stormwater management goals: reducing runoff through infiltration and retention, and helping to reduce discharges that damage water bodies. As a mature stormwater management model, SWMM has been integrated into a large number of commercial software packages as the core of drainage pipe network computation.
The current SWMM model mainly runs on a single machine; to improve computing performance and efficiency, it uses OpenMP (Open Multi-Processing) parallelism so as to make the fullest possible use of single-machine computing resources. Taking a local 12-thread computer as an example, once the thread count exceeds 8, computational efficiency stays essentially flat, or even drops, as more threads are added; and when computing the drainage of large-scale and super-large-scale pipe network models, the computation is time-consuming and inefficient. Where high computation speed is required, and especially where the drainage capacity of the pipe network must be computed in real time, the computational efficiency of the SWMM model falls far short of the requirements.
Disclosure of Invention
The present application has been made to solve the above-mentioned problems in the prior art.
The drainage prediction method and system based on natural drainage partitioning can improve the computational efficiency of the SWMM model and shorten its computation time; the time consumed remains low even when computing the drainage of large-scale and very-large-scale pipe network models, so that the drainage capacity of the pipe network can be computed in real time.
According to a first aspect of the present application, there is provided a drainage prediction method based on natural drainage partitioning, comprising the following steps. Based on the distribution of natural drainage partitions, the target area is recursively divided step by step into M sub-drainage partitions, such that no water is transferred between any two different sub-drainage partitions. Basic data for the M sub-drainage partitions are acquired, and M input files for the SWMM model are generated from them. Using an MPI (Message Passing Interface) parallel application, a plurality of processor cores are configured according to the number of input files, so that each input file is bound to a corresponding processor core. Based on the M input files, the corresponding processor cores are invoked via the MPI parallel application to perform the parallel drainage prediction analysis of the SWMM model, wherein the running time of each processor core on its input file is less than a threshold time, to obtain a drainage prediction result.
According to a second aspect of the present application, there is provided a drainage prediction system based on natural drainage partitioning. The drainage prediction system includes an interface and a processor. The interface is configured to receive the basic data of the M sub-drainage partitions. The processor is configured to perform the drainage prediction method based on natural drainage partitioning according to the various embodiments of the present application.
According to the drainage prediction method and system based on natural drainage partitioning, the target area is divided into M sub-drainage partitions by stepwise recursive division, the basic data of the M sub-drainage partitions are acquired separately, and M input files for the SWMM model are generated from them. Because no water is transferred between the M sub-drainage partitions, they are mutually independent, so each processor core can compute its input file independently. Binding the M input files to processor cores with an MPI parallel application lets the SWMM model significantly shorten the computation time of the drainage prediction process and improve computational efficiency, which is especially valuable when the target area contains a large-scale or ultra-large-scale pipe network model. The target area can be divided recursively to whatever depth the required running time demands; the running time can be brought down to intervals of tens of seconds, which makes it feasible to compute the drainage capacity of the pipe network in real time.
Drawings
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. The accompanying drawings illustrate various embodiments by way of example in general and not by way of limitation, and together with the description and claims serve to explain the disclosed embodiments. Such embodiments are illustrative and not intended to be exhaustive or exclusive of the present apparatus or method.
FIG. 1 illustrates a flow chart of a natural drain partition based drain prediction method according to an embodiment of the present application;
FIG. 2 shows a schematic diagram of M input files independent of each other according to an embodiment of the application;
FIG. 3 is a schematic diagram showing M input files respectively binding with corresponding processor cores according to an embodiment of the present application;
FIG. 4 illustrates a flow chart of a drainage prediction method based on natural drainage partition according to another embodiment of the present application;
FIG. 5 shows a schematic diagram of a progressive recursive partitioning of a target area according to an embodiment of the present application; and
fig. 6 illustrates a block diagram of a drainage prediction system based on natural drainage partition according to an embodiment of the present application.
Detailed Description
The present application will be described in detail below with reference to the drawings and detailed description to enable those skilled in the art to better understand the technical scheme of the present application. Embodiments of the present application will be described in further detail below with reference to the drawings and specific examples, but not by way of limitation.
The terms "first," "second," and the like, as used herein, do not denote any order, quantity, or importance, but are used only to distinguish one element from another. The word "comprising" or "comprises" means that the elements preceding the word encompass the elements recited after it, without excluding the possibility of also encompassing other elements. The expression "A or more" as used in the present application includes A and any number larger than A, while "less than B" excludes B and includes only numbers smaller than B. The order in which the steps of the methods described in connection with the figures are performed is not intended to be limiting: as long as the logical relationship between the steps is not affected, several steps may be merged into a single step, a single step may be decomposed into multiple steps, or the execution order of the steps may be exchanged according to specific requirements.
Fig. 1 illustrates a flowchart of a drainage prediction method based on natural drainage partitioning according to an embodiment of the present application. In step 101, based on the distribution of the natural drainage partitions, the target area is recursively divided step by step into M sub-drainage partitions, such that no water is transferred between any two different sub-drainage partitions. An urban drainage system collects, transports, and discharges rainwater. The natural drainage partitions are the different drainage areas obtained by dividing the urban drainage system along natural geographic features, with no water transfer between any two different areas. Dividing the target area recursively according to this distribution splits the drainage system of the target area into smaller sub-drainage partitions between which no water is transferred. Sub-drainage partitions at different levels can be obtained as required; for example, the division can go down to the level of residential communities or of individual buildings. If two sub-drainage partitions are building A and building B, then even if the two buildings are separated by only one street, they can be treated as different sub-drainage partitions, because each has its own independent drainage pipe network and no water flows between them. In this way each input file can be processed by its own processor core, which improves processing speed and allows the drainage prediction results of the sub-drainage partitions or drainage facilities of interest to the user to be output.
In step 102, the basic data of the M sub-drainage partitions are acquired, and M input files for the SWMM model are generated from them. The basic data of a sub-drainage partition comprise the parameters of the drainage facilities within it. In some embodiments, the basic data include geometric parameters and physical attribute parameters of the drainage facilities of the M sub-drainage partitions; the drainage facilities may include inspection wells (manholes), pipes, nodes (branch points or junctions of pipes), sub-catchments, and so on. The geometric parameters may include pipe cross-section dimensions, invert elevations, and the like, and the physical attribute parameters include, for example, viscosity, water head, and/or water depth. Fig. 2 shows a schematic diagram of M mutually independent input files according to an embodiment of the application. In the example shown in Fig. 2, the input files of the SWMM model use the file format with the .inp suffix, and the M input files are 0.inp, 1.inp, 2.inp, ..., M-1.inp, respectively. The acquired basic data are written into each input file according to the SWMM input file format.
In step 103, using the MPI parallel application, a plurality of processor cores are configured according to the number of input files, so that each input file is bound to a corresponding processor core. Fig. 3 is a schematic diagram showing the binding of M input files to their corresponding processor cores according to an embodiment of the application. In Fig. 3, taking 108 input files as an example, the MPI parallel application uses the processor cores of 27 computing nodes (where one computing node may represent a server) for the computation, with the 108 input files distributed across the 27 computing nodes. Specifically, input files 0.inp, 1.inp, 2.inp, and 3.inp may be bound one-to-one to processor cores 1 through 4 of computing node 1; input files 4.inp, 5.inp, 6.inp, and 7.inp may be bound one-to-one to processor cores 1 through 4 of computing node 2; and so on, until input files 104.inp, 105.inp, 106.inp, and 107.inp are bound one-to-one to processor cores 1 through 4 of computing node 27. Thus, when the SWMM model runs, each processor core computes its own input file, each core's computation is short, and the overall running time of the drainage prediction is reduced.
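A minimal sketch of the file-to-core mapping described above; the 1-based node and core numbering simply mirrors the description, and a real deployment would express this binding through the MPI launcher's process-placement options rather than application code.

```python
def bind_inputs_to_cores(num_files, cores_per_node):
    """Map input file i.inp to a (compute node, core) pair the way the
    text describes: files are assigned to cores in order, with
    `cores_per_node` files per node. Returns a dict
    {filename: (node_index, core_index)}, both 1-based to match the
    description of Fig. 3."""
    binding = {}
    for rank in range(num_files):
        node = rank // cores_per_node + 1  # which computing node (server)
        core = rank % cores_per_node + 1   # which core within that node
        binding[f"{rank}.inp"] = (node, core)
    return binding
```

With 108 files and 4 cores per node this reproduces the 27-node layout of Fig. 3.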
In some embodiments, the content of an input file is organized into sections, each of which begins with a keyword. The keywords include, for example: [TITLE] (title), [OPTIONS] (option information), [REPORT] (output information), [JUNCTIONS] (manholes), [CONDUITS] (pipes), [SUBCATCHMENTS] (sub-catchment areas), and so on. Keywords are case-insensitive. The sections may appear in any order in the input file, and not all sections need appear. Each section may contain one or more lines of data, and blank lines may appear anywhere in the file. A line beginning with a semicolon (;) is a comment rather than data, and a data item may appear in any column of its line. In this way the SWMM model can recognize and compute the input file.
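A small illustrative parser for this section layout, assuming only the conventions stated above (bracketed case-insensitive keywords, ';' comment lines, blank lines anywhere); it is a sketch for clarity, not the SWMM engine's own reader.

```python
def parse_inp_sections(text):
    """Split SWMM .inp content into {SECTION: [data lines]}. Section
    headers are bracketed keywords (case-insensitive), a line starting
    with ';' is a comment, and blank lines may appear anywhere."""
    sections, current = {}, None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith(";"):
            continue  # blank lines and comments are ignored
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1].upper()  # keywords are case-insensitive
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    return sections
```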
In step 104, based on the M input files, the corresponding processor cores are invoked via the MPI parallel application to perform the parallel drainage prediction analysis of the SWMM model, wherein the running time of each processor core on its input file is less than a threshold time, to obtain a drainage prediction result. Compared with single-machine computation, the time consumed is greatly reduced. Taking the number of pipes as an example: if the whole area contains on the order of ten thousand pipes and is divided into 10 partitions, each partition contains about 1,000 pipes; single-machine computation then takes about 8-10 minutes, whereas the present drainage prediction method takes about 30 s. The drainage prediction method of the embodiments of the application can therefore shorten the computation time of drainage prediction analysis for large-scale or ultra-large-scale areas. Moreover, with such short running times (e.g., intervals of tens of seconds), the drainage prediction results can be output in real time, which helps compute the drainage capacity of the pipe network in real time.
Different depths of division of the target area into sub-drainage partitions give different running times; the division depth can therefore be chosen according to the required running time, and the threshold time can be set according to the same requirement.
In some embodiments, the MPI parallel application is initialized, and after compilation the transfer parameters are supplied on the command line at run time to specify the number of processor cores and pass it into the MPI parallel application. This allows the MPI parallel application to run with a number of processor cores corresponding to the number of input files.
In some embodiments, the binding of each input file to its corresponding processor core specifically includes: numbering the processes assigned to the plurality of processor cores so that the M input files correspond one-to-one to the process numbers. For each time step, each processor core reads the input file of its process number and processes that input file with the process bearing that number, to obtain the drainage prediction result.
The time step may be the simulation computation time of the SWMM model; for example, if the simulation computation time is 10 s, the SWMM model produces a drainage prediction result every 10 s. The simulation computation time can be set in accordance with computational stability. For example, the process number corresponding to 0.inp is rank = 0, the process number corresponding to 1.inp is rank = 1, and so on, each input file corresponding one-to-one to a process number; when the SWMM model runs, the rank = 0 process computes input file 0.inp and the rank = 1 process computes input file 1.inp. Through this prior binding of process numbers, each input file is computed by its own process, which speeds up the SWMM model and improves computational efficiency.
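The per-process behaviour can be sketched serially as follows; `run_partition` and `step_fn` are hypothetical stand-ins, and in the real MPI program the rank would come from the communicator (e.g., MPI_Comm_rank) rather than a function argument.

```python
def run_partition(rank, num_steps, step_fn):
    """Serial stand-in for what each MPI process does: the process with
    the given `rank` reads only its own input file `rank.inp` and
    advances it through `num_steps` time steps. `step_fn` stands in for
    one SWMM routing step and is a hypothetical callback here."""
    input_file = f"{rank}.inp"  # one-to-one file/process binding
    results = []
    for step in range(num_steps):
        results.append(step_fn(input_file, step))
    return input_file, results
```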
In some embodiments, the drainage prediction method further comprises the following steps: receiving a configuration file set by the user; parsing the configuration file to obtain the customized demand; and, according to the user's customized demand, obtaining the corresponding drainage prediction result through the interface functions of the SWMM dynamic link library. The user may specify pipe numbers, names, and the like in advance through the configuration file. The relevant parameters of the customized demand are obtained by parsing the configuration file, and with these parameters the interface functions of the SWMM dynamic link library can fetch the corresponding results during the computation. In addition, the results of the SWMM model run can be written to a binary file, and a report output file can also be generated; the interface functions of the SWMM dynamic link library can likewise fetch the corresponding results from the binary file. In this way, the data-processing work on the results can be arranged in advance through this customized-output process, reducing the user's subsequent tedious data screening in binary or report output files.
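As an illustration only, such a user configuration could be parsed as below; the 'object:name:parameter' line format is an assumption, since the patent does not fix a concrete file format.

```python
def parse_user_config(text):
    """Parse a hypothetical user configuration with one
    'object_type:name:parameter' entry per line ('#' starts a comment).
    Returns the customized demand as a list of
    (object_type, name, parameter) tuples, preserving file order,
    since results are later fetched in the order given in the file."""
    demand = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        obj_type, name, param = (field.strip() for field in line.split(":"))
        demand.append((obj_type, name, param))
    return demand
```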
In some embodiments, obtaining the drainage prediction result corresponding to the customized demand through the interface functions of the SWMM dynamic link library specifically includes the following steps.
The customized demand includes a first control condition comprising output objects and output parameters: the output objects include some or all of the manholes, pipes, nodes (branch points or junctions of pipes), and sub-catchment areas of the target area, and the output parameters are the quantities to be reported for those objects. The output objects and output parameters are matched to the process numbers. Where there are at least two output objects, the interface functions of the SWMM dynamic link library corresponding to the different output parameters are recombined, so that the drainage prediction results of the output parameters of all output objects included in the customized demand can be fetched.
Regarding the first control condition: the customized demand may include output parameters of different pipes or nodes of the same sub-drainage partition, or of different sub-drainage partitions, and the corresponding process numbers can be obtained from the identified pipe numbers or names, so that the interface functions can conveniently fetch the computation results of the corresponding processes. When there are several output parameters (for example, parameters A and B of a pipe, parameters C and D of a node, and so on), the interface functions for the different output parameters must all be called; they can be combined in the order of the requirements in the configuration file, and when the results are output each interface function is called in turn, yielding the drainage prediction results of the output parameters of all the corresponding output objects. The output results thus correspond exactly to the several output objects and output parameters designated by the user, sparing the user the tedious work of searching through the binary or report output files. Moreover, since the results are fetched directly during the computation, the step of writing them to an output file can be omitted, avoiding wasted storage space.
In some embodiments, the customized demand includes a second control condition comprising an output time and/or an output frequency: the output time is the time at which a result is output during the drainage prediction, and the output frequency is the number of times results are output during the drainage prediction, so that the SWMM model outputs the drainage prediction results based on the output time and/or the output frequency. In this way the drainage prediction results for the times designated by the user can be obtained. Taking one computation as an example, the start and end times might be 12:00:00 and 13:00:00, and the output times might be 12:05:00, 12:30:00, 12:32:00, and so on, which may be the important moments; results at other, undesignated times (e.g., 12:01:00) need not be output. The interval between an output time and the start time can be regarded as a time step: for the output time 12:05:00 the time step is 5 minutes, the result output at 12:05:00 is the drainage prediction for a 5-minute simulation, and since the running time of the drainage prediction may be about 30 s, the 12:05:00 output can indeed be selected. The output frequency may be the number of results output during one hour of computation; for example, 10 results may be output between 12:00:00 and 13:00:00, so as to fully capture the drainage prediction at intermediate moments within the one hour of the SWMM model run. If the user needs drainage prediction results in real time, output times can be set at intervals of tens of seconds (e.g., 40 s or 50 s), so that the user side sees a dynamically updated result.
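The second control condition can be sketched as a predicate over simulation time; `should_output` is a hypothetical helper, and times are expressed in seconds from the start for simplicity.

```python
def should_output(sim_time, start, end, output_times=(), output_count=None):
    """Second control condition: emit a result either at user-specified
    output times, or at `output_count` evenly spaced moments between
    the start and end times (all values in seconds in this sketch)."""
    if sim_time in output_times:
        return True
    if output_count:
        interval = (end - start) / output_count  # evenly spaced outputs
        offset = sim_time - start
        return offset > 0 and offset % interval == 0
    return False
```

For a 12:00:00 to 13:00:00 run, `output_times=(300, 1800, 1920)` corresponds to the 12:05:00, 12:30:00, and 12:32:00 example, and `output_count=10` to ten results per hour.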
In some embodiments, the customized demand may also include the format of the drainage prediction results, such as text format (.txt), table format (.csv), and so on. After the interface function fetches the corresponding result, the data format can be handled by a subsequent processing module, making the results convenient for the user to inspect.
Fig. 4 illustrates a flowchart of a drainage prediction method based on natural drainage partitioning according to another embodiment of the present application. The drainage prediction method further comprises the following steps. In step 401, the time step and the start and end times of the drainage prediction are determined.
In step 402, the SWMM model performs a processing operation per time step to obtain a drainage prediction result. For example, if the SWMM model is set to start running at 12:00:00 with a time step of 10 s, it produces a drainage prediction result every ten seconds from 12:00:00 onward.
In step 403, based on the drainage prediction result of each time step, it is determined whether the first and second control conditions are satisfied. Where the control conditions include an output object, an output parameter, an output time, and an output frequency, then if during the SWMM model run a time-step boundary is reached, parameter B of pipe A required by the customized demand has been obtained, the user-designated output time has arrived, and the output-frequency requirement can also be met, the first and second control conditions are determined to be satisfied. If the determination in step 403 is yes, step 404 is performed and the drainage prediction result is output.
If the determination in step 403 is no, or after the drainage prediction result has been output, step 405 determines whether the current time has reached the termination time. If the current time is 12:30:00 and the end time is 13:00:00, the determination in step 405 is no, and the loop proceeds to the next simulation step. If the determination in step 405 is yes, step 406 is entered and the computation ends; at this point the MPI parallel application stops computing. In this way, over the computation from start time to end time, the corresponding drainage prediction results are output in a timely manner according to the user's customized demand.
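The loop of steps 401-406 can be sketched as follows, with `step_model`, `conditions_met`, and `emit` as hypothetical callbacks standing in for one SWMM step, the control-condition check, and the result output.

```python
def drainage_prediction_loop(start, end, time_step, step_model,
                             conditions_met, emit):
    """Sketch of the Fig. 4 flow: advance the model one time step at a
    time (step 402), output a result whenever the first and second
    control conditions hold (steps 403-404), and stop once the current
    time reaches the termination time (steps 405-406)."""
    current = start
    while current < end:
        current += time_step
        result = step_model(current)         # one SWMM simulation step
        if conditions_met(current, result):  # first + second control conditions
            emit(current, result)            # output the prediction result
    return current
```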
In some embodiments, the start time, end time, and time step of the drainage prediction may each be configured in the input file, for example in the output-information section under the keyword [REPORT].
In some embodiments, the step-by-step recursive division of the target area into M drainage partitions specifically includes the following steps. Based on the distribution of urban drainage partitions, the target area is divided by a first-level division into N1 first-level sub-drainage partitions A11, A12, ..., A1X. Each of the N1 first-level sub-drainage partitions is then divided by a second-level division into N2 second-level sub-drainage partitions A21, A22, ..., A2X, and so on until the Nth-level division yields NN Nth-level sub-drainage partitions AN1, AN2, ..., ANX, where there is no mutual transfer of water between two different first-level sub-drainage partitions of the N1 first-level sub-drainage partitions, between two different second-level sub-drainage partitions of the N2 second-level sub-drainage partitions, or between two different Nth-level sub-drainage partitions of the NN Nth-level sub-drainage partitions. The sub-drainage partitions at each level can be divided according to requirements such as operation time; dividing down to deeper levels reduces the operation time of the MPI parallel application program, so that a drainage prediction result can be obtained within a suitable operation time.
Fig. 5 shows a schematic diagram of the progressive recursive division of a target area according to an embodiment of the present application. Taking the division method shown in Fig. 5 as an example: a represents the whole undivided target area A; b represents the target area after the first-level division, in which the target area is divided into four first-level sub-drainage partitions (A11, A12, A13, A14); c represents the target area after the second-level division, in which each first-level sub-drainage partition is divided into four second-level sub-drainage partitions, yielding second-level sub-drainage partitions A201–A216; and d represents the target area after the third-level division, in which each second-level sub-drainage partition is subdivided into four third-level sub-drainage partitions, yielding third-level sub-drainage partitions A301–A364. The third-level sub-drainage partitions may also be divided further. It is understood that in the first-level division, the second-level division, and up to the Nth-level division, the number of sub-drainage partitions produced at each level may be chosen according to the distribution of the drainage partitions of the target area, the operation time requirement, or the like. The number of divisions may also differ within a level: for example, the first-level division may divide the target area into four first-level sub-drainage partitions, while the second-level division divides the first first-level sub-drainage partition into 2 second-level sub-drainage partitions, the second into 2, the third into 3, and the fourth into 4. Dividing according to the conditions of the different sub-drainage partitions in this way is beneficial to drainage prediction and analysis.
The division may also be performed according to the operation time. If the operation-time requirement is less strict, more levels can be divided and the resulting sub-drainage partitions are smaller, which improves the fineness of the operation, for example, dividing into five levels of sub-drainage partitions. If the operation-time requirement is stricter, fewer levels can be divided and the resulting sub-drainage partitions are larger, for example, dividing into three levels of sub-drainage partitions, which shortens the operation time.
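The progressive recursive division of Fig. 5 can be sketched as follows, assuming a fixed fan-out per level. The labels are illustrative only (the figure's first-level labels A11–A14 follow a slightly different numbering than this sketch generates):

```python
def partition_levels(fanouts):
    """Progressively divide one target area: at level k, every
    partition from level k-1 is split into fanouts[k-1] children.
    Returns the partition labels per level (labels illustrative)."""
    levels = [["A"]]
    for level, fanout in enumerate(fanouts, start=1):
        count = fanout * len(levels[-1])
        levels.append([f"A{level}{i:02d}" for i in range(1, count + 1)])
    return levels

# Three four-way divisions as in Fig. 5: 1 -> 4 -> 16 -> 64 partitions,
# e.g. second-level labels A201..A216 and third-level labels A301..A364.
levels = partition_levels([4, 4, 4])
```

Passing a different fan-out list, such as `[4, 2, 3]`, models the case described above where the number of divisions differs between levels.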
In some embodiments, where the output object is an inspection well, the output parameters include water depth, head, and overflow volume; where the output object is a pipeline, the output parameters include flow, flow rate, and fullness; where the output object is a node, the output parameters include flow and flow rate; where the output object is a sub-catchment area, the output parameters include rainfall, evaporation, infiltration, and runoff. Different output objects may thus have different output parameters, and the output parameters help to characterize the drainage prediction result.
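The object-to-parameter mapping described above can be represented as a simple lookup table. The identifier names here are illustrative assumptions, not the patent's interfaces; note also that claim 10 lists water depth and head for nodes, whereas this table mirrors the embodiment text above.

```python
# Illustrative lookup table: output parameters reported per
# output-object type, mirroring the description above.
OUTPUT_PARAMETERS = {
    "inspection_well": ("water_depth", "head", "overflow_volume"),
    "pipeline":        ("flow", "flow_rate", "fullness"),
    "node":            ("flow", "flow_rate"),
    "sub_catchment":   ("rainfall", "evaporation", "infiltration", "runoff"),
}

def parameters_for(output_object):
    """Return the output parameters reported for the given object type."""
    return OUTPUT_PARAMETERS[output_object]

params = parameters_for("pipeline")
```

Such a table would let the system match the output objects and output parameters of a customized demand before calling the corresponding interface functions of the swmm dynamic link library.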
Fig. 6 illustrates a block diagram of a drainage prediction system based on natural drainage partitions according to an embodiment of the present application. The drainage prediction system 600 includes an interface 601 and a processor 602. The interface 601 is configured to receive the basic data of the M sub-drainage partitions. The processor 602 is configured to perform a drainage prediction method according to any one of the embodiments of the present application. The drainage prediction system 600 can be applied to drainage prediction analysis of large-scale or ultra-large-scale areas, shortens the operation time of drainage prediction, can output a drainage prediction result at short intervals (for example, every tens of seconds), and is conducive to calculating the drainage capacity of a pipe network in real time.
A processor in the present application may be a processing device that includes one or more general-purpose processing devices, such as a microprocessor, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or the like. More specifically, the processor may be a Complex Instruction Set Computing (CISC) microprocessor, a Reduced Instruction Set Computing (RISC) microprocessor, a Very Long Instruction Word (VLIW) microprocessor, a processor running other instruction sets, or a processor running a combination of instruction sets. The processor may also be one or more special-purpose processing devices such as an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), a System on a Chip (SoC), or the like. The processor may be communicatively coupled to a memory and configured to execute the computer-executable instructions stored thereon.
Furthermore, although exemplary embodiments have been described herein, the scope of the present application includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., across the various embodiments), adaptations, or alterations. The elements in the claims are to be construed broadly based on the language employed in the claims and are not limited to the examples described in the present specification or during the prosecution of the application, which examples are to be construed as non-exclusive. It is intended, therefore, that the specification and examples be considered as exemplary only, with the true scope and spirit being indicated by the following claims and their full scope of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other, and other embodiments may be devised by those of ordinary skill in the art upon reading the above description. In addition, in the above detailed description, various features may be grouped together to streamline the application. This is not to be interpreted as an intention that an unclaimed disclosed feature is essential to any claim; rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with one another in various combinations or permutations. The scope of the application should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
The above embodiments are only exemplary embodiments of the present application and are not intended to limit the present application, the scope of which is defined by the claims. Various modifications and equivalent arrangements of this application will occur to those skilled in the art, and are intended to be within the spirit and scope of the application.

Claims (11)

1. A drainage prediction method based on natural drainage partition, comprising:
based on the distribution of natural drainage partitions, obtaining M sub-drainage partitions through step-by-step recursive division of a target area, wherein there is no mutual transfer of water between two different sub-drainage partitions in the natural drainage partitions;
acquiring basic data of the M sub-drainage partitions, and obtaining M input files for the SWMM model based on the basic data;
utilizing an MPI parallel application program to configure a plurality of processor cores according to the number of input files, so that each input file is bound to its corresponding processor core;
and based on the M input files, calling corresponding processor cores to execute parallel drainage prediction analysis of the SWMM model by utilizing the MPI parallel application program, wherein the operation time of the processor cores for the corresponding input files is smaller than a threshold time so as to obtain a drainage prediction result.
2. The drainage prediction method according to claim 1, wherein configuring the plurality of processor cores according to the number of input files using the MPI parallel application program specifically includes: the MPI parallel application is initialized and transfer parameters are configured at the command line at runtime after compilation to specify and transfer the number of processor cores into the MPI parallel application.
3. The drainage prediction method according to claim 1, wherein the binding operation between each input file and the corresponding processor core specifically includes: numbering the designated processing processes of the plurality of processor cores so that the M input files correspond one-to-one to each process number for each time step:
and each processor core reads the input file of the corresponding process number, and processes the input file by utilizing the process corresponding to the process number to obtain a drainage prediction result.
4. The drainage prediction method according to claim 3, further comprising:
receiving a configuration file set by a user;
identifying the configuration file to obtain a customized requirement;
and according to the customized demand of the user, obtaining a drainage prediction result corresponding to the customized demand by using an interface function of the swmm dynamic link library.
5. The drainage prediction method according to claim 4, wherein the step of obtaining the drainage prediction result corresponding to the customized demand by using the interface function of the swmm dynamic link library according to the customized demand of the user specifically comprises:
the customized demand comprises a first control condition, wherein the first control condition comprises an output object and an output parameter, the sub-drainage partition comprises an inspection well, a pipeline, a node and a sub-catchment area, and the output object comprises a part of the inspection well, the pipeline, the node and the sub-catchment area of a target area or all the inspection well, the pipeline, the node and the sub-catchment area of the target area;
matching the output object and the output parameter with the process number;
and under the condition of at least two output objects, reorganizing the interface functions of the swmm dynamic link library corresponding to different output parameters so as to call out the drainage prediction results of the output parameters of all the output objects included in the customized demand.
6. The drainage prediction method according to claim 5, wherein the customized demand includes a second control condition including an output time and/or an output frequency, the output time being a time of an output result in the drainage prediction process, the output frequency being a number of times of the output result in the drainage prediction process, so that the SWMM model outputs the drainage prediction result based on the output time and/or the output frequency.
7. The drainage prediction method according to claim 6, further comprising:
determining the time step and the starting time and the ending time of drainage prediction;
the SWMM model carries out processing operation based on the time steps to obtain a drainage prediction result;
and outputting the drainage prediction result based on the drainage prediction result obtained in each time step under the condition that the first control condition and the second control condition are met.
8. The drainage prediction method according to claim 1, wherein the step-wise recursively dividing the target area into M drainage partitions specifically comprises:
based on the distribution of urban drainage partitions, dividing the target area by a first-level division into N1 first-level sub-drainage partitions A11, A12, ..., A1X;
for the N1 first-level sub-drainage partitions, dividing each by a second-level division into N2 second-level sub-drainage partitions A21, A22, ..., A2X, until the Nth-level division yields NN Nth-level sub-drainage partitions AN1, AN2, ..., ANX, wherein there is no mutual transfer of water between two different first-level sub-drainage partitions of the N1 first-level sub-drainage partitions, between two different second-level sub-drainage partitions of the N2 second-level sub-drainage partitions, or between two different Nth-level sub-drainage partitions of the NN Nth-level sub-drainage partitions.
9. The drainage prediction method of claim 1, wherein the basis data of the M sub-drainage partitions includes geometrical parameters and physical attribute parameters of drainage facilities of the M sub-drainage partitions.
10. The drainage prediction method according to claim 5, wherein in the case where the output object is an inspection well, the output parameters include a water depth, a water head, and an overflow amount; in the case that the output object is a pipeline, the output parameters include flow, flow rate and fullness; in the case that the output object is a node, the output parameters include a water depth and a water head; in the case that the output object is a sub-catchment area, the output parameters include rainfall, evaporation, infiltration, and runoff.
11. A drainage prediction system based on natural drainage partition, comprising:
an interface configured to: receiving basic data of M sub-drainage partitions;
a processor configured to: the drainage prediction method according to any one of claims 1 to 10 is performed.
CN202310954996.9A 2023-08-01 2023-08-01 Drainage prediction method and system based on natural drainage partition Active CN116703126B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310954996.9A CN116703126B (en) 2023-08-01 2023-08-01 Drainage prediction method and system based on natural drainage partition


Publications (2)

Publication Number Publication Date
CN116703126A true CN116703126A (en) 2023-09-05
CN116703126B CN116703126B (en) 2023-11-14

Family

ID=87824252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310954996.9A Active CN116703126B (en) 2023-08-01 2023-08-01 Drainage prediction method and system based on natural drainage partition

Country Status (1)

Country Link
CN (1) CN116703126B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6480715B1 (en) * 1997-07-25 2002-11-12 Nokia Networks Oy Recovering group data between mobile systems
CN107563019A (en) * 2017-08-17 2018-01-09 河北工程大学 One kind is directed to Complex Underlying Surface city hydrological distribution model digitizing solution
CN112700141A (en) * 2020-12-30 2021-04-23 哈尔滨工业大学 Online analysis method for municipal drainage pipe network
CN116151013A (en) * 2023-02-28 2023-05-23 长江勘测规划设计研究有限责任公司 Method for pushing out design flood of small-river-basin urban river channel


Also Published As

Publication number Publication date
CN116703126B (en) 2023-11-14

Similar Documents

Publication Publication Date Title
US20190301963A1 (en) Processing data for predicting pipe failure
CN102890792A (en) Municipal drainage pipe network decision evaluation method
CN113569524B (en) Method for extracting clock tree based on comprehensive netlist in chip design and application
CN111428350A (en) Method and device for constructing water environment detection coupling model and computer equipment
CN110766792B (en) Grid calculation order coding method based on ArcGIS hydrological analysis tool
US7594203B2 (en) Parallel optimization using independent cell instances
CN116703126B (en) Drainage prediction method and system based on natural drainage partition
CN104573864A (en) Data analysis alarm method based on autoregressive prediction
KR101567540B1 (en) Failure probability adjustment type gis-aided sewerage asset management method
Jin et al. Development of a GIS-based expert system for on-site storm-water management
CN113239502B (en) Artificial intelligence image processing-based urban sewage pipe network simulation construction method
CN100403319C (en) Urban rainwater pipeline design method
CN115204846A (en) Rain and sewage mixed flow inspection well searching method and system based on three-dimensional pipeline model
CN111369102B (en) Method and device for extracting waterlogging risk points
KR101567538B1 (en) Business risk adjustment type gis-aided waterworks asset management method
CN113496305A (en) Data processing method and device
CN107741925B (en) Automatic conversion method and device for underground pipeline survey data
CN113609812A (en) LightGBM-based netlist-level line delay prediction method, device and medium
CN117575375A (en) Urban drainage pipe network flow estimation method and device and terminal equipment
CN113886999B (en) Method for quickly establishing drainage model and GIS (geographic information System) data through CAD (computer aided design) data
CN117828312B (en) Method for managing watershed hydrologic environment and related equipment
CN113920719B (en) Traffic tracing method and electronic equipment
CN118114842A (en) Hydrologic model scheduling method and device, storage medium and electronic equipment
CN112651096A (en) Full-life-cycle automatic adaptation method and system for parameters of coupling model in urban water flow process
CN117034508A (en) Drainage pipe network model catchment partition demarcation method based on pipe network position and earth surface flow direction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A drainage prediction method and system based on natural drainage zoning

Granted publication date: 20231114

Pledgee: Shijiazhuang Luquan Rural Commercial Bank Co.,Ltd.

Pledgor: BEIJING YUNLU TECHNOLOGY CO.,LTD.

Registration number: Y2024980015341