CN110879857B - Tunnel operation data analysis method and system - Google Patents


Info

Publication number
CN110879857B
CN110879857B CN201911183971.3A
Authority
CN
China
Prior art keywords
data
tunnel
coding
management
maintenance
Prior art date
Legal status
Active
Application number
CN201911183971.3A
Other languages
Chinese (zh)
Other versions
CN110879857A (en)
Inventor
高才驰
黄俊
张忠宇
李志远
董飞
陈飞
Current Assignee
JSTI Group Co Ltd
Original Assignee
JSTI Group Co Ltd
Priority date
Filing date
Publication date
Application filed by JSTI Group Co Ltd filed Critical JSTI Group Co Ltd
Priority to CN201911183971.3A priority Critical patent/CN110879857B/en
Publication of CN110879857A publication Critical patent/CN110879857A/en
Application granted granted Critical
Publication of CN110879857B publication Critical patent/CN110879857B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/901 Indexing; Data structures therefor; Storage structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/903 Querying
    • G06F 16/90335 Query processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/08 Construction

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application relates to a tunnel operation data analysis method and system. The method comprises: obtaining tunnel operation and maintenance data by segmenting the tunnel and encoding the management and maintenance objects; configuring data import rules for the tunnel operation data according to the mapping between Solr fields and fields in the tunnel operation data; analyzing data usage requirements and generating corresponding data analysis engine plug-ins from the requirement analysis results; and configuring a Solr runtime framework in a web container, importing the tunnel operation and maintenance data according to the data import rules, and loading the data analysis engine plug-ins for data analysis. By encoding the tunnel management and maintenance objects and configuring and analyzing the tunnel operation data on the basis of Solr, the Solr engine enables near-real-time queries and rapid grouping statistics over massive management and maintenance data, supports queries over unstructured data, and allows custom analysis modules to be introduced to enhance the query capability.

Description

Tunnel operation data analysis method and system
Technical Field
The application relates to the technical field of computers, in particular to a tunnel operation data analysis method and system.
Background
In recent years, infrastructure construction in China has advanced rapidly, with unprecedented investment and a steady stream of new projects, bringing a rare opportunity for the rise and development of tunnel technology. For a tunnel that has been put into service, the core task is to keep it in good running condition at all times, so as to remain "safe and unobstructed". Many tunnel management institutions therefore actively explore new technologies for operation and maintenance management, the most common approach being to build a tunnel management and maintenance platform and to use it to maintain the large volume of civil structures, electromechanical equipment and other facilities in a scientific and fine-grained way.
A traditional tunnel management and maintenance platform stores its data in a relational database accessed through interfaces such as Java Database Connectivity (JDBC). With the continuous, rapid growth of management and maintenance data and the increase of unstructured data of various kinds, a tunnel management and maintenance information system built on this architecture becomes inefficient; in particular, as the proportion of unstructured data grows larger and larger, a relational database struggles to process it efficiently, making the analysis of operation and maintenance data problematic.
It can be seen that existing management platforms built on relational databases cannot efficiently process the growing volume of unstructured data.
These drawbacks remain to be overcome by those skilled in the art.
Disclosure of Invention
First, the technical problem to be solved
In order to solve the problems in the prior art, the application provides a tunnel operation data analysis method and system, which address the problem that a management and maintenance platform built on a relational database cannot efficiently process the ever-increasing amount of unstructured data.
(II) technical scheme
In order to achieve the above purpose, the main technical solution adopted by the application is as follows:
An embodiment of the present application provides a tunnel operation data analysis method, including:
obtaining tunnel operation and maintenance data by segmenting the tunnel and encoding the management and maintenance objects;
configuring data import rules for the tunnel operation data according to the mapping between Solr fields and fields in the tunnel operation data;
analyzing data usage requirements, and generating corresponding data analysis engine plug-ins according to the requirement analysis results;
and configuring a Solr runtime framework in a web container, importing the tunnel operation and maintenance data according to the data import rules, and loading the data analysis engine plug-ins for data analysis.
In one embodiment of the application, the containers that hold the tunnel operation and maintenance data are arranged linearly along the tunnel, so that the tunnel operation and maintenance data form a chain-like data set.
In one embodiment of the application, obtaining the tunnel operation data by segmenting the tunnel and encoding the management and maintenance objects includes:
segment-coding the tunnel, classifying and coding the management and maintenance objects in the tunnel, and establishing the digital codes and coding rules;
establishing management and maintenance task data tables for the tasks that describe the management and maintenance of tunnel equipment;
and establishing, on the basis of Solr cores, a mapping between the management and maintenance object core and the fields of the management and maintenance task data tables, to obtain the tunnel operation and maintenance data.
In one embodiment of the application, segment-coding the tunnel, classifying and coding the management and maintenance objects in the tunnel, and establishing the digital codes and coding rules include:
coding the different tunnels to obtain tunnel codes;
segmenting the tunnel and coding by segment to obtain segment codes;
organizing the management and maintenance objects within a tunnel segment in a tree structure by the area and system they belong to, and classifying and coding them to obtain equipment codes;
coding the components within a management and maintenance object to obtain component codes;
and establishing the rule that concatenates tunnel code, segment code, equipment code and component code in sequence.
In one embodiment of the application, the management and maintenance task data tables comprise: a core information description table, an inspection item description table, an inspection task definition table, an inspection task allocation description table, a personnel schedule table and an inspection result record table.
In one embodiment of the application, the data analysis engine plug-in overrides functions in two classes, StandardRequestHandler and SearchComponent: the functions in the StandardRequestHandler class parse and process query requests, and the functions in the SearchComponent class execute the corresponding data processing logic according to the requirement analysis results.
In one embodiment of the present application, further comprising:
and calling a data analysis result, and verifying the functional effectiveness of the data analysis engine according to the data analysis result.
Another embodiment of the present application further provides a tunnel operation data analysis system, including:
a data encoding module, configured to obtain tunnel operation and maintenance data by segmenting the tunnel and encoding the management and maintenance objects;
a rule configuration module, configured to configure data import rules for the tunnel operation data according to the mapping between Solr fields and fields in the tunnel operation data;
a requirement analysis module, configured to analyze data usage requirements and generate corresponding data analysis engine plug-ins according to the requirement analysis results;
and a data analysis module, configured to configure a Solr runtime framework in a web container, import the tunnel operation and maintenance data according to the data import rules, and load the data analysis engine plug-ins for data analysis.
In one embodiment of the application, the web container in the data analysis module employs Jetty.
In one embodiment of the present application, further comprising:
and the prototype verification module is used for calling the data analysis result and verifying the functional effectiveness of the data analysis engine according to the data analysis result.
(III) beneficial effects
The beneficial effects of the application are as follows: in the tunnel operation data analysis method and system provided by the embodiments of the application, the tunnel management and maintenance objects are encoded and the tunnel operation data is configured and analyzed on the basis of Solr; by applying the Solr engine, near-real-time queries and rapid grouping statistics can be achieved over massive management and maintenance data, queries over unstructured data are supported, and custom analysis modules can be introduced to enhance the query capability.
Drawings
FIG. 1 is a flow chart of a tunnel operation data analysis method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a tunnel operation data analysis system according to an embodiment of the present application;
FIG. 3 is a diagram of the file directory after the new folders have been created according to an embodiment of the present application.
Detailed Description
The application will be better explained by the following detailed description of the embodiments with reference to the drawings.
All technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
To address the low data analysis efficiency and the limited depth and breadth of analysis of tunnel operation and maintenance application systems built on traditional relational databases, under conditions of continuously growing data volume and an ever larger proportion of unstructured data, the application provides a Solr-based tunnel operation and maintenance data analysis method.
The Solr-based tunnel operation and maintenance (management) data analysis method and system provided by the application aim to make fuller use of the data, focusing on efficient multidimensional analysis of the accumulated historical data, an area where the traditional database approach cannot escape the reality of insufficient comprehensive analysis capability.
Fig. 1 is a flowchart of a tunnel operation data analysis method according to an embodiment of the present application. As shown in fig. 1, the method includes the following steps:
As shown in fig. 1, in step S110, tunnel operation and maintenance data is obtained by segmenting the tunnel and encoding the management and maintenance objects;
As shown in fig. 1, in step S120, data import rules are configured for the tunnel operation data according to the mapping between Solr fields and fields in the tunnel operation data;
As shown in fig. 1, in step S130, data usage requirements are analyzed, and corresponding data analysis engine plug-ins are generated according to the requirement analysis results;
As shown in fig. 1, in step S140, a Solr runtime framework is configured in a web container, the tunnel operation and maintenance data is imported according to the data import rules, and the data analysis engine plug-ins are loaded for data analysis.
Correspondingly, in order to implement the method, another embodiment of the application provides a Solr-based tunnel operation data analysis system. Its data analysis engine can be combined with an application system conveniently: the engine uses the standard HTTP protocol and can be called from any module of the application system.
Fig. 2 is a schematic diagram of a tunnel operation data analysis system according to an embodiment of the present application. As shown in fig. 2, the system 200 includes: a data encoding module 210, a rule configuration module 220, a requirement analysis module 230 and a data analysis module 240.
The data encoding module 210 is configured to obtain tunnel operation and maintenance data by segmenting the tunnel and encoding the management and maintenance objects; the rule configuration module 220 is configured to configure data import rules for the tunnel operation data according to the mapping between Solr fields and fields in the tunnel operation data; the requirement analysis module 230 is configured to analyze data usage requirements and generate corresponding data analysis engine plug-ins according to the requirement analysis results; and the data analysis module 240 is configured to configure a Solr runtime framework in a web container, import the tunnel operation and maintenance data according to the data import rules, and load the data analysis engine plug-ins for data analysis.
The tunnel operation data analysis method and system of the present disclosure are described in detail below in conjunction with figs. 1 and 2:
in step S110, tunnel operation and maintenance data is obtained by segmenting a tunnel and encoding a management object.
In an embodiment of the present application, step S110 in fig. 1 includes the following steps:
In step S201, the tunnel is segmented, the management and maintenance objects in the tunnel are classified and coded, and the digital codes and coding rules are established.
The method specifically comprises the following steps:
1) Coding the different tunnels to obtain tunnel codes.
Each tunnel is coded so that only the tunnel name and its sequence number are reflected; a tunnel is represented by a 2-digit number, as shown in Table 1.
TABLE 1
Sequence number | Code (bits 1-2) | Tunnel name
1               | 01              | Yangtze River Tunnel
2               | 02              | Xuanwu Lake Tunnel
In this embodiment, the tunnel name is defined by the first code group, which facilitates managing several tunnels; if further tunnels exist, codes are simply appended, making the scheme easy to extend.
2) Segmenting the tunnel and coding by segment to obtain segment codes.
The tunnel is segmented and each segment is coded with a 2-digit number, as shown in Table 2.
TABLE 2
Sequence number | Bit 3 | Bit 4 | Ring segment or chainage section | Remarks
1               | 0     | 1     | Ring segments 1# to 10#          | Counted from the shield origin
2               | 0     | 2     | K5~K6                            | Segmented by pile-number (chainage) intervals
Segmenting the tunnel makes it convenient to implement coarse-grained query, statistics, retrieval and similar functions.
3) Organizing the management and maintenance objects within a tunnel segment in a tree structure by the area and system they belong to, then classifying and coding them to obtain equipment codes.
In line with the management practice of the tunnel industry, the facilities and equipment are divided into different management disciplines, which makes data analysis from the perspective of system classification convenient. The area and system to which each device or facility belongs are expressed in a tree structure. The facilities and equipment within a tunnel segment are classified by system and coded with 2-digit numbers, as shown in Table 3.
TABLE 3
Sequence number | Code (bits 5-6) | System classification
1               | 01              | Ventilation system
2               | 02              | Power supply system
3               | 03              | Lighting system
4               | 04              | CCTV subsystem
5               | 05              | Water supply and drainage system
Further, the groups of electromechanical devices or civil works are coded; given their relatively large number, a 3-digit number is used, as shown in Table 4. This allows the code to pinpoint a specific type of device, such as a fan or a light fixture.
TABLE 4
Sequence number | Code (bits 7-9) | Individual equipment
1               | 001             | Axial-flow fan
2               | 002             | Mixed-flow fan
3               | 003             | Camera
4               | 004             | Server
5               | 005             | Distribution box
4) Coding the components within a management and maintenance object to obtain component codes.
Since some key components sit under a particular device, they also need to be coded if they are included in the maintenance object list. The components of a single piece of electromechanical equipment or civil structure are therefore coded; given their relatively large number, a 3-digit number is used, as shown in Table 5.
TABLE 5
Sequence number | Code (bits 13-15) | Component
1               | 001               | Component 001
2               | 002               | Component 002
3               | 003               | Component 003
5) Establishing the rule that concatenates tunnel code, segment code, equipment code and component code in sequence.
By concatenating these codes in sequence, a complete code is formed; the complete coding system is shown in Table 6:
TABLE 6
Illustrating:
01 - Yangtze River Tunnel
0101 - Ring segments 1# to 5# of the Yangtze River Tunnel
010101 - Ventilation system in ring segments 1# to 5# of the Yangtze River Tunnel
010101001 - Axial-flow fan group in ring segments 1# to 5# of the Yangtze River Tunnel
010101001001 - First fan of the axial-flow fan group in ring segments 1# to 5# of the Yangtze River Tunnel
010101001001001 - Impeller of the first fan of the axial-flow fan group in ring segments 1# to 5# of the Yangtze River Tunnel
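To make the rule concrete, the following is a minimal sketch of how such a code could be composed and decomposed (this is illustrative only and not part of the original disclosure: the class and method names are invented, and the field widths 2+2+2+3+3+3 digits are inferred from Tables 1 to 5 and the example above, with bits 10-12 assumed to identify the individual device within its group):

    // Sketch of the hierarchical management-object code described above.
    // Assumed layout: tunnel(2) + segment(2) + system(2) + device group(3)
    //                 + individual device(3) + component(3) = 15 digits.
    public final class MaintainObjectCode {

        public static String compose(int tunnel, int segment, int system,
                                      int group, int device, int component) {
            return String.format("%02d%02d%02d%03d%03d%03d",
                    tunnel, segment, system, group, device, component);
        }

        // Splits a (possibly truncated) code back into its numeric groups.
        public static int[] decompose(String code) {
            int[] widths = {2, 2, 2, 3, 3, 3};
            int[] groups = new int[widths.length];
            int pos = 0;
            for (int i = 0; i < widths.length && pos + widths[i] <= code.length(); i++) {
                groups[i] = Integer.parseInt(code.substring(pos, pos + widths[i]));
                pos += widths[i];
            }
            return groups;
        }

        public static void main(String[] args) {
            // Impeller of the first fan in the axial-flow fan group,
            // ring segments 1# to 5# of the Yangtze River Tunnel:
            System.out.println(compose(1, 1, 1, 1, 1, 1)); // prints 010101001001001
        }
    }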
In step S202, management and maintenance task data tables are created for the tasks that describe the management and maintenance of tunnel equipment.
In this step, the management and maintenance task data tables include a core information description table, an inspection item description table, an inspection task definition table, an inspection task allocation description table, a personnel schedule table and an inspection result record table, among others, as shown in Tables 7 to 12 below.
TABLE 7
Field                             | Type         | Remarks
Internal unique number UUID       | Varchar(32)  | Primary key
Maintenance object number OBJ_ID  | Varchar(32)  | Coded according to the rules above
Maintenance object name           | Varchar(255) | Described in Chinese
Maintenance object position       | Varchar(255) |
……
TABLE 8
TABLE 9
Table 10
TABLE 11
Table 12
Field                       | Type        | Remarks
Internal unique number UUID | Varchar(32) | Primary key
Inspector USER_ID           | Varchar(32) | References the primary key of the user table
Maintenance object OBJ_ID   | Varchar(32) | References the primary key OBJ_ID in Table 7
Inspection item ID          | Varchar(32) | References the primary key UUID in Table 8
Inspection result           | Int         |
Inspection time             | Datetime    |
……
In step S203, a mapping between the management and maintenance object core and the fields of the management and maintenance task data tables is built on the basis of Solr cores, so as to obtain the tunnel operation and maintenance data.
Solr is a search engine system that encapsulates the complete set of indexing operations; it can run independently in servlet containers such as Jetty and Tomcat. Solr is configurable and extensible, and it optimizes indexing and search performance.
In this step, once the management and maintenance objects have been encoded, the operation and maintenance forms, operation and maintenance tasks and other tables can be designed and associated with the objects through their codes. The tables mapped to the management and maintenance object core comprise the maintenance object table and the inspection content table; the management and maintenance task data correspond to a maintenance-object inspection record core, whose mapped tables are the inspection content table, the inspection result record table and the task schedule table.
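Purely as an illustration of this mapping (not the original implementation; the core name, field names and Solr URL below are assumptions), an inspection record carrying the management-object code could be indexed into its Solr core with SolrJ roughly as follows:

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.common.SolrInputDocument;

    public class InspectionRecordIndexer {
        public static void main(String[] args) throws Exception {
            // Hypothetical core holding the maintenance-object inspection records
            SolrClient solr = new HttpSolrClient.Builder(
                    "http://localhost:8080/solr/inspection_record").build();

            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "uuid-0001");                     // internal unique number (UUID)
            doc.addField("obj_id", "010101001001001");           // management-object code, see Table 7
            doc.addField("item_id", "uuid-item-0001");           // inspection item, see Table 8
            doc.addField("result", 1);                           // inspection result
            doc.addField("check_time", "2019-11-27T08:30:00Z");  // inspection time

            solr.add(doc);   // index the record
            solr.commit();   // make it searchable
            solr.close();
        }
    }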
In one embodiment of the application, the tunnel is segmented and coded, each segment is treated as a container, and the civil structures, electromechanical equipment and other contents requiring maintenance within the segment are treated as "articles" in the container. The articles in a container are coded in combination with their professional classification; the coding rule on the one hand guarantees uniqueness and on the other hand leaves sufficient room for extension. That extension space is filled with relatively static content such as maintenance standards and maintenance methods, and associations can be established with dynamic information such as the scheduling and execution records of maintenance tasks and structural health monitoring reports. In this step, the containers holding the tunnel operation and maintenance data are arranged linearly along the tunnel, and the tunnel operation and maintenance data as a whole form a chain-like data set.
In one embodiment of the application, the operation and maintenance data may be relational or unstructured; for example, import from mainstream relational databases such as Oracle, SQL Server, MySQL and PostgreSQL is supported.
On the basis of this step, the tunnel operation and maintenance objects are uniformly encoded, professional classification and regional division are carried out on top of the codes, each region is bound to a QR code or RFID tag (which can be made globally unique if required), and information such as equipment manuals and operation and maintenance process data is associated with the object code.
In step S120, data import rules are configured for the tunnel operation data according to the mapping between Solr fields and fields in the tunnel operation data.
In this step, data are organized through Solr "cores". Before the data are imported, separate cores are built for the management and maintenance data from the perspectives of basic information, maintenance standards and requirements, operation and maintenance records and so on; the relational database fields are then mapped to Solr fields in the configuration, and the data import rules are configured.
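In the patent the import rules are configured declaratively in Solr; as a programmatic sketch of the same field mapping (the JDBC URL, table name, column names and core name are assumptions made for illustration), the import could be written as:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.common.SolrInputDocument;

    public class MaintainObjectImport {
        public static void main(String[] args) throws Exception {
            // Assumed JDBC URL and table; in the patent these rules live in the Solr import configuration
            try (Connection db = DriverManager.getConnection(
                         "jdbc:mysql://localhost:3306/tunnel", "user", "password");
                 SolrClient solr = new HttpSolrClient.Builder(
                         "http://localhost:8080/solr/maintain_object").build()) {

                Statement st = db.createStatement();
                ResultSet rs = st.executeQuery(
                        "SELECT uuid, obj_id, obj_name, obj_position FROM maintain_object");
                while (rs.next()) {
                    SolrInputDocument doc = new SolrInputDocument();
                    doc.addField("id", rs.getString("uuid"));            // relational UUID -> Solr unique key
                    doc.addField("obj_id", rs.getString("obj_id"));      // management-object code
                    doc.addField("obj_name", rs.getString("obj_name"));
                    doc.addField("obj_position", rs.getString("obj_position"));
                    solr.add(doc);
                }
                solr.commit();
            }
        }
    }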
In step S130, the data usage requirements are analyzed, and a corresponding data analysis engine plug-in is generated according to the requirement analysis results.
In this step, use is made of the good extensibility of Solr as an open-source full-text search engine, with the open-source servlet container Jetty providing it as an independent service accessible over HTTP, web services and similar interfaces. Although Solr ships with a number of conventional query functions, meeting the continuously growing data analysis requirements of tunnel management and maintenance requires users to develop their own plug-ins to extend Solr; this extensibility is essential for building a data analysis engine. The data types supported by Solr are very rich and include dynamic fields. Solr can connect to common relational databases such as Oracle, MySQL and DB2, automatically import data according to configured rules, and allow users to query and edit the data by sending URL requests.
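For instance, a grouping-statistics query, here counting inspection records per system classification within one tunnel, could be issued through SolrJ as in the following sketch (the core and field names are assumed for illustration):

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.response.FacetField;
    import org.apache.solr.client.solrj.response.QueryResponse;

    public class SystemFacetQuery {
        public static void main(String[] args) throws Exception {
            SolrClient solr = new HttpSolrClient.Builder(
                    "http://localhost:8080/solr/inspection_record").build();

            SolrQuery q = new SolrQuery("obj_id:01*");   // all objects of tunnel "01" (Yangtze River Tunnel)
            q.setFacet(true);
            q.addFacetField("system_code");               // assumed field holding bits 5-6 of the object code
            q.setRows(0);                                 // only the grouped counts are needed

            QueryResponse rsp = solr.query(q);
            FacetField systems = rsp.getFacetField("system_code");
            for (FacetField.Count c : systems.getValues()) {
                System.out.println(c.getName() + " -> " + c.getCount());
            }
            solr.close();
        }
    }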
The generated data analysis engine plug-in overrides functions in two classes, StandardRequestHandler and SearchComponent: the functions in the StandardRequestHandler class parse and process the query request, and the functions in the SearchComponent class execute the corresponding data processing logic according to the requirement analysis results.
In one embodiment of the application, the constructed data analysis engine conforms to the Servlet 2.5+ specification, offers HTTP, web-service and other access modes, and can be deployed on Windows, Linux and other operating systems; the prototype system runs on Windows, macOS and Linux and requires a browser of Chrome 70+ or IE 9+.
In one embodiment of the application, Solr is extended through custom plug-ins to meet the multi-angle analysis requirements of the tunnel management and maintenance data. "Extension" here means that the user writes the code: the data analysis engine plug-in overrides functions in the StandardRequestHandler and SearchComponent classes and registers the plug-ins with Solr; the functions in the StandardRequestHandler class parse and process query requests, and the functions in the SearchComponent class execute the corresponding data processing logic according to the requirement analysis results. Because Solr is an extensible service, custom packages and classes can be added simply by layering custom search logic on top of the default processing logic that Solr already implements. The implementation approach is to inherit from the Solr base classes and override new SearchComponent and StandardRequestHandler classes. The StandardRequestHandler class parses and processes the query request: it directly obtains the parameters passed in the HTTP URL and then dispatches tasks according to a predefined configuration so that each component can process the query. The SearchComponent class is the core data processing logic of the software: it accepts the parameters handed over by the RequestHandler and executes the corresponding data processing logic according to the requirements; common processing functions include submitting the query to Lucene and returning a result list, finding and returning documents similar to each search result, and highlighting the position of the query terms in the result text, among others.
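A minimal sketch of such a custom component is shown below; it is an assumption of what an extension might look like rather than the original implementation, the class and parameter names are invented, and the component would additionally have to be registered in solrconfig.xml:

    import org.apache.solr.handler.component.ResponseBuilder;
    import org.apache.solr.handler.component.SearchComponent;

    // Custom analysis component: adds a named section to the response
    // summarizing inspection results for a management-object code prefix.
    public class MaintainStatsComponent extends SearchComponent {

        @Override
        public void prepare(ResponseBuilder rb) {
            // Nothing to prepare in this sketch; parameters are read in process().
        }

        @Override
        public void process(ResponseBuilder rb) {
            // "objPrefix" is an illustrative request parameter, e.g. &objPrefix=0101
            String objPrefix = rb.req.getParams().get("objPrefix");
            if (objPrefix != null) {
                // Real logic would query the index (e.g. via rb.req.getSearcher())
                // and aggregate inspection results; here we only echo the parameter.
                rb.rsp.add("maintainStats", "statistics for objects under " + objPrefix);
            }
        }

        @Override
        public String getDescription() {
            return "Tunnel management and maintenance statistics component (sketch)";
        }
    }

In line with the description above, a request handler derived from StandardRequestHandler would then parse the URL parameters and dispatch to a component of this kind according to its configuration.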
In step S140, a Solr runtime framework is configured in the web container, the tunnel operation and maintenance data is imported according to the data import rules, and the data analysis engine plug-ins are loaded for data analysis.
Because the data analysis engine is an independent application module, the intelligent terminals, web application clients, data warehouse and analysis center exchange data with the web side over 3G/4G or Wi-Fi networks using the HTTP or WebSocket protocol, and the server side of the web application communicates with the data warehouse over TCP.
In one embodiment of the application, the web container in the data analysis module 240 is Jetty. A Solr runtime framework is configured in Jetty, several cores are created from the management and maintenance information data chain according to the Solr field type requirements, the external data import and type conversion rules are configured, and custom Solr plug-ins are written according to the tunnel management and maintenance requirements to implement specific functions. A data analysis application prototype is built with Spring Boot, Vue and Element-UI and published in Jetty, and this prototype verification system demonstrates through HTTP calls that the data analysis engine design is feasible and practical. Jetty publishes the Solr support libraries in servlet mode: using JDK 1.8, Jetty 9 and Solr 7.x, the web application is published according to the Jetty operating guide and the Solr runtime libraries are configured in the server.
In one embodiment of the application, a web system is built with Spring Boot as the back-end framework and Vue plus Element-UI as the front-end framework. The system mainly wraps the search service as an interface for external callers and provides interface pages that display the effect of the calls; wrapping the service as an interface both protects the data processing logic and simplifies the use of the service.
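As a rough sketch of such a wrapper (an assumption rather than the original prototype; the endpoint path, core name and field name are illustrative), a Spring Boot controller could forward a keyword query to Solr and return the result list:

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.common.SolrDocumentList;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RequestParam;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class MaintainSearchController {

        // SolrJ client pointing at the (assumed) maintenance-object core
        private final SolrClient solr =
                new HttpSolrClient.Builder("http://localhost:8080/solr/maintain_object").build();

        // GET /search?keyword=fan  ->  returns matching maintenance objects as a document list
        @GetMapping("/search")
        public SolrDocumentList search(@RequestParam("keyword") String keyword) throws Exception {
            SolrQuery query = new SolrQuery("obj_name:" + keyword);
            query.setRows(20);
            return solr.query(query).getResults();
        }
    }

The Vue and Element-UI front end described above would then call an endpoint of this kind over HTTP and render the returned documents.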
In this embodiment, the analysis method is described using tunnel operation (i.e. management and maintenance) data as an example. For this and similar information systems, the emphasis in later stages of use gradually shifts to analytical applications over the accumulated stock data, so a data analysis engine with powerful analysis functions and good extensibility, which can be conveniently combined with the application system, is required. What remains is to collect and analyze user demands and to write analysis engine plug-ins with the various functions needed to satisfy the business requirements for data analysis.
In one embodiment of the application, the data analysis further comprises:
and calling a data analysis result, and verifying the functional effectiveness of the data analysis engine according to the data analysis result.
This step can be implemented with a prototype verification module. Specifically, it is realized as a system with an overall B/S (browser/server) structure that uses Spring Boot as the back-end framework and Vue plus Element-UI as the front-end framework; the system is mainly used to verify the functional effectiveness of the data analysis engine, and it can wrap the data analysis service as an interface for external callers and provide interface pages that display the effect of the calls.
For example, in a specific application scenario, the process of implementing the data search engine and the data analysis engine is as follows:
1) Configure the Solr runtime environment in Jetty. Download JDK 1.8, Jetty 9 and Solr 7 from their respective official websites; install the JDK on the D: drive (e.g. D:\jdk1.8.0_211, adding it to the system PATH variable) and unpack Jetty to the D: drive (e.g. D:\jetty9.4.18, changing the port in start.ini to 8080); then create a batch file jetty.bat in the Jetty directory whose content is %JAVA_HOME%\bin\java -jar start.jar.
2) Create a folder named solr under the webapps directory of Jetty, copy into it all files from the server/solr-webapp/webapp directory of the downloaded Solr, and then enter the WEB-INF directory and create a folder named classes. FIG. 3 is a diagram of the file directory after these folders have been created according to an embodiment of the present application; by this step the directory structure shown in FIG. 3 should have been formed.
3) Copy all of the jar files under server\lib of the downloaded Solr 7, as individual files, into the WEB-INF\lib folder mentioned in the previous step.
4) In the file directory shown in fig. 3, create a solr folder at the same level as webapps. Copy the solr.xml file under the server directory of Solr 7 into this folder, then create inside it a folder named tunnel with two subfolders, conf and data. Copy all files under server\solr\configsets\_default\conf of Solr 7 into conf.
5) Run the jetty.bat file under the jetty9.4.18 directory and open http://localhost:8080/solr in a browser. At this point an interface indicating that Solr is running successfully should be seen, and the result is displayed on that page.
In summary, the data analysis engine can be combined with an application system conveniently: it uses the standard HTTP protocol and can be called from any module of the application system. Compared with plain database storage, Solr offers better text retrieval performance, higher efficiency, a flexible caching function and powerful custom plug-ins, and Solr-based queries completely eliminate SQL injection vulnerabilities. Applying the Solr technology in this patent can therefore be of great value in the green tunnel management and maintenance data analysis scenario. After a green tunnel management and maintenance system has been in operation for a long time, massive data accumulate and the traditional database model faces a heavy performance load. With the Solr engine, in a massive-data management environment, near-real-time queries and rapid grouping statistics can be achieved, queries over unstructured data are supported, and custom analysis modules can be introduced to enhance the query capability. By combining this with the specific tasks of green tunnel management and maintenance, Solr analysis plug-ins tailored to the green tunnel management and maintenance business can be registered with the Solr engine, presenting analysis and statistical results in a form better suited to the actual business. Such plug-ins can greatly improve the data analysis efficiency and capability of the application system.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (7)

1. A tunnel operation data analysis method, comprising:
obtaining tunnel operation and maintenance data by segmenting the tunnel and encoding the management and maintenance objects;
configuring data import rules for the tunnel operation data according to the mapping between Solr fields and fields in the tunnel operation data;
analyzing data usage requirements, and generating corresponding data analysis engine plug-ins according to the requirement analysis results;
and configuring a Solr runtime framework in a web container, importing the tunnel operation and maintenance data according to the data import rules, and loading the data analysis engine plug-ins for data analysis;
wherein obtaining the tunnel operation data by segmenting the tunnel and encoding the management and maintenance objects comprises:
segment-coding the tunnel, classifying and coding the management and maintenance objects in the tunnel, and establishing the digital codes and coding rules;
establishing management and maintenance task data tables for the tasks that describe the management and maintenance of tunnel equipment;
and establishing, on the basis of Solr, a mapping between the management and maintenance object core and the fields of the management and maintenance task data tables, to obtain the tunnel operation and maintenance data;
wherein segment-coding the tunnel, classifying and coding the management and maintenance objects in the tunnel, and establishing the digital codes and coding rules comprise:
coding the different tunnels to obtain tunnel codes;
segmenting the tunnel and coding by segment to obtain segment codes;
organizing the management and maintenance objects within a tunnel segment in a tree structure by the area and system they belong to, and classifying and coding them to obtain equipment and facility codes;
coding the components within a management and maintenance object to obtain component codes;
and establishing the rule that concatenates tunnel code, segment code, equipment code and component code in sequence;
and wherein the management and maintenance task data tables comprise: a core information description table, an inspection item description table, an inspection task definition table, an inspection task allocation description table, a personnel schedule table and an inspection result record table.
2. The tunnel operation data analysis method according to claim 1, wherein the containers storing the tunnel operation data are arranged linearly along the tunnel, and the tunnel operation data form a chain-like data set.
3. The tunnel operation data analysis method according to claim 1, wherein the data analysis engine plug-in overrides functions in both the StandardRequestHandler and SearchComponent classes, the functions in the StandardRequestHandler class being used to parse and process query requests, and the functions in the SearchComponent class being used to execute the corresponding data processing logic according to the requirement analysis results.
4. The tunnel operation data analysis method according to claim 1, further comprising:
calling the data analysis results, and verifying the functional effectiveness of the data analysis engine according to those results.
5. A tunnel operation data analysis system, comprising:
a data encoding module, configured to obtain tunnel operation and maintenance data by segmenting the tunnel and encoding the management and maintenance objects;
a rule configuration module, configured to configure data import rules for the tunnel operation data according to the mapping between Solr fields and fields in the tunnel operation data;
a requirement analysis module, configured to analyze data usage requirements and generate corresponding data analysis engine plug-ins according to the requirement analysis results;
and a data analysis module, configured to configure a Solr runtime framework in a web container, import the tunnel operation and maintenance data according to the data import rules, and load the data analysis engine plug-ins for data analysis;
wherein the data encoding module is specifically configured to: segment-code the tunnel, classify and code the management and maintenance objects in the tunnel, and establish the digital codes and coding rules; establish management and maintenance task data tables for the tasks that describe the management and maintenance of tunnel equipment; and establish, on the basis of Solr, a mapping between the management and maintenance object core and the fields of the management and maintenance task data tables, to obtain the tunnel operation and maintenance data;
the data encoding module is further specifically configured to: code the different tunnels to obtain tunnel codes; segment the tunnel and code by segment to obtain segment codes; organize the management and maintenance objects within a tunnel segment in a tree structure by the area and system they belong to, and classify and code them to obtain equipment and facility codes; code the components within a management and maintenance object to obtain component codes; and establish the rule that concatenates tunnel code, segment code, equipment code and component code in sequence;
and the management and maintenance task data tables comprise: a core information description table, an inspection item description table, an inspection task definition table, an inspection task allocation description table, a personnel schedule table and an inspection result record table.
6. The tunnel operation data analysis system according to claim 5, wherein the web container in the data analysis module employs Jetty.
7. The tunnel operation data analysis system according to claim 5, further comprising:
a prototype verification module, configured to call the data analysis results and verify the functional effectiveness of the data analysis engine according to those results.
CN201911183971.3A 2019-11-27 2019-11-27 Tunnel operation data analysis method and system Active CN110879857B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911183971.3A CN110879857B (en) 2019-11-27 2019-11-27 Tunnel operation data analysis method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911183971.3A CN110879857B (en) 2019-11-27 2019-11-27 Tunnel operation data analysis method and system

Publications (2)

Publication Number Publication Date
CN110879857A CN110879857A (en) 2020-03-13
CN110879857B true CN110879857B (en) 2023-11-07

Family

ID=69730367

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911183971.3A Active CN110879857B (en) 2019-11-27 2019-11-27 Tunnel operation data analysis method and system

Country Status (1)

Country Link
CN (1) CN110879857B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113161009A (en) * 2021-04-22 2021-07-23 山东健康医疗大数据有限公司 Method for real-time monitoring of novel coronavirus nucleic acid detection

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105138592A (en) * 2015-07-31 2015-12-09 武汉虹信技术服务有限责任公司 Distributed framework-based log data storing and retrieving method
CN106126729A (en) * 2016-07-01 2016-11-16 交通运输部路网监测与应急处置中心 Electronic map kilometer-post data acquisition and update method
CN109697200A (en) * 2018-12-18 2019-04-30 厦门商集网络科技有限责任公司 HBase secondary index method and apparatus based on Solr
CN110134728A (en) * 2019-05-09 2019-08-16 浪潮软件集团有限公司 Method and system for providing map spatial data based on full-text search

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105138592A (en) * 2015-07-31 2015-12-09 武汉虹信技术服务有限责任公司 Distributed framework-based log data storing and retrieving method
CN106126729A (en) * 2016-07-01 2016-11-16 交通运输部路网监测与应急处置中心 Electronic map kilometer-post data acquisition and update method
CN109697200A (en) * 2018-12-18 2019-04-30 厦门商集网络科技有限责任公司 HBase secondary index method and apparatus based on Solr
CN110134728A (en) * 2019-05-09 2019-08-16 浪潮软件集团有限公司 Method and system for providing map spatial data based on full-text search

Also Published As

Publication number Publication date
CN110879857A (en) 2020-03-13

Similar Documents

Publication Publication Date Title
US11755628B2 (en) Data relationships storage platform
US11132384B2 (en) Generating a multi-column index for relational databases by interleaving data bits for selectivity
US11789978B2 (en) System and method for load, aggregate and batch calculation in one scan in a multidimensional database environment
US20230084389A1 (en) System and method for providing bottom-up aggregation in a multidimensional database environment
US20190102447A1 (en) System and method for metadata sandboxing and what-if analysis in a multidimensional database environment
US10540383B2 (en) Automatic ontology generation
US20210056102A1 (en) System and method for data organization, optimization and analytics
CN103440273B (en) A kind of data cross-platform migration method and device
CN110168522B (en) Maintaining data lineage to detect data event
US11921750B2 (en) Database systems and applications for assigning records to chunks of a partition in a non-relational database system with auto-balancing
US20150220527A1 (en) Database table format conversion based on user data access patterns in a networked computing environment
US9201700B2 (en) Provisioning computer resources on a network
US20230359627A1 (en) Sharing compiled code for executing queries across query engines
CN110737729A (en) Engineering map data information management method based on knowledge map concept and technology
CN115794839B (en) Data collection method based on Php+Mysql system, computer equipment and storage medium
CN110879857B (en) Tunnel operation data analysis method and system
CN106599241A (en) Big data visual management method for GIS software
CN112182138A (en) Catalog making method and device
Sharma et al. Modeling ETL Process for data warehouse: an exploratory study
US8745008B2 (en) Propagating per-custodian preservation and collection requests between ediscovery management applications and content archives
Bok et al. Provenance compression scheme based on graph patterns for large RDF documents
CN106156904B (en) Cross-platform virtual asset tracing method based on eID
CN116719799A (en) Environment-friendly data management method, device, computer equipment and storage medium
Farooq The data warehouse virtualization framework for operational business intelligence
Wang et al. A dynamic data integration model based on SOA

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant