CN113608981A - Time series database testing method and device, computer equipment and storage medium

Time series database testing method and device, computer equipment and storage medium

Info

Publication number
CN113608981A
CN113608981A
Authority
CN
China
Prior art keywords
time sequence
sequence database
data
query
time
Prior art date
Legal status
Granted
Application number
CN202110851588.1A
Other languages
Chinese (zh)
Other versions
CN113608981B (en)
Inventor
寇伟
赵宏
陈小梦
宁德刚
Current Assignee
Shanghai Envision Innovation Intelligent Technology Co Ltd
Envision Digital International Pte Ltd
Original Assignee
Shanghai Envision Innovation Intelligent Technology Co Ltd
Envision Digital International Pte Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Envision Innovation Intelligent Technology Co Ltd and Envision Digital International Pte Ltd
Priority to CN202110851588.1A
Publication of CN113608981A
Application granted
Publication of CN113608981B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409 Recording or statistical evaluation of computer activity for performance assessment
    • G06F 11/3438 Recording or statistical evaluation of computer activity for monitoring of user actions
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval of structured data, e.g. relational data
    • G06F 16/24 Querying
    • G06F 16/245 Query processing
    • G06F 16/2455 Query execution

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application relates to a time series database testing method and apparatus, a computer device, and a storage medium, and in particular to the field of data processing. The method comprises: receiving a test instruction for a time series database, where the time series database is used to store device data generated over time by Internet of Things (IoT) devices in an IoT scenario; acquiring a target format of the time series database; and testing the time series database according to target data in the target format to acquire performance parameters of the time series database, the performance parameters comprising at least one of a query performance parameter and a write performance parameter. With this scheme, after a test instruction for the time series database is received, the time series database can be tested according to the data format used by the time series database, and the query performance parameter and the write performance parameter of the time series database are obtained; that is, the influence of the data format of the time series database on both query and write is considered at the same time, which improves test accuracy.

Description

Time series database testing method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of data processing, and in particular, to a method and an apparatus for testing a time series database, a computer device, and a storage medium.
Background
With the rapid development of Internet of Things (IoT) technology, the problem of IoT big data has become prominent. The massive volume of IoT data poses great challenges to data quality control, data storage, data compression, data integration, data fusion, and data query, and evaluating the performance of the time series database is the basis of IoT big data storage.
In the related art, when a time series database is tested in an IoT scenario, a test tool is generally used to perform write and read tests on the time series database, and the performance of the time series database is evaluated by the time consumed in the write and read processes.
However, when such a test tool performs write and read tests on the time series database, the performance result obtained by the test differs greatly from the actual service processing capability.
Disclosure of Invention
The embodiments of the application provide a time series database testing method and apparatus, a computer device, and a storage medium, which can test a time series database according to the format of the time series database and thereby improve test accuracy. The technical scheme is as follows:
In one aspect, a time series database testing method is provided. The method includes:
receiving a test instruction for a time series database, where the test instruction is used to indicate a performance test of the time series database, and the time series database is used to store device data generated over time by Internet of Things devices in an Internet of Things scenario;
acquiring a target format of the time series database, where the target format is determined according to a data table structure of the time series database; and
testing the time series database according to target data in the target format to acquire performance parameters of the time series database, where the performance parameters include at least one of a query performance parameter and a write performance parameter, the query performance parameter is used to indicate the performance of the time series database when a query test is performed, and the write performance parameter is used to indicate the performance of the time series database when a write test is performed.
In yet another aspect, a time series database testing apparatus is provided. The apparatus includes:
a test instruction receiving module, configured to receive a test instruction for a time series database, where the test instruction is used to indicate a performance test of the time series database, and the time series database is used to store device data generated over time by Internet of Things devices in an Internet of Things scenario;
a target format acquisition module, configured to acquire a target format of the time series database, where the target format is determined according to a data table structure of the time series database; and
a performance parameter acquisition module, configured to test the time series database according to target data in the target format to acquire performance parameters of the time series database, where the performance parameters include at least one of a query performance parameter and a write performance parameter, the query performance parameter is used to indicate the performance of the time series database when a query test is performed, and the write performance parameter is used to indicate the performance of the time series database when a write test is performed.
In a possible implementation manner, the write performance parameter includes at least one of storage time consumption, error count, requests processed per second, memory occupancy, and processor occupancy;
the query performance parameter includes at least one of error count, response time, requests processed per second, memory occupancy, and processor occupancy.
In a possible implementation manner, the performance parameter obtaining module is configured to perform query testing on the target data in the target format in the time series database to obtain the query performance parameter.
In one possible implementation, the data table structure includes at least one data attribute type; the data attribute type comprises at least one data attribute; the data attribute is used for classifying the target data; the query performance parameter obtaining module comprises:
a query identifier obtaining unit, configured to generate a query identifier according to the at least one data attribute type; the query identifier is used for acquiring data corresponding to the data attribute type;
and the query performance parameter acquisition unit is used for querying the target data in the target format in the time sequence database according to the query identifier to acquire the query performance parameters.
In a possible implementation manner, the query identifier obtaining unit is configured to randomly select one data attribute from the at least one data attribute type;
and generating the query identifier according to the data attribute selected from the at least one data attribute type.
In a possible implementation manner, the performance parameter obtaining module is configured to perform a write test on the target time sequence database according to the target data in the target format, and obtain a write performance parameter of the time sequence database.
In one possible implementation, the apparatus further includes:
and when the time sequence database is in a test state, acquiring the processor occupancy rate and the memory occupancy rate of the time sequence database.
In still another aspect, a computer device is provided, which includes a processor and a memory, where at least one computer instruction is stored in the memory, and the at least one computer instruction is loaded and executed by the processor to implement the time series database testing method.
In yet another aspect, a computer-readable storage medium is provided, in which at least one computer instruction is stored, and the at least one computer instruction is loaded and executed by a processor to implement the above time-series database testing method.
In yet another aspect, a computer program product or computer program is provided that includes computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the computer device to execute the time-series database testing method.
The technical scheme provided by the application can provide the following beneficial effects:
After a test instruction instructing write and read tests on the time series database is received, the format corresponding to the time series database is acquired, and the time series database is tested according to data in the target format to acquire a write performance parameter and a query performance parameter that respectively indicate the write performance and the query performance of the time series database. With this scheme, after the test instruction for the time series database is received, the time series database can be tested according to the data format used by the time series database, and the query performance parameter and the write performance parameter of the time series database are obtained; that is, the influence of the data format of the time series database on both query and write is considered at the same time, which improves test accuracy.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1 is a block diagram illustrating a time series database test system in accordance with an exemplary embodiment;
FIG. 2 is a flow diagram illustrating a method of testing a time series database in accordance with an exemplary embodiment;
FIG. 3 is a flow chart of a time series database testing method provided in accordance with an exemplary embodiment;
FIG. 4 is a diagram illustrating a data file format according to an embodiment of the present application;
FIG. 5 is a flow chart illustrating a method for testing a time-series database according to an embodiment of the present disclosure;
FIG. 6 is a flow chart illustrating a write test method according to an embodiment of the present disclosure;
FIG. 7 is a flow chart illustrating a query testing method according to an embodiment of the present disclosure;
FIG. 8 is a block diagram of a write test method flow diagram in accordance with an exemplary embodiment;
FIG. 9 is a flow diagram illustrating a query testing method in accordance with an exemplary embodiment;
FIG. 10 is a block diagram illustrating the structure of a time series database testing apparatus in accordance with an exemplary embodiment;
FIG. 11 is a block diagram illustrating a computer device in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Before describing the various embodiments shown herein, several concepts related to the present application will be described:
1) internet of Things (The Internet of Things, IOT)
The internet of things is that any object or process needing monitoring, connection and interaction is collected in real time through various devices and technologies such as various information sensors, radio frequency identification technologies, global positioning systems, infrared sensors, laser scanners and the like, various required information such as sound, light, heat, electricity, mechanics, chemistry, biology, positions and the like is collected, ubiquitous connection of objects and objects, and ubiquitous connection of objects and people are realized through various possible network accesses, and intelligent sensing, identification and management of the objects and the processes are realized. The internet of things is an information bearer based on the internet, a traditional telecommunication network and the like, and all common physical objects which can be independently addressed form an interconnected network.
2) Database (DB)
A database is a repository where data is stored. Its storage space is large, and it can store millions or even hundreds of millions of pieces of data. However, a database does not store data arbitrarily; it follows certain rules, otherwise query efficiency would be low. Today's world is an Internet world full of data; in other words, the Internet world is a data world. Data comes from many sources, such as travel records, consumption records, web pages viewed, and messages sent, and in addition to text, images, music, and sounds are also data. A database is a computer software system that stores and manages data according to a data structure. The database is an entity, a "warehouse" that reasonably keeps data, in which a user stores the transaction data to be managed; the two concepts of "data" and "base" are combined into "database". Databases are new methods and techniques for data management that enable more appropriate organization of data, more convenient maintenance of data, tighter control of data, and more efficient use of data.
FIG. 1 is a block diagram illustrating a time series database testing system in accordance with an exemplary embodiment. The system comprises: data storage device 120, and test device 140.
The data storage device 120 may include a data storage module (not shown in the figure), and the data in the time-series database may be stored in the data storage module in advance; the data storage device 120 may be directly connected to a sensor, which may be one sensor or several sensors, and the sensor generates corresponding time series data according to the change of the external environment, and sends the time series data to the data storage device for storage.
The test device 140 may include a data transmission module and a data processing module. The data transmission module is used for receiving data after the query request is sent to the time sequence database; or the data transmission module is also used for sending the data processed by the data processing module to the time sequence database. The data processing module can process the data to be transmitted into data in a format corresponding to the time sequence database so as to transmit the data to the time sequence database of the data storage device.
Optionally, the data storage device 120 may be a server, or may include a plurality of servers, or a distributed computer cluster formed by a plurality of servers, or a virtualization platform, or a cloud computing service center, and the like, which is not limited in this application.
The data storage device 120 is coupled to the test equipment 140 via a communications network. Optionally, the communication network is a wired network or a wireless network.
Optionally, the wireless network or wired network described above uses standard communication techniques and/or protocols. The Network is typically the Internet, but may be any Network including, but not limited to, a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile, wireline or wireless Network, a private Network, or any combination of virtual private networks. In some embodiments, data exchanged over a network is represented using techniques and/or formats including Hypertext Mark-up Language (HTML), Extensible Markup Language (XML), and the like. All or some of the links may also be encrypted using conventional encryption techniques such as Secure Socket Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN), Internet Protocol Security (IPsec). In other embodiments, custom and/or dedicated data communication techniques may also be used in place of, or in addition to, the data communication techniques described above.
Please refer to fig. 2, which is a flowchart illustrating a time series database testing method according to an exemplary embodiment. The method may be performed by a computer device, which may be the testing device 140 in the embodiment shown in fig. 1. As shown in fig. 2, the flow of the time series database testing method may include the following steps:
step 21, receiving a test instruction of a time sequence database; the test instruction is used for indicating the performance test of the time sequence database; the time sequence database is used for storing equipment data generated by the Internet of things equipment along with time in the Internet of things scene.
In one possible implementation manner, the time sequence database may receive time sequence data with a time tag generated by the sensor device in the scene of the internet of things.
The time-series data are mainly data collected or generated by a sensor device and other real-time detection, inspection and analysis devices. Generally, the time sequence data of the sensor device is generated quickly, each piece of data is required to correspond to a unique timestamp, and the data quantity of the test point is large, so that the data stored in the time sequence database is more likely to be the time sequence database with larger data quantity compared with a relational time sequence database.
Step 22, obtaining a target format of the time sequence database; the target format is determined from a data table structure of the time series database.
In one possible implementation, the target format is determined from a data table structure constructed by the time series database.
The same time series database can be built with various different table structures according to its basic attributes. Taking the InfluxDB time series database as an example, a narrow table (tall table) and a wide table can be constructed according to the basic attributes of the time series database, and the service performance of time series databases with different data table structures can also differ.
To illustrate, the basic concepts of the InfluxDB time series database are introduced below.
A time series database has a column named time, which stores a Coordinated Universal Time (UTC) timestamp and records the time information corresponding to each piece of time series data. A time series database also has columns of data called fields; an InfluxDB field consists of a field key and a field value, where the field key is a string that stores metadata and the field value is the specific value of that column. In InfluxDB, a field must exist, but fields are not indexed: if a field is used as a query condition, all field values are scanned to find those that meet the condition. A tag is optional and consists of a tag key and a tag value; the tag value can only be a string. A feature that is queried frequently can be set as a tag, so that query efficiency is improved.
A time series database also has a monitoring indicator (metric) that indicates different types of time series data. From the storage point of view all monitoring indicators are the same, but metrics behave slightly differently in different scenarios. For example, when the acquired time series data is the load of the current system, the number of samples returned by the indicator changes over time; when the acquired time series data is the accumulated usage time of the processor, the sample value returned by the indicator keeps increasing over time.
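To make these concepts concrete, the following is a minimal sketch (in Python, not part of the original disclosure) of a data point composed of the metric, time, tag, and field elements described above; the class and attribute names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, Union

@dataclass
class TimeSeriesPoint:
    """One record in a time series database: metric + timestamp + tags + fields."""
    metric: str                     # monitoring indicator, e.g. "cpu"
    timestamp_ns: int               # UTC timestamp in nanoseconds
    tags: Dict[str, str] = field(default_factory=dict)     # indexed, string-only values
    fields: Dict[str, Union[float, int, str]] = field(default_factory=dict)  # not indexed

# Example point: one CPU sample at a given instant.
point = TimeSeriesPoint(
    metric="cpu",
    timestamp_ns=1434067467000000000,
    tags={"host": "serverA", "region": "us_west"},
    fields={"value": 0.64},
)
```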
Step 23, testing the time series database according to target data in the target format to obtain the performance parameters of the time series database; the performance parameters include at least one of a query performance parameter and a write performance parameter; the query performance parameter is used to indicate the performance of the time series database when a query test is performed; the write performance parameter is used to indicate the performance of the time series database when a write test is performed.
In a possible implementation manner, when the time series database is tested, a query performance test or a write performance test can be selected according to the test instruction. When the test instruction is a query instruction, a query test can be performed on the time series database to obtain the query performance parameter; when the test instruction is a write instruction, a write test can be performed on the time series database to obtain the write performance parameter. Alternatively, multiple threads can be set up so that the query test and the write test are performed on the time series database at the same time, obtaining both the query performance parameter and the write performance parameter of the time series database.
In a possible implementation manner, during the process of testing the time sequence database, performance monitoring may be performed on the data storage device where the time sequence database is located, so as to obtain a memory occupancy rate and a CPU (Central Processing Unit) occupancy rate of the data storage device during the test process.
In a possible implementation manner, the memory occupancy rate and the CPU occupancy rate of the data storage device may be directly obtained through the data storage device of the time sequence database; or the memory occupancy rate and the CPU occupancy rate of the data storage device may be acquired by sending an acquisition request to the data storage device according to the test device.
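As one possible way to implement such monitoring, the sketch below samples the processor and memory occupancy of the host where the time series database runs; it is a hedged example that assumes the third-party psutil package is available and is not taken from the original disclosure.

```python
import time
import psutil  # assumed available; samples host CPU and memory occupancy

def monitor_host(interval_s: float = 1.0, duration_s: float = 10.0):
    """Collect CPU and memory occupancy of the data storage device while a test runs."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        samples.append({
            "cpu_percent": psutil.cpu_percent(interval=interval_s),  # blocks for interval_s
            "mem_percent": psutil.virtual_memory().percent,
        })
    return samples

# During a write or query test, run monitor_host() in a separate thread or process
# and merge its samples into the performance result.
```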
In summary, in the solution shown in the embodiment of the present application, after a test instruction instructing write and read tests on the time series database is received, the format corresponding to the time series database is acquired, and the time series database is tested according to data in the target format to acquire a write performance parameter and a query performance parameter that respectively indicate the write performance and the query performance of the time series database. With this scheme, after the test instruction for the time series database is received, the time series database can be tested according to the data format used by the time series database, and the query performance parameter and the write performance parameter of the time series database are obtained; that is, the influence of the data format of the time series database on both query and write is considered at the same time, which improves test accuracy.
Please refer to fig. 3, which is a flowchart illustrating a method for testing a time series database according to an exemplary embodiment. The method may be performed by a computer device, which may be the testing device 140 in the embodiment shown in fig. 1. As shown in fig. 3, the time series database testing method may include the steps of:
step 301, receiving a test instruction for a time sequence database.
The test instruction is used for indicating performance test on the time sequence database; the time sequence database is used for storing equipment data generated by the Internet of things equipment along with time in the Internet of things scene.
In one possible implementation, the test instruction is sent to the test device by the user through other computer equipment; alternatively, the test instruction is generated by the user through the test device.
In one possible implementation, the test instruction is a query test instruction, and the query test instruction is used for instructing to perform a query performance test on the time-series database.
In one possible implementation, in response to receiving the test instruction, performance monitoring is performed on the device corresponding to the time sequence database.
The device corresponding to the time sequence database may be a device for storing the time sequence database.
In one possible implementation, the performance monitor is used to obtain at least one of a number of processing requests per second, a response time, and a number of errors in the query performance test of the time series database.
In a possible implementation manner, the performance monitoring is further configured to obtain a change of the memory occupancy rate and the processor occupancy rate of the device corresponding to the time sequence database with time.
In a possible implementation manner, because the data volume of the time sequence database is large, the time sequence database may be a distributed time sequence database, which exists on a distributed device cluster formed by multiple devices through a network, and therefore, after receiving a test instruction for the time sequence database, performance monitoring may be performed on all or part of the devices in the distributed device cluster.
Step 302, obtain the target format of the time sequence database.
Wherein the target format is determined according to a data table structure of the time series database.
Before testing the time sequence database, a data format corresponding to the time sequence database, namely a structure of a data table, needs to be acquired, and a query rule of the time sequence database is determined according to the structure of the data table so as to perform query testing on the time sequence database.
In one possible implementation manner, a data file of the time sequence database is read, and a target format of the time sequence database is obtained according to a data format of the data file.
Step 303, testing the time sequence database according to the target data in the target format, and obtaining the performance parameters of the time sequence database.
In a possible implementation manner, the target data in the target format in the time sequence database is subjected to query test, and the query performance parameter is obtained.
In one possible implementation, a query identifier is generated according to the at least one data attribute type; the query identifier is used for acquiring data corresponding to the data attribute type; and querying the target data in the target format in the time sequence database according to the query identifier to obtain the query performance parameter.
In one possible implementation, one data attribute is randomly selected from each of the at least one data attribute type, and the query identifier is generated based on the data attributes selected from the at least one data attribute type.
Wherein, the data table structure comprises at least one data attribute type; the data attribute type comprises at least one data attribute; the data attribute is used to classify the target data.
The data table structure is the storage format of the data file corresponding to the time series database. Please refer to fig. 4, which illustrates a data file format diagram according to an embodiment of the present application. As shown in fig. 4, the data file 401 is composed of a metric 402, a time 403, a tag 404, and a field 405. A data attribute type is at least one of the metric, time, tag, and field; that is, the data table structure is composed of data attribute types, and each data attribute type includes at least one data attribute corresponding to it. For example, when the data attribute type is tag, at least one data attribute such as tag1 in the time series database corresponds to it, and the data in the time series database can be classified and stored according to the data attributes.
A query identifier (i.e., a query statement) may be generated according to the data attributes selected from the data attribute types, so as to perform a query operation on the target data stored in the time series database. Taking the InfluxDB query statement select "age","value" from "user" as an example, "user" is the data table to be queried in the time series database, and "age" and "value" are the columns queried in the table "user", where "age" is a tag.
In a possible implementation manner, one data attribute is randomly selected from each of the at least one data attribute type, and the query identifier is generated according to the randomly selected data attributes.
That is, the attribute value corresponding to each data attribute type is selected randomly within that data attribute type.
After the table structure corresponding to the time series database is obtained from a data file of the time series database, a random selection is made from at least one of the metric, time, tag, and field corresponding to the table structure, and the randomly selected attributes are spliced together to obtain the query condition corresponding to a query test. For example, when the metric and tag corresponding to the table structure are selected and spliced randomly, if the metric can be "A" or "B" and the tag can be "tag1", "tag2", or "tag3", the resulting query conditions may be "metric is A, tag is tag1", "metric is A, tag is tag2", "metric is A, tag is tag3", "metric is B, tag is tag1", "metric is B, tag is tag2", and "metric is B, tag is tag3". The more data attribute types are selected and the more data attributes each type contains, the more query conditions can be generated. A data table may also carry several tags at the same time; for example, in a wide table, the tags corresponding to each piece of data can all be stored in a single table with different contents, and because all tags appear in a single table, the query efficiency is high.
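As an illustration of this random splicing, the following sketch (not part of the original disclosure; the attribute values and InfluxQL-style phrasing are assumptions) builds query statements from randomly chosen metric and tag values.

```python
import random

# Assumed target format extracted from the data file: possible values per attribute type.
METRICS = ["A", "B"]
TAGS = {"tag_key": ["tag1", "tag2", "tag3"]}

def random_query() -> str:
    """Splice a query statement from one randomly selected attribute per attribute type."""
    metric = random.choice(METRICS)          # the metric becomes the measurement queried FROM
    tag_key = random.choice(list(TAGS))
    tag_value = random.choice(TAGS[tag_key])
    # InfluxQL-style statement; string tag values are single-quoted.
    return f'SELECT "value" FROM "{metric}" WHERE "{tag_key}" = \'{tag_value}\''

print(random_query())  # e.g. SELECT "value" FROM "A" WHERE "tag_key" = 'tag2'
```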
In addition, consider the path realized on the device when the time series database is queried according to a query identifier in an actual test. After the time series database receives a query request, the data corresponding to the query request is first transferred from the time series database into the cache of the data storage device, and the data storage device then transfers the data in the cache to the test device, completing the response to the query request. Therefore, when a common test tool tests the time series database, the query test is performed according to preset query rules and the same data may be requested repeatedly; the data corresponding to such a query request may still exist in the cache of the data storage device, in which case the data storage device preferentially returns the data held in the cache to the test device instead of querying the data from the time series database, and the query result may deviate from the real performance of the time series database.
By randomly splicing data attributes, repetition of the obtained query identifiers (i.e., query statements) can be avoided as far as possible, which reduces the influence of caching on the real performance test of a time series database storing massive data.
In a possible implementation manner, according to the query identifier, the target data in the target format in the time series database is queried to obtain the query performance parameter.
That is, according to the query statement corresponding to the query identifier, the data corresponding to the query statement is acquired from the time series database, and the performance of the time series database during the query process is fed back.
In one possible implementation, the query performance parameter includes at least one of an error count, a response time, the number of requests processed per second, memory occupancy, and processor occupancy.
The error count is the number of query errors that occur during the query test; the response time is the time the system takes to respond to a request, which in the query test is the time required for the system to return the corresponding data; RPS (Requests Per Second) is the number of received requests the system can process per second; the memory occupancy and processor occupancy are used to indicate the load the time series database places on the data storage device where it is stored.
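The sketch below shows one plausible way (an assumption, not the patented implementation) to derive these query performance parameters from per-request timing samples collected during the test.

```python
from statistics import mean, median

def summarize(latencies_s, error_count, wall_clock_s):
    """Turn raw per-request latencies into the query performance parameters described above."""
    ok = sorted(latencies_s)
    pct = lambda p: ok[min(len(ok) - 1, int(p * len(ok)))] if ok else None
    return {
        "requests_per_second": (len(ok) + error_count) / wall_clock_s,
        "error_count": error_count,
        "response_time_avg_s": mean(ok) if ok else None,
        "response_time_median_s": median(ok) if ok else None,
        "response_time_p90_s": pct(0.90),
        "response_time_p99_s": pct(0.99),
    }

# Example: 3 successful queries and 1 error observed over a 2-second window.
print(summarize([0.010, 0.012, 0.030], error_count=1, wall_clock_s=2.0))
```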
In a possible implementation manner, according to the target data in the target format, a write test is performed on the target time sequence database, and write performance parameters of the time sequence database are obtained.
In one possible implementation, the target data in the target format may be a data file generated by existing data according to the target format of the time-series database.
Namely, the test equipment firstly obtains the target format of the data file of the time sequence database, and converts the existing data table into the target data of the target format according to the target format.
In another possible implementation manner, the target data in the target format may be a simulation data file generated according to the target format of the time-series database.
The generation rule of the simulation data file may be random generation, that is, the corresponding data attribute is randomly generated according to the data attribute type corresponding to the target format, so as to form a target data file conforming to the target format.
For example, when the time series database is OpenTSDB, the write statement may be "put cpu 1434067467000000000 0.64 host=serverA region=us_west", that is, the written metric is cpu, the timestamp is 1434067467000000000, the value is 0.64, the tag corresponding to host is serverA, and the tag corresponding to region is us_west.
When the time series database is InfluxDB, the write statement may be "INSERT cpu,host=serverA,region=us_west value=0.64 1434067467000000000", which has the same meaning as the OpenTSDB statement: the written metric is cpu, the timestamp is 1434067467000000000, the value is 0.64, the tag corresponding to host is serverA, and the tag corresponding to region is us_west.
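A minimal sketch of how target data in the target format could be serialized into such write statements follows; it is an illustrative assumption rather than the patented write rule, and it reuses the point structure (metric, timestamp, tags, fields) sketched earlier.

```python
def to_influx_insert(metric, timestamp_ns, tags, fields):
    """Render one point as an InfluxDB INSERT statement (line-protocol style body)."""
    tag_part = ",".join(f"{k}={v}" for k, v in tags.items())
    field_part = ",".join(f"{k}={v}" for k, v in fields.items())
    return f"INSERT {metric},{tag_part} {field_part} {timestamp_ns}"

def to_opentsdb_put(metric, timestamp_ns, tags, fields):
    """Render one point as an OpenTSDB put command (one field value per command)."""
    tag_part = " ".join(f"{k}={v}" for k, v in tags.items())
    value = next(iter(fields.values()))
    return f"put {metric} {timestamp_ns} {value} {tag_part}"

tags = {"host": "serverA", "region": "us_west"}
fields = {"value": 0.64}
print(to_influx_insert("cpu", 1434067467000000000, tags, fields))
print(to_opentsdb_put("cpu", 1434067467000000000, tags, fields))
```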
In a possible implementation manner, the write performance parameter includes at least one of storage time, number of errors, request processing times per second, memory occupancy, and processor occupancy; wherein the storage elapsed time is used to indicate the time elapsed to write the target data to the time series database.
In one possible implementation manner, in response to receiving a test instruction, performance monitoring is performed on the data storage device where the time sequence database is located, and query test and write test are performed on the time sequence database at the same time to obtain query performance parameters and write performance parameters of the time sequence database.
That is, when the time sequence database is tested, the query performance parameter and the write performance parameter of the time sequence database during the query test and the write test, and the CPU occupancy rate and the memory occupancy rate corresponding to the data storage device where the time sequence database is located can be obtained at the same time. In the actual application of the time sequence database, in the process of inquiring the time sequence database, the equipment such as a sensor and the like may still perform writing operation on the time sequence database, at the moment, the inquiry test and the writing test are simultaneously performed on the time sequence database, and the performance parameters of the time sequence database at the moment are obtained, so that the condition of the time sequence database in the actual application can be simulated as much as possible.
In one possible implementation, when the time sequence database is in a test state, the processor occupancy rate and the memory occupancy rate of the time sequence database are obtained.
In one possible implementation, the performance score of the time series database is obtained according to the query performance parameter and the write performance parameter.
After the query test and the write test are completed, the performance of the time series database can be evaluated according to the acquired query performance parameter and write performance parameter. For example, when the time series database mainly needs good query capability, the query performance parameter can be given a higher weight, and the weighted query performance parameter and write performance parameter are considered together to obtain the final performance score of the time series database.
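As a hedged illustration of such weighting (the normalization baselines and weights below are assumptions, not values given in the disclosure):

```python
def performance_score(query_rps, write_points_per_s, query_weight=0.7, write_weight=0.3,
                      query_baseline=1000.0, write_baseline=100000.0):
    """Combine normalized query and write throughput into a single weighted score."""
    query_score = query_rps / query_baseline          # normalize against an expected baseline
    write_score = write_points_per_s / write_baseline
    return query_weight * query_score + write_weight * write_score

# A query-heavy deployment weights query performance more strongly.
print(performance_score(query_rps=1200, write_points_per_s=80000))
```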
Please refer to fig. 5, which illustrates a flowchart of a time series database testing method according to an embodiment of the present application. As shown in FIG. 5, when a test instruction 500 is received, a write test 501 and a query test 502 are initiated. For the write test 501, target data may be generated 503 from existing data or from simulated data, a data file 504 corresponding to the time series database is generated, and the data file is transmitted to the time series database through an API (Application Programming Interface) or HTTP (Hypertext Transfer Protocol) interface 505 to obtain a write performance parameter 506, where the write performance parameter may be the storage time consumption.
For the query test 502, the fields (e.g., tag and field) corresponding to the target format of the data file 504 are spliced, and a request, i.e., the query request 507, is sent to the time series database according to the query statement obtained after splicing; the data file corresponding to the query statement is obtained, and the corresponding query performance parameters 508 are acquired, where the query performance parameters may include RPS, response time, and error count.
During the test, the device corresponding to the time sequence database may be monitored for the performance in the whole process, and the CPU occupancy rate and the memory occupancy rate 509 of the device corresponding to the time sequence database during the test process are obtained, so as to determine the load requirement capability of the time sequence database on the device during the operation process. Based on the write performance parameters 506, query performance parameters 508, and CPU and memory occupancy 509, the performance results 510 of the timing database may be obtained synthetically.
Table 1 is a narrow table of sensor data according to an embodiment of the present application.
TABLE 1
Metric Time Tag Field
Temperature 2020 Tag1 Value=1
Temperature 2020 Tag2 Value=2
Wind speed 2020 Tag1 Value=3
In Table 1, Temperature and Wind speed correspond to the metric, 2020 corresponds to the time, Tag1 and Tag2 are tag labels, and Value is a field in the time series database table structure.
Table 2 is a wide table of sensor data according to an embodiment of the present application.
TABLE 2
Metric Time Tags
Temperature 2020 Tag1=1, Tag2=2, Tag3=3…
Wind speed 2020 Tag1=1, Tag2=2, Tag3=3…
As shown in Table 2, the tag labels corresponding to each piece of data can all be shown in the table, and different contents are stored in one table, so the data can be queried accurately according to the corresponding tags and the query efficiency is high. However, when the width of the wide table reaches a certain number, each row becomes very large, writing is greatly affected, and the row is traversed again every time data is written, which affects write performance.
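To make the two layouts concrete, here is a small illustrative sketch (the record structures are assumptions made for exposition) contrasting how the same sensor readings are laid out in a narrow table and in a wide table:

```python
# Narrow (tall) table: one tag/field pair per row, many rows per timestamp.
narrow_rows = [
    {"metric": "Temperature", "time": 2020, "tag": "Tag1", "value": 1},
    {"metric": "Temperature", "time": 2020, "tag": "Tag2", "value": 2},
    {"metric": "Wind speed",  "time": 2020, "tag": "Tag1", "value": 3},
]

# Wide table: all tags of one metric/timestamp in a single row; queries hit one row,
# but every write must rewrite the whole wide row.
wide_rows = [
    {"metric": "Temperature", "time": 2020, "tags": {"Tag1": 1, "Tag2": 2, "Tag3": 3}},
    {"metric": "Wind speed",  "time": 2020, "tags": {"Tag1": 1, "Tag2": 2, "Tag3": 3}},
]
```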
Therefore, different table structures have different influences on writing and reading performances, and the performance of the time sequence database can be more accurately evaluated by performing writing and reading tests on the time sequence database.
Please refer to fig. 6, which illustrates a flowchart of a write test method according to an embodiment of the present application. As shown in fig. 6, in S601, a data generation rule is configured; for example, a data file is randomly generated according to the target format of the time series database. In S602, the data file is generated according to the data generation rule. In S603, the write rule of the data file, i.e., the write statement, is configured. In S604, the write task is executed according to the write rule, and the generated data file is written into the time series database. In S605, the write information (i.e., the write performance parameters) corresponding to the write test is output according to the write process. For the same type of time series database, different table structures lead to different write rules, and the write efficiency of the corresponding write rules also differs. As shown in Table 3, the test results of write tests performed on different table structures differ: for the same 6.4 G of written data, the write speed and time consumption of the three designs differ and are related to the number of metrics, tags, and fields in each design. A sketch of this write-test flow follows Table 3.
TABLE 3
- Design 1 Design 2 Design 3
Storage space (G) 6.4 6.4 6.4
Write point/s 8w 9w 8.5w
Time consumption s 3h 2.8h 2.9h
Metric number 100 100 10
Number of Tag 1000 0 1000
Field number 1 1000 100
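Below is a minimal end-to-end sketch of the S601–S605 write-test flow described above (illustrative assumptions throughout: the data generation rule, the HTTP endpoint, and the use of the requests package are not specified by the disclosure).

```python
import random, time
import requests  # assumed available for the HTTP write interface

def generate_data_file(n_points, metrics=("A", "B"), tags=("tag1", "tag2", "tag3")):
    """S601/S602: randomly generate write statements that follow the target format."""
    lines = []
    for i in range(n_points):
        lines.append(f"INSERT {random.choice(metrics)},tag_key={random.choice(tags)} "
                     f"value={random.random():.3f} {time.time_ns() + i}")
    return lines

def run_write_test(write_url, n_points=10000):
    """S603–S605: execute the write task and output write performance parameters."""
    lines = generate_data_file(n_points)
    errors = 0
    start = time.time()
    for line in lines:                              # a real tool would batch these writes
        if requests.post(write_url, data=line).status_code >= 400:
            errors += 1
    elapsed = time.time() - start
    return {"storage_time_s": elapsed,
            "write_points_per_s": n_points / elapsed,
            "error_count": errors}
```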
Please refer to fig. 7, which illustrates a flowchart of a query testing method according to an embodiment of the present application. S701, first, a query rule of the data is configured, for example, a query statement is randomly generated according to a target format of the time series database. S702, reading the data file, and acquiring the target format of the time sequence database according to the data file. And S703, randomly splicing to generate a query statement according to the target format. S704, executing the query task according to the query statement, and acquiring the queried data result. S705, outputting query information (i.e. query performance parameters) corresponding to the query test according to the query process. The corresponding query efficiency is different for different table structures of the same type of time sequence database.
TABLE 4
- Design 1 Design 2 Design 3
Storage space (G) 6.4 6.4 6.4
Write point/s 8w 9w 8.5w
Time consumption s 3h 2.8h 2.9h
Metric number 100 100 10
Number of Tag 1000 0 1000
Field number 1 1000 100
As shown in Table 4, the query performance differs for different table structures. Different table structures yield different numbers of requests processed per second and different median, average, 90th-percentile, and 99th-percentile response times; that is, the query performance differs significantly.
When performance tests are run on the time series database in large batches, tools such as JMeter and logging tools can be integrated and used for the write test and the query test respectively; system resources are monitored during the test, and the CPU and memory occupancy rates are obtained and used as part of the performance indicators. Different schemas (table structures) can also be adjusted flexibly for the performance test.
In summary, in the solution shown in the embodiment of the present application, after a test instruction instructing write and read tests on the time series database is received, the format corresponding to the time series database is acquired, and the time series database is tested according to data in the target format to acquire a write performance parameter and a query performance parameter that respectively indicate the write performance and the query performance of the time series database. With this scheme, after the test instruction for the time series database is received, the time series database can be tested according to the data format used by the time series database, and the query performance parameter and the write performance parameter of the time series database are obtained; that is, the influence of the data format of the time series database on both query and write is considered at the same time, which improves test accuracy.
Please refer to fig. 8, which is a block diagram illustrating a write test method according to an exemplary embodiment. The writing method is performed by the test apparatus 800 and the data storage apparatus 810 together. As shown in fig. 8, the test device 800 receives the write test command 801, and triggers the test device 800 to start a write test process, at which time the test device 800 starts listening to the data storage device 810 to obtain real-time status information of the data storage device.
At this time, the test device generates target data 802 in the target format corresponding to the time series database, either from the existing data in the test device or from simulated data generated according to a preset rule, and performs the write operation on the target data in the target format according to the write rule of the time series database; throughout this process the test device keeps monitoring the data storage device.
After the test device has written all the target data 802 in the target format into the time series database, it stops monitoring the time series database and obtains the write performance parameters 803 monitored on the device corresponding to the time series database during the write test, where the write performance parameters 803 include write time, write speed, memory occupancy rate of the device, CPU occupancy rate, and the like, and the write performance of the time series database is determined according to the write performance parameters 803.
Reference is now made to FIG. 9, which is a block diagram illustrating a query testing methodology, according to an exemplary embodiment. The query testing method is performed by the testing device 900 in conjunction with the data storage device 910, where the testing device 900 and the data storage device 910 may be servers. As shown in fig. 9, when the test device 900 receives the query test instruction 901, the test device 900 is triggered to start a query test process, and at this time, the test device 900 starts to monitor the data storage device 910 to obtain real-time status information of the data storage device.
When the query test is started, the test device may first query the data files in the time series database and obtain the target format 902 corresponding to the time series database, randomly splice the fields in the target format to randomly generate a query request 903, and perform a data query operation on the time series database according to the query request 903 to obtain the corresponding query data 904; during this time, the test device continuously monitors the data storage device.
When the test device reaches the test stop condition, it stops the query test on the time series database, stops monitoring the device corresponding to the time series database, and obtains the query performance parameters 905 of the query test process. The query performance parameters include response time, requests processed per second, error count, memory occupancy rate of the device, CPU occupancy rate, and the like, and the query performance of the time series database is determined according to the query performance parameters 905. Because the query requests are obtained by randomly splicing the fields in the target format, repeated queries for the same data are unlikely in a large time series database, which reduces inaccurate test results caused by device caching during the query process.
Fig. 10 is a block diagram illustrating the structure of a time series database testing apparatus according to an exemplary embodiment. The time series database testing apparatus can implement all or part of the steps in the method provided by the embodiment shown in fig. 2 or fig. 3. The time series database testing apparatus may include:
a test instruction receiving module 1001, configured to receive a test instruction for a time series database, where the test instruction is used to indicate a performance test of the time series database, and the time series database is used to store device data generated over time by Internet of Things devices in an Internet of Things scenario;
a target format obtaining module 1002, configured to acquire a target format of the time series database, where the target format is determined according to a data table structure of the time series database; and
a performance parameter obtaining module 1003, configured to test the time series database according to target data in the target format to acquire performance parameters of the time series database, where the performance parameters include at least one of a query performance parameter and a write performance parameter, the query performance parameter is used to indicate the performance of the time series database when a query test is performed, and the write performance parameter is used to indicate the performance of the time series database when a write test is performed.
In a possible implementation manner, the write performance parameter includes at least one of storage time consumption, error count, requests processed per second, memory occupancy, and processor occupancy;
the query performance parameter includes at least one of error count, response time, requests processed per second, memory occupancy, and processor occupancy.
In a possible implementation manner, the performance parameter obtaining module 1003 is configured to perform query test on the target data in the target format in the time series database to obtain the query performance parameter.
In one possible implementation, the data table structure includes at least one data attribute type; the data attribute type comprises at least one data attribute; the data attribute is used for classifying the target data; the query performance parameter obtaining module comprises:
a query identifier obtaining unit, configured to generate a query identifier according to the at least one data attribute type; the query identifier is used for acquiring data corresponding to the data attribute type;
and the query performance parameter acquisition unit is used for querying the target data in the target format in the time sequence database according to the query identifier to acquire the query performance parameters.
In a possible implementation manner, the query identifier obtaining unit is configured to randomly select one data attribute from the at least one data attribute type;
and generating the query identifier according to the data attribute selected from the at least one data attribute type.
In a possible implementation manner, the performance parameter obtaining module 1003 is configured to perform a write test on the target time sequence database according to the target data in the target format, and obtain a write performance parameter of the time sequence database.
In one possible implementation, the apparatus further includes:
and when the time sequence database is in a test state, acquiring the processor occupancy rate and the memory occupancy rate of the time sequence database.
In summary, in the solution shown in the embodiment of the present application, after a test instruction instructing write and read tests on the time series database is received, the format corresponding to the time series database is acquired, and the time series database is tested according to data in the target format to acquire a write performance parameter and a query performance parameter that respectively indicate the write performance and the query performance of the time series database. With this scheme, after the test instruction for the time series database is received, the time series database can be tested according to the data format used by the time series database, and the query performance parameter and the write performance parameter of the time series database are obtained; that is, the influence of the data format of the time series database on both query and write is considered at the same time, which improves test accuracy.
FIG. 11 is a block diagram illustrating a computer device in accordance with an exemplary embodiment. The computer device may be implemented as the time sequence database testing device in the various method embodiments described above. The computer device 1100 includes a Central Processing Unit (CPU) 1101, a system memory 1104 including a Random Access Memory (RAM) 1102 and a Read-Only Memory (ROM) 1103, and a system bus 1105 connecting the system memory 1104 and the Central Processing Unit 1101. The computer device 1100 also includes a basic input/output system 1106, which facilitates the transfer of information between devices within the computer, and a mass storage device 1107 for storing an operating system 1113, application programs 1114, and other program modules 1115.
The mass storage device 1107 is connected to the central processing unit 1101 through a mass storage controller (not shown) that is connected to the system bus 1105. The mass storage device 1107 and its associated computer-readable media provide non-volatile storage for the computer device 1100. That is, the mass storage device 1107 may include a computer-readable medium (not shown) such as a hard disk or Compact disk Read-Only Memory (CD-ROM) drive.
Without loss of generality, the computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, flash memory or other solid state storage technology, CD-ROM, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that the computer storage media is not limited to the foregoing. The system memory 1104 and mass storage device 1107 described above may be collectively referred to as memory.
The computer device 1100 may connect to the internet or other network devices through the network interface unit 1111 that is connected to the system bus 1105.
The memory further includes one or more programs, the one or more programs are stored in the memory, and the central processing unit 1101 implements all or part of the steps of the method shown in fig. 3, 4 or 9 by executing the one or more programs.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, for example, a memory including a computer program (instructions). The computer program (instructions) is executable by a processor of a computer device to perform the methods shown in the various embodiments of the present application, where the methods may be performed by a server or a user terminal. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product or computer program is also provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the methods shown in the various embodiments described above.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. A method for testing a time series database, the method comprising:
receiving a test instruction for a time sequence database, wherein the test instruction is used for indicating a performance test of the time sequence database; and the time sequence database is used for storing device data generated over time by Internet of Things devices in an Internet of Things scenario;
acquiring a target format of the time sequence database; the target format is determined according to a data table structure of the time sequence database;
testing the time sequence database according to the target data in the target format to acquire the performance parameters of the time sequence database; the performance parameters comprise at least one of a query performance parameter and a write performance parameter; the query performance parameter is used for indicating the performance of the time sequence database when a query test is performed; and the write performance parameter is used for indicating the performance of the time sequence database when a write test is performed.
2. The method of claim 1, wherein the write performance parameters include at least one of storage elapsed time, number of errors, number of requests processed per second, memory usage, and processor usage;
the query performance parameters include at least one of number of errors, response time, number of requests processed per second, memory occupancy, and processor occupancy.
3. The method of claim 1, wherein the testing the time sequence database according to the target data in the target format to obtain the performance parameters of the time sequence database comprises:
performing a query test on the target data in the target format in the time sequence database to obtain the query performance parameters.
4. The method of claim 3, wherein the data table structure contains at least one data attribute type; the data attribute type comprises at least one data attribute; the data attribute is used for classifying the target data;
the query test of the target data in the target format in the time sequence database to obtain the query performance parameters includes:
generating a query identifier according to the at least one data attribute type; the query identifier is used for acquiring data corresponding to the data attribute type;
and querying the target data in the target format in the time sequence database according to the query identifier to obtain the query performance parameters.
5. The method of claim 4, wherein generating a query identifier based on the at least one data attribute type comprises:
randomly selecting one data attribute from the at least one data attribute type;
and generating the query identifier according to the data attribute selected from the at least one data attribute type.
6. The method of claim 1, wherein the testing the time sequence database according to the target data in the target format to obtain the performance parameters of the time sequence database comprises:
performing, according to the target data in the target format, a write test on the target time sequence database to obtain the write performance parameters of the time sequence database.
7. The method of claim 2, further comprising:
and when the time sequence database is in a test state, acquiring the processor occupancy rate and the memory occupancy rate of the time sequence database.
8. A time series database testing apparatus, the apparatus comprising:
the test instruction receiving module is used for receiving a test instruction for a time sequence database, and the test instruction is used for indicating a performance test of the time sequence database; the time sequence database is used for storing device data generated over time by Internet of Things devices in an Internet of Things scenario;
the target format acquisition module is used for acquiring a target format of the time sequence database; the target format is determined according to a data table structure of the time sequence database;
the performance parameter acquisition module is used for testing the time sequence database according to the target data in the target format to acquire the performance parameters of the time sequence database; the performance parameters comprise at least one of a query performance parameter and a write performance parameter; the query performance parameter is used for indicating the performance of the time sequence database when a query test is performed; and the write performance parameter is used for indicating the performance of the time sequence database when a write test is performed.
9. A computer device comprising a processor and a memory, the memory having stored therein at least one computer instruction that is loaded and executed by the processor to implement the time sequence database testing method according to any one of claims 1 to 7.
10. A computer-readable storage medium having stored therein at least one computer instruction, which is loaded and executed by a processor to implement the time sequence database testing method according to any one of claims 1 to 7.
CN202110851588.1A 2021-07-27 2021-07-27 Time sequence database test method and device, computer equipment and storage medium Active CN113608981B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110851588.1A CN113608981B (en) 2021-07-27 2021-07-27 Time sequence database test method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113608981A true CN113608981A (en) 2021-11-05
CN113608981B CN113608981B (en) 2024-01-05

Family

ID=78305628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110851588.1A Active CN113608981B (en) 2021-07-27 2021-07-27 Time sequence database test method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113608981B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019178979A1 (en) * 2018-03-21 2019-09-26 平安科技(深圳)有限公司 Method for querying report data, apparatus, storage medium and server
CN108845914A (en) * 2018-06-29 2018-11-20 平安科技(深圳)有限公司 Generation method, electronic device and the readable storage medium storing program for executing of performance test report
CN110147319A (en) * 2019-04-19 2019-08-20 平安普惠企业管理有限公司 Data library test method, device and computer equipment
CN110275822A (en) * 2019-04-26 2019-09-24 武汉众邦银行股份有限公司 Performance test methods, device, equipment and the storage medium of application programming interfaces
CN111666565A (en) * 2020-06-22 2020-09-15 深圳壹账通智能科技有限公司 Sandbox simulation test method and device, computer equipment and storage medium
CN112486789A (en) * 2020-11-30 2021-03-12 建信金融科技有限责任公司 Log analysis system, method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Ye (中国评测): "Time Series Database Performance Testing Method", pages 1 - 10 *
Xu Huayan et al.: "Design of an Industrial Time Series Database Engine Based on influxDB", vol. 36, no. 9, pages 33 - 40 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114116821A (en) * 2021-11-26 2022-03-01 山东浪潮科学研究院有限公司 Energy monitoring data storage method, equipment and medium based on time sequence database
CN114116821B (en) * 2021-11-26 2024-05-10 山东浪潮科学研究院有限公司 Energy monitoring data storage method, equipment and medium based on time sequence database

Also Published As

Publication number Publication date
CN113608981B (en) 2024-01-05

Similar Documents

Publication Publication Date Title
US11947556B1 (en) Computerized monitoring of a metric through execution of a search query, determining a root cause of the behavior, and providing a notification thereof
JP5078674B2 (en) Analysis system, information processing apparatus, activity analysis method, and program
US8145621B2 (en) Graphical representation of query optimizer search space in a database management system
US9996558B2 (en) Method and system for accessing a set of data tables in a source database
CN102687124B (en) The equipment of analysis and consult optimizer performance and method
US11403303B2 (en) Method and device for generating ranking model
KR20150076225A (en) Profiling data with location information
CN111190792B (en) Log storage method and device, electronic equipment and readable storage medium
US9286304B2 (en) Management of file storage locations
CN108958959A (en) The method and apparatus for detecting hive tables of data
CN114356921A (en) Data processing method, device, server and storage medium
CN110046155B (en) Method, device and equipment for updating feature database and determining data features
CN113918622B (en) Information tracing method and system based on block chain
CN110990447A (en) Data probing method, device, equipment and storage medium
CN113722370A (en) Data management method, device, equipment and medium based on index analysis
CN114564930A (en) Document information integration method, apparatus, device, medium, and program product
US10210234B2 (en) Linking discrete dimensions to enhance dimensional analysis
CN113608981B (en) Time sequence database test method and device, computer equipment and storage medium
US8121995B2 (en) Service search system, method, and program
US7610293B2 (en) Correlation of resource usage in a database tier to software instructions executing in other tiers of a multi tier application
CN113779261A (en) Knowledge graph quality evaluation method and device, computer equipment and storage medium
JP2002123516A (en) System and method for evaluating web site and recording medium
US20180268036A1 (en) Communication information generating apparatus, communication information generating method, recording medium, and communication management system
CN115169578A (en) AI model production method and system based on meta-space data markers
CN103098046A (en) Formatting system monitoring information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant