CN113608981B - Time sequence database test method and device, computer equipment and storage medium - Google Patents

Time sequence database test method and device, computer equipment and storage medium

Info

Publication number
CN113608981B
CN113608981B
Authority
CN
China
Prior art keywords
time sequence
sequence database
data
query
test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110851588.1A
Other languages
Chinese (zh)
Other versions
CN113608981A (en)
Inventor
寇伟
赵宏
陈小梦
宁德刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Envision Innovation Intelligent Technology Co Ltd
Envision Digital International Pte Ltd
Original Assignee
Shanghai Envision Innovation Intelligent Technology Co Ltd
Envision Digital International Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Envision Innovation Intelligent Technology Co Ltd, Envision Digital International Pte Ltd filed Critical Shanghai Envision Innovation Intelligent Technology Co Ltd
Priority to CN202110851588.1A priority Critical patent/CN113608981B/en
Publication of CN113608981A publication Critical patent/CN113608981A/en
Application granted granted Critical
Publication of CN113608981B publication Critical patent/CN113608981B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3409Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2455Query execution

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application relates to a time sequence database test method and device, computer equipment and a storage medium, and in particular to the field of data processing. The method comprises the following steps: receiving a test instruction for a time sequence database, wherein the time sequence database is used for storing device data generated over time by internet of things devices in an internet of things scenario; acquiring a target format of the time sequence database; and testing the time sequence database according to target data in the target format to obtain performance parameters of the time sequence database, wherein the performance parameters comprise at least one of query performance parameters and write performance parameters. According to this scheme, after the test instruction for the time sequence database is received, the time sequence database can be tested according to the data format in the time sequence database, so that the query performance parameters and write performance parameters of the time sequence database are obtained; that is, the influence of the data format of the time sequence database on query and write is taken into account, which improves the test accuracy.

Description

Time sequence database test method and device, computer equipment and storage medium
Technical Field
The present invention relates to the field of data processing, and in particular, to a method and apparatus for testing a time-series database, a computer device, and a storage medium.
Background
With the rapid development of internet of things technology, the problem of internet of things big data has become prominent. The massive scale of internet of things data brings great challenges to data quality control, data storage, data compression, data integration, data fusion and data query, and evaluating the performance of a time sequence database is the basis of internet of things big data storage.
In the related art, when testing a time sequence database in an internet of things scenario, a test tool is generally used to perform write and read tests on the time sequence database, and the performance of the time sequence database is assessed by the time consumed in its write and read processes.
However, when the test tool performs write and read tests on the time sequence database in this way, the measured performance result differs considerably from the actual service processing capability.
Disclosure of Invention
The embodiment of the application provides a time sequence database testing method, a time sequence database testing device, computer equipment and a storage medium, which can test a time sequence database according to the format of the time sequence database, and improve the testing accuracy, and the technical scheme is as follows:
in one aspect, a method for testing a time sequence database is provided, the method comprising:
Receiving a test instruction for a time sequence database, wherein the test instruction is used for indicating performance test on the time sequence database; the time sequence database is used for storing equipment data generated by the Internet of things equipment along with time in the Internet of things scene;
acquiring a target format of the time sequence database; the target format is determined according to a data table structure of the time sequence database;
testing the time sequence database according to the target data of the target format to obtain the performance parameters of the time sequence database; the performance parameters include at least one of query performance parameters and write performance parameters; the query performance parameter is used for indicating the performance of the time sequence database when the query test is performed; the write performance parameter is used for indicating the performance of the time sequence database when the write test is performed.
In yet another aspect, there is provided a time series database testing apparatus, the apparatus comprising:
the test instruction receiving module is used for receiving a test instruction of the time sequence database, wherein the test instruction is used for indicating performance test of the time sequence database; the time sequence database is used for storing equipment data generated by the Internet of things equipment along with time in the Internet of things scene;
The target format acquisition module is used for acquiring a target format of the time sequence database; the target format is determined according to a data table structure of the time sequence database;
the performance parameter acquisition module is used for testing the time sequence database according to the target data in the target format to acquire the performance parameters of the time sequence database; the performance parameters include at least one of query performance parameters and write performance parameters; the query performance parameter is used for indicating the performance of the time sequence database when the query test is performed; the write performance parameter is used for indicating the performance of the time sequence database when the write test is performed.
In one possible implementation, the write performance parameter includes at least one of a storage time consumption, a number of errors, a number of requests per second processing, a memory occupancy, and a processor occupancy;
the query performance parameters include at least one of a number of errors, a response time, a number of requests processed per second, a memory occupancy, and a processor occupancy.
In one possible implementation manner, the performance parameter obtaining module is configured to perform a query test on the target data in the target format in the time sequence database, to obtain the query performance parameter.
In one possible implementation, the data table structure includes at least one data attribute type; the data attribute type comprises at least one data attribute; the data attribute is used for classifying the target data; the query performance parameter acquisition module comprises:
the query identifier acquisition unit is used for generating a query identifier according to the at least one data attribute type; the query identifier is used for acquiring data corresponding to the data attribute type;
the query performance parameter obtaining unit is used for inquiring the target data in the target format in the time sequence database according to the query identifier to obtain the query performance parameter.
In a possible implementation manner, the query identifier obtaining unit is configured to randomly select one data attribute from the at least one data attribute type;
the query identity is generated from data attributes selected from the at least one data attribute type.
In one possible implementation manner, the performance parameter obtaining module is configured to perform a write test on the target time sequence database according to the target data in the target format, so as to obtain a write performance parameter of the time sequence database.
In one possible implementation, the apparatus further includes:
a module for acquiring, when the time sequence database is in a test state, the processor occupancy rate and the memory occupancy rate of the time sequence database.
In yet another aspect, a computer device is provided that includes a processor and a memory having stored therein at least one computer instruction that is loaded and executed by the processor to implement the above-described time series database testing method.
In yet another aspect, a computer readable storage medium having stored therein at least one computer instruction loaded and executed by a processor to implement the above-described time series database testing method is provided.
In yet another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions so that the computer device performs the above-described time series database test method.
The technical scheme that this application provided can include following beneficial effect:
after receiving a test instruction indicating write and read tests on the time sequence database, a format corresponding to the time sequence database is acquired, and the time sequence database is tested according to data in the target format to obtain write performance parameters and query performance parameters indicating, respectively, the write performance and query performance of the time sequence database. Through this scheme, after the test instruction for the time sequence database is received, the time sequence database can be tested according to the data format in the time sequence database to obtain its query performance parameters and write performance parameters; that is, the influence of the data format of the time sequence database on query and write is taken into account, which improves the test accuracy.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of a timing database test system, according to an exemplary embodiment;
FIG. 2 is a flow diagram illustrating a method of testing a time series database according to an exemplary embodiment;
FIG. 3 is a method flow diagram of a method for time series database testing, provided in accordance with an exemplary embodiment;
FIG. 4 is a schematic diagram of a data file format according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of a method for testing a time-series database according to an embodiment of the present application;
FIG. 6 is a flow chart of a write test method according to an embodiment of the present application;
FIG. 7 is a flow chart of a query testing method according to an embodiment of the present application;
FIG. 8 is a flow chart diagram illustrating a write test method according to an example embodiment;
FIG. 9 is a flowchart of a query testing method, according to an example embodiment;
FIG. 10 is a block diagram illustrating a timing database testing apparatus according to an exemplary embodiment;
fig. 11 is a schematic diagram of a computer device according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the appended claims.
Before describing the various embodiments shown in this application, several concepts related to this application will be described first:
1) Internet of things (Internet of Things, IoT)
The internet of things refers to connecting, in real time, any object or process that needs to be monitored or interacted with, through all kinds of information sensors and technologies such as radio frequency identification, global positioning systems, infrared sensors and laser scanners; collecting all kinds of needed information such as sound, light, heat, electricity, mechanics, chemistry, biology and position; and, through every possible form of network access, realizing ubiquitous connection between objects and between objects and people, as well as intelligent sensing, identification and management of objects and processes. The internet of things is an information carrier based on the internet, traditional telecommunication networks and the like, and lets all ordinary physical objects that can be independently addressed form an interconnected network.
2) Database
A database is a repository for data. Its storage space is large: millions, tens of millions, even hundreds of millions of records can be stored. A database does not store data arbitrarily, however; data is organized according to certain rules, otherwise query efficiency would be low. Today's internet world is filled with large amounts of data from many sources, such as travel records, consumption records, browsed web pages and sent messages; besides text, images, music and sound are also data. A database is a computer software system that stores and manages data according to a data structure. It can be thought of as a "warehouse" that keeps data in a reasonable way: a user stores the transaction data to be managed in this "warehouse", and the two concepts of "data" and "warehouse" combine into "database". Databases are methods and techniques for data management that organize data more appropriately, maintain it more conveniently, control it more tightly and use it more efficiently.
Fig. 1 is a schematic diagram illustrating a structure of a time series database test system according to an exemplary embodiment. The system comprises: data storage device 120 and test device 140.
The data storage device 120 may include a data storage module (not shown in the figure), in which data in the time sequence database may be stored in advance. The data storage device 120 may be directly connected to one or several sensors; a sensor generates corresponding time sequence data in response to changes in the external environment and sends the time sequence data to the data storage device for storage.
The test device 140 may include a data transmission module and a data processing module. The data transmission module is used for receiving returned data after a query request is sent to the time sequence database, and is also used for sending data processed by the data processing module to the time sequence database. The data processing module may process the data to be transmitted into data in the format corresponding to the time sequence database, for transmission to the time sequence database of the data storage device.
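The two modules of the test device can be sketched as follows. This is a minimal, self-contained sketch with stubbed transport; the class names and the "keep only known columns" processing rule are illustrative assumptions, not part of the patent.

```python
class DataProcessingModule:
    """Convert outgoing records into the target format of the database."""

    def process(self, record, target_format):
        # Hypothetical minimal behavior: keep only the columns that the
        # target table structure knows about.
        return {k: v for k, v in record.items() if k in target_format}


class DataTransmissionModule:
    """Send processed data and receive query results (stubbed transport)."""

    def __init__(self, transport):
        self.transport = transport  # injected callable standing in for the network

    def send(self, payload):
        return self.transport(payload)


processor = DataProcessingModule()
sent = []
transmitter = DataTransmissionModule(lambda p: sent.append(p))

payload = processor.process({"time": 1, "value": 2.0, "junk": "x"},
                            target_format={"time", "value"})
transmitter.send(payload)
```

Injecting the transport keeps the processing logic testable without a live time sequence database.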
Alternatively, the data storage device 120 may be a server, or include a plurality of servers, or be a distributed computer cluster formed by a plurality of servers, or be a virtualization platform, or be a cloud computing service center, or the like, which is not limited in this application.
The data storage device 120 is connected to the test device 140 via a communication network. Optionally, the communication network is a wired network or a wireless network.
Alternatively, the wireless network or wired network described above uses standard communication techniques and/or protocols. The network is typically the Internet, but may be any network, including but not limited to a local area network (Local Area Network, LAN), metropolitan area network (Metropolitan Area Network, MAN), wide area network (Wide Area Network, WAN), mobile, wired or wireless network, private network, or any combination of virtual private networks. In some embodiments, data exchanged over the network is represented using techniques and/or formats including Hypertext Markup Language (HTML), Extensible Markup Language (XML), and the like. All or some of the links may also be encrypted using conventional encryption techniques such as Secure Sockets Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN) and Internet Protocol Security (IPsec). In other embodiments, custom and/or dedicated data communication techniques may also be used in place of, or in addition to, the data communication techniques described above.
Referring to fig. 2, a flowchart of a time-series database testing method is shown according to an exemplary embodiment. The method may be performed by a computer device, which may be the test device 140 in the embodiment shown in fig. 1. As shown in fig. 2, the flow of the time series database test method may include the following steps:
step 21, receiving a test instruction to a time sequence database; the test instruction is used for indicating performance test on the time sequence database; the time sequence database is used for storing equipment data generated by the Internet of things equipment along with time in the Internet of things scene.
In one possible implementation, the time sequence database may receive time sequence data with a time tag generated by a sensor device in an internet of things scenario.
Time sequence data is mainly data collected or generated by sensor devices and the like for real-time detection, inspection and analysis. In general, sensor time sequence data is generated quickly, each piece of data must correspond to a unique time stamp, and the amount of data is large, so data of this kind is better suited to a time sequence database, which handles larger data volumes, than to a relational database.
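A single piece of such data can be sketched as a point carrying a metric name, tags, fields and a unique timestamp, serialized roughly in InfluxDB line-protocol style. The class name and the serialization helper are illustrative assumptions, not an API from the patent.

```python
from dataclasses import dataclass


@dataclass
class TimeSeriesPoint:
    """One sensor sample: every point must carry a unique timestamp."""
    metric: str        # what is being measured
    tags: dict         # indexed attributes, string-valued
    fields: dict       # the measured values
    timestamp_ns: int  # UTC timestamp in nanoseconds

    def to_line(self) -> str:
        """Serialize roughly in InfluxDB line-protocol style:
        metric,tag=... field=... timestamp"""
        tag_str = ",".join(f"{k}={v}" for k, v in sorted(self.tags.items()))
        field_str = ",".join(f"{k}={v}" for k, v in sorted(self.fields.items()))
        return f"{self.metric},{tag_str} {field_str} {self.timestamp_ns}"


point = TimeSeriesPoint(
    metric="temperature",
    tags={"device": "sensor-01", "site": "plant-a"},
    fields={"value": 21.5},
    timestamp_ns=1_600_000_000_000_000_000,
)
line = point.to_line()
```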
Step 22, obtaining a target format of the time sequence database; the target format is determined from a data table structure of the time series database.
In one possible implementation, the target format is determined from a data table structure constructed from the time series database.
The same time sequence database can construct a variety of different table structures according to its basic attributes. Taking the InfluxDB time sequence database as an example, a narrow table (tall table) and a wide table can be constructed according to the basic attributes of the time sequence database, and time sequence databases with different data table structures may also differ in service performance.
To explain this, the basic concepts of a time sequence database are introduced below, again taking the InfluxDB time sequence database as an example.
A time sequence database (time series database) has a column named time, which stores UTC (Coordinated Universal Time) time stamps recording the time information corresponding to each piece of time sequence data. A time sequence database also has fields, each consisting of a field key and a field value: the field key is of string type and names the stored metadata, and the field value is the concrete value of the data under that field. In InfluxDB, a field must exist, but fields are not indexed; if a field is used as a query condition, all field values are scanned to find those that meet the query. The tag (label) is optional. A tag consists of a tag key and a tag value, the tag value can only be of string type, and characteristics that are queried frequently can be set as tags, which improves query efficiency.
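The difference between an indexed tag lookup and an unindexed field scan can be illustrated with a toy in-memory store. This is only a sketch of the design trade-off; the data and function names are invented for illustration.

```python
# A toy store of three rows: tags are indexed, fields are not.
rows = [
    {"tags": {"city": "sh"}, "fields": {"value": 1.0}},
    {"tags": {"city": "bj"}, "fields": {"value": 2.0}},
    {"tags": {"city": "sh"}, "fields": {"value": 3.0}},
]

# Tag index: tag value -> row positions (built once, like a DB index).
tag_index = {}
for i, row in enumerate(rows):
    tag_index.setdefault(row["tags"]["city"], []).append(i)


def query_by_tag(city):
    """Index lookup: touches only the matching rows."""
    return [rows[i] for i in tag_index.get(city, [])]


def query_by_field(value):
    """Field predicate: must scan every row, as the text describes."""
    return [r for r in rows if r["fields"]["value"] == value]
```

This is why the patent notes that frequently queried characteristics should be set as tags.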
A time sequence database also has monitoring indexes (metrics) for indicating different types of time sequence data. From the storage point of view all metrics are the same, but in different situations there are slight differences between them. For example, when the acquired time sequence data is the load of the current system, the sample value returned by the metric rises and falls over time; when the acquired time sequence data is the accumulated usage time of the processor, the sample value returned by the metric keeps increasing over time.
Step 23, testing the time sequence database according to the target data of the target format to obtain the performance parameters of the time sequence database; the performance parameters include at least one of query performance parameters and write performance parameters; the query performance parameter is used for indicating the performance of the time sequence database when the query test is performed; the write performance parameter is used to indicate the performance of the timing database when a write test is performed.
In one possible implementation manner, when the time sequence database is tested, the query performance test or the write performance test can be selected to be performed on the time sequence database according to the test instruction; when the test instruction is a query instruction, query testing can be performed on the time sequence database to obtain query performance parameters; when the test instruction is a write-in instruction, write-in test can be carried out on the time sequence database, and write-in performance parameters are obtained; the query performance parameters and the write performance parameters of the time sequence database can be obtained by setting up a plurality of threads at the same time and simultaneously carrying out query test and write test on the time sequence database.
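The mixed-load case above, with several threads issuing query and write operations concurrently while latencies and error counts are collected, might be sketched as follows. The function name, the stat layout and the stub operations are illustrative assumptions; real operations would call the time sequence database.

```python
import concurrent.futures
import time


def run_mixed_test(write_op, query_op, n_writes, n_queries):
    """Run write and query operations on concurrent threads and collect
    per-operation latency and error counts (a simplified sketch of the
    mixed query/write test described above)."""
    stats = {"write": {"errors": 0, "latencies": []},
             "query": {"errors": 0, "latencies": []}}

    def timed(kind, op):
        start = time.perf_counter()
        try:
            op()
        except Exception:
            stats[kind]["errors"] += 1
        finally:
            stats[kind]["latencies"].append(time.perf_counter() - start)

    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(timed, "write", write_op) for _ in range(n_writes)]
        futures += [pool.submit(timed, "query", query_op) for _ in range(n_queries)]
        concurrent.futures.wait(futures)
    return stats


# Stub operations stand in for real database calls.
stats = run_mixed_test(lambda: None, lambda: None, n_writes=10, n_queries=5)
```

From such stats, requests-per-second and response times can be derived for both the write and query performance parameters.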
In one possible implementation manner, in the process of testing the time sequence database, performance monitoring may be performed on the data storage device where the time sequence database is located, so as to obtain the memory occupancy rate and the CPU (Central Processing Unit) occupancy rate of the data storage device during the test.
In one possible implementation manner, the memory occupancy rate and the CPU occupancy rate can be obtained directly on the data storage device of the time sequence database; alternatively, the test device can send an acquisition request to the data storage device to obtain its memory occupancy rate and CPU occupancy rate.
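Polling occupancy while the test runs might look like the following sketch. The sampler is injected as a callable (here a stub cycling fake percentages) because how the storage device exposes its occupancy is not specified in the patent; `monitor` and its summary keys are invented names.

```python
import itertools
import statistics
import time


def monitor(sample_fn, duration_s, interval_s):
    """Poll an occupancy sampler for the duration of the test and
    summarize the readings. `sample_fn` would wrap whatever the data
    storage device actually exposes (an agent endpoint, /proc, etc.)."""
    samples = []
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        samples.append(sample_fn())
        time.sleep(interval_s)
    return {"samples": len(samples),
            "max": max(samples),
            "mean": statistics.mean(samples)}


# Stub sampler cycling through fake CPU-occupancy percentages.
fake_cpu = itertools.cycle([10.0, 30.0, 20.0])
summary = monitor(lambda: next(fake_cpu), duration_s=0.05, interval_s=0.01)
```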
In summary, in the scheme shown in the embodiment of the present application, after receiving a test instruction indicating write and read tests on a time sequence database, a format corresponding to the time sequence database is acquired, and the time sequence database is tested according to data in the target format to obtain write performance parameters and query performance parameters indicating, respectively, the write performance and query performance of the time sequence database. Through this scheme, after the test instruction for the time sequence database is received, the time sequence database can be tested according to the data format in the time sequence database to obtain its query performance parameters and write performance parameters; that is, the influence of the data format of the time sequence database on query and write is taken into account, which improves the test accuracy.
Referring to fig. 3, a flowchart of a method for testing a time-series database according to an exemplary embodiment is provided. The method may be performed by a computer device, which may be the test device 140 in the embodiment shown in fig. 1. As shown in fig. 3, the time series database test method may include the steps of:
step 301, a test instruction for a time series database is received.
The test instruction is used for indicating performance test on the time sequence database; the time sequence database is used for storing equipment data generated by the Internet of things equipment along with time in the Internet of things scene.
In one possible implementation, the test instruction is sent by the user to the test device through other computer devices; alternatively, the test instruction is generated by the user via the test device.
In one possible implementation, the test instruction is a query test instruction that indicates that query performance tests are performed on the time series database.
In one possible implementation, performance monitoring is performed on the device corresponding to the time series database in response to receiving the test instruction.
The device corresponding to the time sequence database may be a device for storing the time sequence database.
In one possible implementation, the performance monitor is configured to obtain at least one of a number of requests processed per second, a response time, and a number of errors in the query performance test for the timing database.
In one possible implementation manner, the performance monitor is further configured to obtain a memory occupancy rate and a time-varying condition of a processor occupancy rate of the device corresponding to the time-series database.
In one possible implementation, the data amount of the time sequence database is large, and the time sequence database may be a distributed time sequence database which exists on a distributed device cluster formed by a plurality of devices through a network, so that after receiving a test instruction of the time sequence database, performance monitoring can be performed on all or part of devices in the distributed device cluster.
Step 302, a target format of the time series database is obtained.
Wherein the target format is determined according to a data table structure of the time sequence database.
Before the time sequence database is tested, the data format corresponding to the time sequence database, namely the structure of its data table, needs to be obtained, and the query rule of the time sequence database is determined according to that data table structure so that query tests can be run against the time sequence database.
In one possible implementation, a data file of a time sequence database is read, and a target format of the time sequence database is obtained according to a data format of the data file.
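Deriving the target format from a data file might be sketched as below. The file layout (a CSV whose header marks each column with a `tag:` or `field:` prefix) is entirely a hypothetical stand-in, since the patent does not specify the file format; `infer_target_format` is an invented name.

```python
import csv
import io

# Hypothetical data file: the header row describes the table structure,
# with a "tag:" / "field:" prefix marking each column's attribute type.
raw = io.StringIO(
    "time,tag:device,tag:site,field:value\n"
    "1600000000,sensor-01,plant-a,21.5\n"
)


def infer_target_format(fp):
    """Derive the target format (data table structure) from the header."""
    header = next(csv.reader(fp))
    fmt = {"time": [], "tag": [], "field": []}
    for col in header:
        if col == "time":
            fmt["time"].append(col)
        else:
            kind, _, name = col.partition(":")
            fmt[kind].append(name)
    return fmt


fmt = infer_target_format(raw)
```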
Step 303, testing the time sequence database according to the target data of the target format to obtain the performance parameters of the time sequence database.
In one possible implementation, the query performance parameter is obtained by performing a query test on the target data in the target format in the time sequence database.
In one possible implementation, the query identifier is generated according to the at least one data attribute type; the query identifier is used for acquiring data corresponding to the data attribute type; and according to the query identifier, querying the target data in the target format in the time sequence database to acquire the query performance parameters.
In one possible implementation, one data attribute is randomly selected from each of the at least one data attribute type, and the query identifier is generated based on the data attributes so selected.
Wherein the data table structure comprises at least one data attribute type; the data attribute type comprises at least one data attribute; the data attribute is used to classify the target data.
The data table structure is the storage format of the data file corresponding to the time sequence database. Referring to fig. 4, a schematic diagram of a data file format according to an embodiment of the present application is shown. As shown in fig. 4, the data file 401 is composed of a metric 402, a time 403, a tag 404 and a field 405. A data attribute type is at least one of the metric, time, tag and field; that is, these data attribute types together make up the data table structure, and each data attribute type includes at least one data attribute corresponding to it. For example, when the data attribute type is tag, at least one data attribute tag1 in the time sequence database corresponds to that tag, and the data in the time sequence database can be classified and stored according to its data attributes.
A query identifier (i.e., a query statement) may be generated based on a data attribute selected from the data attribute types, so as to query the target data stored in the time sequence database. Take the query statement select "age", "value" from "user" of the Influxdb time sequence database as an example: "user" is the data table to be queried in the time sequence database, the statement queries "age" and "value" in the table "user", and "age" is a tag label.
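As a minimal sketch, the assembly of such a statement from a table name and selected attributes can be expressed in Python; the function `build_query` and its parameters are illustrative helpers, not part of the InfluxDB API, and InfluxQL keywords are case-insensitive:

```python
def build_query(table, columns, tag=None, tag_value=None):
    """Assemble an InfluxQL-style SELECT statement; the parameters
    are illustrative, not part of any database API."""
    select = ", ".join(f'"{c}"' for c in columns)
    query = f'SELECT {select} FROM "{table}"'
    if tag is not None:
        # Optional tag filter, quoted in InfluxQL style.
        query += f""" WHERE "{tag}" = '{tag_value}'"""
    return query

stmt = build_query("user", ["age", "value"])
# stmt is: SELECT "age", "value" FROM "user"
```

A query test would generate many such statements from the table structure and issue each one against the time sequence database.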
In one possible implementation, a data attribute is randomly selected from each of the at least one data attribute type, and the query identifier is generated according to the selected attributes.
That is, for each data attribute type, the attribute used in the query identifier is the one randomly selected from the attributes of that type.
When a table structure corresponding to the time sequence database is obtained according to a data file in the time sequence database, random selection is performed on at least one of the metric, time, tag and field corresponding to the table structure, the corresponding attribute is obtained, and the randomly selected attributes are spliced to obtain a query condition corresponding to the query test. For example, when the metrics and the tags corresponding to the table structure are randomly spliced, with the metrics being "A" and "B" and the tags being "tag1", "tag2" and "tag3", the possible query conditions are "metric is A and tag is tag1", "metric is A and tag is tag2", "metric is A and tag is tag3", "metric is B and tag is tag1", "metric is B and tag is tag2", and "metric is B and tag is tag3". The more data attribute types are selected, and the more data attributes correspond to each type, the more possible query conditions are generated. A data table may also carry multiple tags at the same time; for example, in a wide table, all the tag labels corresponding to each piece of data are represented in the table, so that different contents are stored in one table and query efficiency is high.
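The splicing of metrics and tags into query conditions can be sketched as follows; `splice_conditions` and `random_condition` are illustrative helper names, and a real test would issue the resulting conditions against the database under test:

```python
import random
from itertools import product

def splice_conditions(metrics, tags):
    """Exhaustively splice every metric with every tag into a query
    condition, as in the "metric is A and tag is tag1" example."""
    return [f"metric is {m} and tag is {t}" for m, t in product(metrics, tags)]

def random_condition(metrics, tags):
    """Randomly splice one metric with one tag, as the query test would."""
    return f"metric is {random.choice(metrics)} and tag is {random.choice(tags)}"

conditions = splice_conditions(["A", "B"], ["tag1", "tag2", "tag3"])
# Two metrics and three tags give six possible query conditions.
```

As the example shows, the number of possible conditions is the product of the attribute counts, which is why more selected attribute types yield more query conditions.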
In the actual test, when the time sequence database is queried according to the query identifier and the time sequence database receives the query request on the device, the data corresponding to the query request is first transferred from the time sequence database to the cache of the data storage device, and the data storage device then transfers the data in the cache to the test device, so as to feed back the data requested by the query.
By randomly splicing the data attributes, the obtained query identifier (i.e., the query statement) avoids repeating the same query condition as far as possible, so that for a time sequence database storing a large amount of data, the influence of the cache on the real performance test of the time sequence database is reduced.
In one possible implementation, according to the query identifier, the target data in the target format in the time sequence database is queried to obtain the query performance parameter.
According to the query statement corresponding to the query identifier, the data corresponding to the query statement is acquired from the time sequence database, and the performance of the time sequence database during this query process is fed back.
In one possible implementation, the query performance parameter includes at least one of a number of errors, a response time, a number of requests processed per second, a memory occupancy, and a processor occupancy.
The number of errors is the number of query errors that occur during the query test; the response time is the time the system takes to respond to a request, i.e., in the query test, the time required for the system to query the corresponding data; the RPS (Requests Per Second) is the number of requests the system can process per second; the memory occupancy rate and the processor occupancy rate are used to indicate the load the time sequence database places on the data storage device on which it is stored.
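A minimal sketch of how these query performance parameters might be derived from per-request records follows; the `(success, seconds)` tuple layout is a hypothetical bookkeeping format, and deriving RPS from the summed durations assumes the requests were issued sequentially:

```python
def query_performance(results):
    """Summarize per-request (success, seconds) records into the
    parameters above: error count, average response time, and RPS."""
    errors = sum(1 for ok, _ in results if not ok)
    ok_times = [t for ok, t in results if ok]
    avg_response = sum(ok_times) / len(ok_times) if ok_times else 0.0
    total_time = sum(t for _, t in results)
    rps = len(results) / total_time if total_time > 0 else 0.0
    return {"errors": errors, "response_time": avg_response, "rps": rps}

stats = query_performance([(True, 0.2), (True, 0.3), (False, 0.5)])
```

The memory and processor occupancy rates, by contrast, are sampled from the data storage device rather than computed from the request records.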
In one possible implementation, according to the target data in the target format, a write test is performed on the target time sequence database, and a write performance parameter of the time sequence database is obtained.
In one possible implementation, the target data in the target format may be a data file generated by existing data according to the target format of the time series database.
The test device first acquires the target format of the data file of the time sequence database, and converts the existing data table into target data of the target format.
In another possible implementation, the target data in the target format may be a simulated data file generated according to the target format of the time series database.
The generation rule of the simulated data file may be a random generation, that is, according to the data attribute type corresponding to the target format, the corresponding data attribute is randomly generated to form the target data file conforming to the target format.
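Such random generation of a data point conforming to the metric/time/tag/field format might be sketched as follows; the metric names, tag values and field names are placeholders, not values prescribed by any particular time sequence database:

```python
import random
import time

def simulate_point(metrics, tag_domains, fields):
    """Randomly generate one data point conforming to the
    metric/time/tag/field target format."""
    return {
        "metric": random.choice(metrics),
        "time": int(time.time() * 1e9),  # nanosecond-style timestamp
        "tags": {k: random.choice(vs) for k, vs in tag_domains.items()},
        "fields": {f: round(random.uniform(0, 1), 2) for f in fields},
    }

point = simulate_point(["cpu", "mem"], {"host": ["serverA", "serverB"]}, ["value"])
```

Repeating this per point yields a simulated data file that conforms to the target format without requiring any existing data.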
For example, when the time sequence database is openTSDB, the write statement may be "put cpu 1434067467000000000 0.64 host=serverA region=us_west", that is, the written metric is cpu, the timestamp is 1434067467000000000, the value is 0.64, the tag host is serverA, and the tag region is us_west.
When the time sequence database is influxdb, the write statement may be "INSERT cpu,host=serverA,region=us_west value=0.64 1434067467000000000", with the same meaning as the openTSDB statement: the written metric is cpu, the timestamp is 1434067467000000000, the value is 0.64, the tag host is serverA, and the tag region is us_west.
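The two write statements can be produced from one data point by simple string formatting, as in this sketch; the formats here simply mirror the statements quoted above, and the real write syntax may vary between versions of openTSDB and influxdb:

```python
def opentsdb_put(metric, ts, value, tags):
    """Format a point as an openTSDB-style put statement."""
    tag_str = " ".join(f"{k}={v}" for k, v in tags.items())
    return f"put {metric} {ts} {value} {tag_str}"

def influxdb_insert(metric, ts, value, tags):
    """Format the same point as an influxdb-style INSERT statement."""
    tag_str = ",".join(f"{k}={v}" for k, v in tags.items())
    return f"INSERT {metric},{tag_str} value={value} {ts}"

tags = {"host": "serverA", "region": "us_west"}
put_stmt = opentsdb_put("cpu", 1434067467000000000, 0.64, tags)
insert_stmt = influxdb_insert("cpu", 1434067467000000000, 0.64, tags)
```

Because only the formatting differs, the same generated target data can drive the write test for either type of time sequence database.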
In one possible implementation, the write performance parameter includes at least one of a storage time consumption, a number of errors, a number of requests per second, a memory occupancy, and a processor occupancy; wherein the time consuming storage is used to indicate the time spent writing the target data to the time-series database.
In one possible implementation, in response to receiving a test instruction, performance monitoring is performed on a data storage device where the time sequence database is located, and query testing and write-in testing are performed on the time sequence database at the same time, so as to obtain query performance parameters and write-in performance parameters of the time sequence database.
When the time sequence database is tested, the query performance parameters and the write performance parameters of the time sequence database in the query test and the write test can be obtained at the same time, together with the CPU occupancy rate and the memory occupancy rate of the data storage device where the time sequence database is located. In actual applications of a time sequence database, devices such as sensors may still write to the database while it is being queried; therefore, performing the query test and the write test on the time sequence database simultaneously, and acquiring the performance parameters at that time, simulates the actual usage of the time sequence database as closely as possible.
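A simultaneous query-and-write test can be sketched with two threads; `FakeTimeSeriesDB` is a stand-in used only to keep the example self-contained, and a real test would direct both workloads at the actual time sequence database:

```python
import threading

class FakeTimeSeriesDB:
    """Stand-in for the database under test; write() and query() mimic
    the two workloads that run concurrently."""
    def __init__(self):
        self.rows = []
        self.lock = threading.Lock()

    def write(self, row):
        with self.lock:
            self.rows.append(row)

    def query(self):
        with self.lock:
            return len(self.rows)

def mixed_test(db, n_writes=100, n_queries=10):
    """Run the write workload and the query workload at the same time,
    mirroring the combined test described above."""
    hits = []

    def writer():
        for i in range(n_writes):
            db.write(i)

    def reader():
        for _ in range(n_queries):
            hits.append(db.query())

    threads = [threading.Thread(target=writer), threading.Thread(target=reader)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return db.query(), hits

total, hits = mixed_test(FakeTimeSeriesDB())
```

Each query observes whatever number of rows has been written so far, which is exactly the interleaving the combined test is meant to exercise.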
In one possible implementation, the processor occupancy rate and the memory occupancy rate of the time-series database are obtained when the time-series database is in a test state.
In one possible implementation, the performance score of the time series database is obtained according to the query performance parameter and the write performance parameter.
After the query test and the write-in test are completed, the performance of the time sequence database can be evaluated according to the acquired query performance parameters and the write-in performance parameters. The evaluation mode can be set according to the requirement of the time sequence database, for example, when the time sequence database needs a better query function, the query performance parameter can be set to be higher weight, and the combination of the query performance parameter and the write performance parameter is considered, so that the performance score of the time sequence database is finally obtained.
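The weighted evaluation described above might look like the following sketch, where the 0.7 query weight is an arbitrary example for a time sequence database whose query function matters more:

```python
def performance_score(query_score, write_score, query_weight=0.7):
    """Weighted combination of query and write scores; the 0.7 weight
    is an arbitrary example favouring query performance."""
    return query_weight * query_score + (1 - query_weight) * write_score

score = performance_score(90, 80)
```

In practice the weight would be set according to the requirements placed on the time sequence database, and each score would itself be derived from the measured performance parameters.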
Referring to fig. 5, a flow chart of a time sequence database testing method according to an embodiment of the present application is shown. As shown in fig. 5, when a test instruction 500 is received, a write test 501 and a query test 502 are started. First, for the write test 501, the target data is generated 503 from existing data or from simulated data, the data file 504 corresponding to the time sequence database is generated, and the data file is transferred to the time sequence database through an API (Application Programming Interface) interface or an HTTP (Hyper Text Transfer Protocol) interface 505 to obtain the write performance parameter 506, which may be the storage time consumption.
In addition, for the query test 502, the fields corresponding to the target format of the data file 504 (for example, tag and field) are spliced, and according to the query statement obtained after splicing, i.e., the query request 507, a request is sent to the time sequence database, the data file corresponding to the query statement is acquired, and the corresponding query performance parameter 508 is obtained, where the query performance parameter may include the RPS, the response time, and the number of errors.
In the testing process, whole-process performance monitoring can be performed on the device corresponding to the time sequence database, and the CPU occupancy rate and the memory occupancy rate 509 of that device during the test are obtained, so as to determine the load the time sequence database places on the device while running. Based on the write performance parameters 506, the query performance parameters 508, and the CPU and memory occupancy 509, the performance result 510 of the time sequence database can be obtained comprehensively.
Table 1 is a narrow table of sensor data according to embodiments of the present application.
TABLE 1
Metric        Time   Tag    Field
Temperature   2020   Tag1   Value=1
Temperature   2020   Tag2   Value=2
Wind speed    2020   Tag1   Value=3
Wherein "Temperature" and "Wind speed" are metrics, 2020 is the time, Tag1 and Tag2 are tag labels, and Value is the field in the time sequence database table structure.
TABLE 2
Metric        Time   Tags
Temperature   2020   Tag1=1, Tag2=2, Tag3=3…
Wind speed    2020   Tag1=1, Tag2=2, Tag3=3…
Table 2 is a sensor data wide table according to an embodiment of the present application. As shown in table 2, all the tag labels corresponding to each piece of data are represented in the table, so that different contents are stored in one table and data can be queried accurately according to the corresponding tags, giving high query efficiency. However, when the width of the wide table reaches a certain amount, each row becomes very large, which greatly affects writing: every write has to traverse the whole row again, so the write performance suffers.
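The two table shapes of Tables 1 and 2 can be illustrated as follows; `to_wide` folds the narrow rows of Table 1 into one wide row per (metric, time) pair:

```python
# Narrow form (Table 1): one row per (metric, time, tag) combination.
narrow = [
    {"metric": "temperature", "time": 2020, "tag": "Tag1", "value": 1},
    {"metric": "temperature", "time": 2020, "tag": "Tag2", "value": 2},
    {"metric": "wind speed", "time": 2020, "tag": "Tag1", "value": 3},
]

def to_wide(rows):
    """Fold narrow rows into the wide form (Table 2): one row per
    (metric, time), with all tag values gathered into that row."""
    wide = {}
    for r in rows:
        wide.setdefault((r["metric"], r["time"]), {})[r["tag"]] = r["value"]
    return wide

wide = to_wide(narrow)
```

The sketch makes the trade-off visible: the wide form has fewer rows to scan on a query, but every write must update one ever-growing row.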
Therefore, the influence of different table structures on the writing and reading performances is different, and the performance of a certain time sequence database can be more accurately evaluated by performing writing and reading tests on the time sequence database.
Referring to fig. 6, a flow chart of a write test method according to an embodiment of the present application is shown. As shown in fig. 6: S601, first, a data generation rule is configured, for example, a data file is randomly generated according to the target format of the time sequence database. S602, a data file is generated according to the data generation rule. S603, a write rule for the data file, i.e., the write statement, is configured. S604, the write task is executed according to the write rule, and the generated data file is written into the time sequence database. S605, according to the writing process, the write information (i.e., the write performance parameters) corresponding to the write test is output. For the same type of time sequence database, different table structure designs have different write rules, and the corresponding write efficiency also differs. Table 3 shows that write tests performed on different table structures give different results: for the same 6.4 G of data, the write speed and time consumption of the three designs are related to the number of metrics, tags and fields in each design.
TABLE 3
                         Design 1   Design 2   Design 3
Storage space (G)        6.4        6.4        6.4
Write speed (points/s)   80,000     90,000     85,000
Time consumed (h)        3          2.8        2.9
Number of metrics        100        100        10
Number of tags           1000       0          1000
Number of fields         1          1000       100
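A minimal write-test harness that measures the time consumed and write speed reported in Table 3 might be sketched as follows; `write_fn` stands in for whatever call actually writes a point to the time sequence database:

```python
import time

def write_test(write_fn, points):
    """Write every point, counting errors, then report the time consumed
    and the write speed in points per second."""
    errors = 0
    start = time.perf_counter()
    for p in points:
        try:
            write_fn(p)
        except Exception:
            errors += 1
    elapsed = time.perf_counter() - start
    speed = len(points) / elapsed if elapsed > 0 else float("inf")
    return {"elapsed_s": elapsed, "points_per_s": speed, "errors": errors}

sink = []  # stand-in for the time sequence database
stats = write_test(sink.append, list(range(1000)))
```

Running the same harness against each table structure design is what produces rows like the write speed and time consumption in Table 3.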
Referring to fig. 7, a flow chart of a query test method according to an embodiment of the present application is shown. S701, first, a query rule for the data is configured, for example, a query statement is randomly generated according to the target format of the time sequence database. S702, the data file is read, and the target format of the time sequence database is acquired according to the data file. S703, random splicing is performed according to the target format to generate a query statement. S704, the query task is executed according to the query statement to acquire the queried data result. S705, according to the query process, the query information (i.e., the query performance parameters) corresponding to the query test is output. For different table structures of the same type of time sequence database, the corresponding query efficiency also differs.
TABLE 4
                         Design 1   Design 2   Design 3
Storage space (G)        6.4        6.4        6.4
Write speed (points/s)   80,000     90,000     85,000
Time consumed (h)        3          2.8        2.9
Number of metrics        100        100        10
Number of tags           1000       0          1000
Number of fields         1          1000       100
As shown in table 4, different table structures give different query performance. The median, average, 90th-percentile and 99th-percentile response times, as well as the number of requests processed per second, differ across the designs; that is, the query performance differs significantly.
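The median and percentile response times mentioned here can be computed with a simple nearest-rank sketch (one of several common percentile definitions):

```python
import math

def percentile(times, p):
    """Nearest-rank p-th percentile of a list of response times."""
    s = sorted(times)
    k = min(len(s) - 1, math.ceil(p * len(s) / 100) - 1)
    return s[k]

times = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
median, p90, p99 = percentile(times, 50), percentile(times, 90), percentile(times, 99)
```

Reporting the 90th and 99th percentiles alongside the median exposes tail latency, which an average alone would hide.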
When a large number of performance tests are performed on the time sequence database, tools such as JMeter and Locust can be integrated to perform the write test and the query test respectively, system resources can be monitored during the test so that the CPU and memory occupancy rates are obtained as part of the performance indexes, and different schemes (table structures) can be flexibly adjusted for the performance tests.
In summary, in the scheme shown in the embodiment of the present application, after a test instruction indicating a write test and a read test on a time sequence database is received, the format corresponding to the time sequence database is acquired, and the time sequence database is tested according to data in the target format, so as to obtain write performance parameters and query performance parameters respectively indicating the write performance and the query performance of the time sequence database. Through this scheme, after the test instruction for the time sequence database is received, the time sequence database can be tested according to the data format in the time sequence database to obtain its query performance parameters and write performance parameters; that is, the influence of the data format of the time sequence database on query and write is taken into account, which improves the test accuracy.
Referring to FIG. 8, a flow chart of a write test method is shown according to an exemplary embodiment. The write test method is performed jointly by the test device 800 and the data storage device 810. As shown in fig. 8, when the test device 800 receives the write test instruction 801, the test device 800 is triggered to start the write test procedure; at this time, the test device 800 starts to monitor the data storage device 810 and acquires real-time status information of the data storage device.
At this time, the test device generates the target data 802 in the target format, either from existing data in the test device or from simulated data generated in the test device according to a preset rule, according to the target format corresponding to the time sequence database, and writes the target data into the time sequence database according to the write rule; throughout this process the test device keeps monitoring the data storage device.
After the test device has written all the target data 802 in the target format into the time sequence database, it stops monitoring the time sequence database and acquires the write performance parameters 803 obtained by monitoring the device corresponding to the time sequence database during the write test. The write performance parameters 803 include the write time, the write speed, the memory occupancy rate and the CPU occupancy rate of the device, and the like, and the write performance of the time sequence database is judged according to the write performance parameters 803.
Referring to FIG. 9, a flowchart of a query test method is shown according to an exemplary embodiment. The query test method is performed jointly by the test device 900 and the data storage device 910, where the test device 900 and the data storage device 910 may be servers. As shown in fig. 9, when the test device 900 receives the query test instruction 901, the test device 900 is triggered to start the query test procedure; at this time, the test device 900 starts to monitor the data storage device 910 and acquires real-time status information of the data storage device.
When the query test is started, the test device may first query the data file in the time sequence database to obtain the target format 902 corresponding to the time sequence database, perform random splicing on the fields in the target format to randomly generate the query request 903, and perform the data query operation on the time sequence database according to the query request 903 to obtain the corresponding query data 904; throughout this process the test device keeps monitoring the data storage device.
When the test stop condition is reached, the test device stops querying the time sequence database, stops monitoring the device corresponding to the time sequence database, and acquires the query performance parameters 905 of the query test process. The query performance parameters include the query time, the query speed, the memory occupancy rate and the CPU occupancy rate of the device, and the like, and whether the query performance of the time sequence database is good or bad is judged according to the query performance parameters 905. Moreover, because the query requests are obtained by randomly splicing the fields in the target format, repeated queries on the same data are unlikely for a larger time sequence database, which reduces inaccurate test results caused by the device cache during the query process.
Fig. 10 is a block diagram showing a structure of a time series database testing apparatus according to an exemplary embodiment. The timing database testing apparatus may implement all or part of the steps in the methods provided by the embodiments shown in fig. 2 or fig. 3. The time series database test device may include:
a test instruction receiving module 1001, configured to receive a test instruction for a time sequence database, where the test instruction is used to instruct performance test on the time sequence database; the time sequence database is used for storing equipment data generated by the Internet of things equipment along with time in the Internet of things scene;
a target format obtaining module 1002, configured to obtain a target format of the time sequence database; the target format is determined according to a data table structure of the time sequence database;
a performance parameter obtaining module 1003, configured to test the time sequence database according to the target data in the target format, and obtain a performance parameter of the time sequence database; the performance parameters include at least one of query performance parameters and write performance parameters; the query performance parameter is used for indicating the performance of the time sequence database when the query test is performed; the write performance parameter is used for indicating the performance of the time sequence database when the write test is performed.
In one possible implementation, the write performance parameter includes at least one of a storage time consumption, a number of errors, a number of requests per second processing, a memory occupancy, and a processor occupancy;
the query performance parameters include at least one of a number of errors, a response time, a number of requests processed per second, a memory occupancy, and a processor occupancy.
In a possible implementation manner, the performance parameter obtaining module 1003 is configured to perform a query test on the target data in the target format in the time sequence database, to obtain the query performance parameter.
In one possible implementation, the data table structure includes at least one data attribute type; the data attribute type comprises at least one data attribute; the data attribute is used for classifying the target data; the query performance parameter acquisition module comprises:
the query identifier acquisition unit is used for generating a query identifier according to the at least one data attribute type; the query identifier is used for acquiring data corresponding to the data attribute type;
the query performance parameter obtaining unit is used for inquiring the target data in the target format in the time sequence database according to the query identifier to obtain the query performance parameter.
In a possible implementation manner, the query identifier obtaining unit is configured to randomly select one data attribute from the at least one data attribute type;
the query identity is generated from data attributes selected from the at least one data attribute type.
In a possible implementation manner, the performance parameter obtaining module 1003 is configured to perform a write test on the target time sequence database according to the target data in the target format, so as to obtain a write performance parameter of the time sequence database.
In one possible implementation, the apparatus further includes:
a monitoring module, configured to acquire the processor occupancy rate and the memory occupancy rate of the time sequence database when the time sequence database is in a test state.
In summary, in the scheme shown in the embodiment of the present application, after a test instruction indicating a write test and a read test on a time sequence database is received, the format corresponding to the time sequence database is acquired, and the time sequence database is tested according to data in the target format, so as to obtain write performance parameters and query performance parameters respectively indicating the write performance and the query performance of the time sequence database. Through this scheme, after the test instruction for the time sequence database is received, the time sequence database can be tested according to the data format in the time sequence database to obtain its query performance parameters and write performance parameters; that is, the influence of the data format of the time sequence database on query and write is taken into account, which improves the test accuracy.
Fig. 11 is a schematic diagram of a computer device according to an exemplary embodiment. The computer device may be implemented as the test device and/or the data storage device in the various method embodiments described above. The computer apparatus 1100 includes a central processing unit (CPU, Central Processing Unit) 1101, a system memory 1104 including a random access memory (Random Access Memory, RAM) 1102 and a read-only memory (Read-Only Memory, ROM) 1103, and a system bus 1105 connecting the system memory 1104 and the central processing unit 1101. The computer device 1100 also includes a basic input/output system 1106, which helps to transfer information between various devices within the computer, and a mass storage device 1107 for storing an operating system 1113, application programs 1114, and other program modules 1115.
The mass storage device 1107 is connected to the central processing unit 1101 through a mass storage controller (not shown) connected to the system bus 1105. The mass storage device 1107 and its associated computer-readable media provide non-volatile storage for the computer device 1100. That is, the mass storage device 1107 may include a computer-readable medium (not shown) such as a hard disk or a compact disk-read Only Memory (CD-ROM) drive.
The computer readable medium may include computer storage media and communication media without loss of generality. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, flash memory or other solid state memory technology, CD-ROM, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will recognize that the computer storage medium is not limited to the one described above. The system memory 1104 and mass storage device 1107 described above may be collectively referred to as memory.
The computer device 1100 may connect to the internet or other network device through a network interface unit 1111 connected to the system bus 1105.
The memory further includes one or more programs stored in the memory, and the central processor 1101 implements all or part of the steps of the methods shown in fig. 3, 4, or 9 by executing the one or more programs.
In exemplary embodiments, a non-transitory computer readable storage medium comprising instructions, such as a memory comprising a computer program (instructions) executable by a processor of a computer device to perform a method performed by a server or a user terminal in the methods shown in the various embodiments of the present application, is also provided. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
In an exemplary embodiment, a computer program product or a computer program is also provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the methods shown in the above embodiments.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (7)

1. A method of testing a time series database, the method comprising:
receiving a test instruction for a time sequence database, wherein the test instruction is used for indicating performance test on the time sequence database; the time sequence database is used for storing equipment data generated by the Internet of things equipment along with time in the Internet of things scene;
acquiring a target format of the time sequence database; the target format is determined according to a data table structure of the time sequence database;
testing the time sequence database according to the target data of the target format to obtain the query performance parameters of the time sequence database or the query performance parameters and the write-in performance parameters; the query performance parameter is used for indicating the performance of the time sequence database when the query test is performed; the write performance parameter is used for indicating the performance of the time sequence database when the write test is performed;
wherein the data table structure comprises at least one data attribute type; the data attribute type comprises at least one data attribute; the data attribute is used for classifying the target data; the step of testing the time sequence database according to the target data in the target format to obtain the query performance parameters of the time sequence database comprises the following steps:
randomly selecting one data attribute from each of the at least one data attribute type; generating a query identifier according to the data attribute selected from the at least one data attribute type, wherein the query identifier is used for acquiring data corresponding to the data attribute type; and according to the query identifier, querying the target data in the target format in the time sequence database to acquire the query performance parameters.
2. The method of claim 1, wherein the write performance parameters include at least one of memory time consumption, number of errors, number of requests processed per second, memory occupancy, and processor occupancy;
the query performance parameters include at least one of a number of errors, a response time, a number of requests processed per second, a memory occupancy, and a processor occupancy.
3. The method according to claim 1, wherein the testing the time series database according to the target data in the target format to obtain the write performance parameter of the time series database includes:
and performing write-in test on the time sequence database according to the target data in the target format, and obtaining the write-in performance parameters of the time sequence database.
4. The method according to claim 2, wherein the method further comprises:
and when the time sequence database is in a test state, acquiring the processor occupancy rate and the memory occupancy rate of the time sequence database.
5. A time series database testing apparatus, the apparatus comprising:
a test instruction receiving module, configured to receive a test instruction for the time series database, wherein the test instruction is used for instructing a performance test of the time series database, and the time series database is used for storing device data generated over time by Internet of Things devices in an Internet of Things scenario;
a target format acquisition module, configured to acquire a target format of the time series database, wherein the target format is determined according to a data table structure of the time series database; and
a performance parameter acquisition module, configured to test the time series database according to target data in the target format to acquire the query performance parameters of the time series database, or the query performance parameters and the write performance parameters, wherein the query performance parameters are used for indicating the performance of the time series database during a query test, and the write performance parameters are used for indicating the performance of the time series database during a write test;
wherein the data table structure comprises at least one data attribute type; each data attribute type comprises at least one data attribute; the data attributes are used for classifying the target data; and the testing the time series database according to the target data in the target format to obtain the query performance parameters of the time series database comprises:
randomly selecting one data attribute from each of the at least one data attribute type; generating a query identifier according to the data attributes selected from the at least one data attribute type, wherein the query identifier is used for acquiring data corresponding to the data attribute types; and querying the target data in the target format in the time series database according to the query identifier to acquire the query performance parameters.
6. A computer device, comprising a processor and a memory, wherein the memory stores at least one computer instruction that is loaded and executed by the processor to implement the time series database testing method according to any one of claims 1 to 4.
7. A computer-readable storage medium, wherein the storage medium stores at least one computer instruction that is loaded and executed by a processor to implement the time series database testing method according to any one of claims 1 to 4.
CN202110851588.1A 2021-07-27 2021-07-27 Time sequence database test method and device, computer equipment and storage medium Active CN113608981B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110851588.1A CN113608981B (en) 2021-07-27 2021-07-27 Time sequence database test method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113608981A CN113608981A (en) 2021-11-05
CN113608981B true CN113608981B (en) 2024-01-05

Family

ID=78305628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110851588.1A Active CN113608981B (en) 2021-07-27 2021-07-27 Time sequence database test method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113608981B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114116821B (en) * 2021-11-26 2024-05-10 山东浪潮科学研究院有限公司 Energy monitoring data storage method, equipment and medium based on time sequence database

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108845914A (en) * 2018-06-29 2018-11-20 平安科技(深圳)有限公司 Generation method, electronic device and the readable storage medium storing program for executing of performance test report
CN110147319A (en) * 2019-04-19 2019-08-20 平安普惠企业管理有限公司 Data library test method, device and computer equipment
CN110275822A (en) * 2019-04-26 2019-09-24 武汉众邦银行股份有限公司 Performance test methods, device, equipment and the storage medium of application programming interfaces
WO2019178979A1 (en) * 2018-03-21 2019-09-26 平安科技(深圳)有限公司 Method for querying report data, apparatus, storage medium and server
CN111666565A (en) * 2020-06-22 2020-09-15 深圳壹账通智能科技有限公司 Sandbox simulation test method and device, computer equipment and storage medium
CN112486789A (en) * 2020-11-30 2021-03-12 建信金融科技有限责任公司 Log analysis system, method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xu Huayan et al. Design of an industrial time-series database engine based on InfluxDB. Computer Applications and Software, 2019, Vol. 36, No. 9, pp. 33-40. *

Also Published As

Publication number Publication date
CN113608981A (en) 2021-11-05

Similar Documents

Publication Publication Date Title
US11947556B1 (en) Computerized monitoring of a metric through execution of a search query, determining a root cause of the behavior, and providing a notification thereof
US11620300B2 (en) Real-time measurement and system monitoring based on generated dependency graph models of system components
CN109241141B (en) Deep learning training data processing method and device
US11403303B2 (en) Method and device for generating ranking model
CN109033206B (en) Rule matching method, cloud server and rule matching system
CN111190792B (en) Log storage method and device, electronic equipment and readable storage medium
Malensek et al. Analytic queries over geospatial time-series data using distributed hash tables
US20230024345A1 (en) Data processing method and apparatus, device, and readable storage medium
CN111666205B (en) Data auditing method, system, computer equipment and storage medium
US11663172B2 (en) Cascading payload replication
US11681707B1 (en) Analytics query response transmission
CN110046155B (en) Method, device and equipment for updating feature database and determining data features
US20190034500A1 (en) Creating dashboards for viewing data in a data storage system based on natural language requests
CN112925757A (en) Method, equipment and storage medium for tracking operation log of intelligent equipment
CN113608981B (en) Time sequence database test method and device, computer equipment and storage medium
CN110930069A (en) Data acquisition and packaging method and system, readable storage medium and computer
CN114564930A (en) Document information integration method, apparatus, device, medium, and program product
CN113722370A (en) Data management method, device, equipment and medium based on index analysis
CN113918622A (en) Information tracing method and system based on block chain
US9009161B2 (en) Data processing
US7610293B2 (en) Correlation of resource usage in a database tier to software instructions executing in other tiers of a multi tier application
US20210286782A1 (en) Data complementing system and data complementing method
CN113779261A (en) Knowledge graph quality evaluation method and device, computer equipment and storage medium
CN107357919A (en) User behaviors log inquiry system and method
CN115658680A (en) Data storage method, data query method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant