CN111506489A - Test method, system, device, server and storage medium - Google Patents

Test method, system, device, server and storage medium

Info

Publication number
CN111506489A
CN111506489A (application CN201910091351.0A)
Authority
CN
China
Prior art keywords
buried point
test
data
buried
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910091351.0A
Other languages
Chinese (zh)
Other versions
CN111506489B (English)
Inventor
Ma Qianru (马倩茹)
Xu Haifeng (徐海峰)
Miao Wei (苗伟)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201910091351.0A priority Critical patent/CN111506489B/en
Publication of CN111506489A publication Critical patent/CN111506489A/en
Application granted granted Critical
Publication of CN111506489B publication Critical patent/CN111506489B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3684 Test management for test design, e.g. generating new test cases

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

A test method, system, device, server and storage medium are disclosed. A test device is connected to a device under test on which the application under test is installed or run. The test device controls the device under test to execute test cases for the application under test, monitors the buried points encountered during test case execution to obtain buried point data, and sends the obtained buried point data together with the corresponding buried point identification information to a test server. The test server verifies the buried point data based on the buried point rule corresponding to the buried point identification information. The buried point rules may be determined from the buried point test requirements. In addition, buried point test coverage can be calculated from the stored buried point data and the buried point test requirements. The method may also obtain processed buried point data handled by at least one node on the application server side of the application under test, and analyze the consistency between the buried point data and the processed buried point data corresponding to the same buried point identification information. Automated testing can thereby be realized conveniently.

Description

Test method, system, device, server and storage medium
Technical Field
The present disclosure relates to the field of application testing, and in particular, to a testing method, system, device, server, and storage medium.
Background
In the internet, especially the mobile internet, nowadays, various applications are continuously emerging and updated. These applications often involve both clients and servers.
Meanwhile, in today's era of big data and the internet, the accuracy, integrity and security of the data generated or collected by these applications have become increasingly important.
As such, the demand for data testing of applications keeps growing.
At present, however, data testing on the mobile terminal still relies on testers manually inspecting logs on the device, and the subsequent server-side data comparison is likewise manual. This testing approach is inefficient and error-prone, and cannot meet current data testing needs.
Thus, there remains a need for a test solution that enables automated testing.
Disclosure of Invention
One technical problem to be solved by the present disclosure is to provide a test scheme capable of performing an automated test.
According to an aspect of the present disclosure, there is provided a test method including: receiving, from a first device, buried point data generated during testing of an application under test and corresponding buried point identification information; and verifying the buried point data based on the buried point rule corresponding to the buried point identification information.
Optionally, the buried point rule may include at least one of a trigger condition of the buried point, a format of the buried point data, and a value range of the buried point data.
Optionally, the buried point rule is determined based on buried point test requirements.
Optionally, the step of verifying the buried point data includes checking whether the buried point data conforms to the buried point rule.
Optionally, the method may further include: in response to detecting that the buried point data does not conform to the buried point rule, reporting information about the non-conforming buried point data to a defect management system.
Optionally, the method may further include: returning a verification result of the buried point data to the first device; and receiving a user-adjusted verification result from the first device.
Optionally, the method may further include: acquiring buried point test requirements; and determining a buried point rule of each buried point based on the buried point test requirements.
Optionally, the buried point test requirements may be obtained from a test request collection platform.
Optionally, the method may further include: and storing a buried point rule set, wherein the buried point rule set comprises buried point rules of all buried points related to the buried point test requirement.
Optionally, the method may further include: storing the buried point data; and calculating buried point test coverage based on the stored buried point data and the buried point test requirements.
Optionally, the method may further include: acquiring processed buried point data processed by at least one node of an application server of an application to be tested; and analyzing consistency of the buried point data corresponding to the same buried point identification information and the processed buried point data.
Optionally, the processed buried point data may be obtained from a node log of the at least one node.
Optionally, the method may further include: determining a faulty node based on the consistency analysis result.
Optionally, the at least one node may comprise a first processing node and/or a last processing node on the application server side processing link.
Optionally, the at least one node may include SLS (Simple Log Service) and/or ODPS (Open Data Processing Service).
Optionally, the method may further include: sending test result statistics to the first device, the test result statistics including at least one of: the proportion of buried point data conforming to the buried point rules; the buried point test coverage; and the consistency between the buried point data and the processed buried point data processed by at least one node on the application server side of the application under test.
Optionally, the buried point identification information may include at least one of a buried point identifier (buried point ID), a version number of the application under test, a timestamp, and a buried point trigger condition.
According to a second aspect of the present disclosure, there is provided a test method performed on a first device connected to a second device on which an application under test is installed or run, the method comprising: controlling the second device to execute a test case for the application under test; monitoring buried points encountered during execution of the test case to obtain buried point data; and sending the obtained buried point data and the corresponding buried point identification information to a test server.
Optionally, the buried point identification information may include at least one of a buried point identifier (buried point ID), a version number of the application under test, a timestamp, and a buried point trigger condition.
Optionally, the method may further include: in response to a user's query request for a test task, sending the query request to the test server; and receiving, from the test server, a verification result of the buried point data and presenting it to the user.
Optionally, the method may further include: adjusting the verification result in response to the user's operation on the presented verification result; and returning the adjusted verification result to the test server.
Optionally, the method may further include: communicating with the test server to establish a test task; and receiving a task identifier returned by the test server, wherein the obtained buried point data is sent to the server in association with the task identifier, and the query request includes the task identifier.
Optionally, the method may further include: receiving and presenting test result statistical data from the test server, wherein the test result statistical data comprises at least one of the following items: the ratio of buried point data that meets the buried point rule; testing the coverage rate of the buried points; and the consistency of the buried point data and the processed buried point data processed by at least one node of the application server of the application to be tested.
According to a third aspect of the present disclosure, there is provided a data processing method comprising: receiving, from a first device, buried point data generated during execution of an application and corresponding buried point identification information; and verifying the buried point data based on the buried point rule corresponding to the buried point identification information.
According to a fourth aspect of the present disclosure, there is provided a test system comprising: a first device, connected to a second device on which an application under test is installed or run, which controls the second device to execute test cases for the application under test, monitors buried points to obtain buried point data, and sends the obtained buried point data to a server; and the server, which receives and stores the buried point data sent by the first device, processes the data to obtain test result data, and sends the test result data to the first device.
According to a fifth aspect of the present disclosure, there is provided a test server comprising: a receiving device for receiving, from a first device, buried point data generated during testing of an application under test and corresponding buried point identification information; and a verification device for verifying the buried point data based on the buried point rule corresponding to the buried point identification information.
According to a sixth aspect of the present disclosure, there is provided a device for performing a test, the device being connected to a second device on which an application under test is installed or run, the device comprising: a measurement and control device for controlling the second device to execute a test case for the application under test; a monitoring device for monitoring buried points encountered during test case execution to obtain buried point data; and a sending device for sending the obtained buried point data and the corresponding buried point identification information to a test server.
According to a seventh aspect of the present disclosure, there is provided a computing device comprising: a processor; and a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the test method of either of the first and second aspects described above.
According to an eighth aspect of the present disclosure, there is provided a non-transitory machine-readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform the testing method of any one of the first to third aspects described above.
Thus, the test scheme according to the present disclosure conveniently enables automated testing, which can improve test efficiency and help ensure data quality.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
FIG. 1 schematically illustrates an architecture diagram of a test system according to the present disclosure.
Fig. 2 is a schematic flow diagram of a data testing method according to the present disclosure.
FIG. 3 is a schematic block diagram of a data testing server according to the present disclosure.
FIG. 4 is a schematic block diagram of a data testing apparatus according to the present disclosure.
FIG. 5 is a schematic flow chart diagram of a data testing method according to an embodiment of the present disclosure.
FIG. 6 shows a schematic structural diagram of a computing device that can be used to implement the data testing method according to an embodiment of the present disclosure.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
[Terminology]
Buried point: a record of a particular behavior or event; a data collection method commonly used on the terminal side.
Service-level buried point: a single service-related action or event, typically triggered by a user.
System-level buried point: a system behavior, such as powering on or off, turning Wi-Fi on or off, installing or uninstalling an application, and so on.
[Overall Architecture]
FIG. 1 schematically illustrates an architecture diagram of a test system according to the present disclosure.
The system under test may include a client device 20 and an application server side, which may include a plurality of (e.g., N, N being a positive integer) nodes 40-1, 40-2, 40-3, ..., 40-N.
These nodes may include, for example, SLS (Simple Log Service), DD configuration, DD server, DD hub, ODPS (Open Data Processing Service), etc. Notably, many of these nodes produce log output, in which the buried point data processed by the node may be recorded.
Generally, SLS is the first hop (e.g., node 40-1) after the application client uploads buried point data to the application server side, and its log stores the buried point data processed by the SLS node.
ODPS is often the last node (e.g., node 40-N) of the application server side's data processing chain, and stores the buried point data it has processed in structured form, for example in a database.
Between the SLS node and the ODPS node there may be several other nodes, each of whose logs records the buried point data processed by that node.
The client device 20 may be, for example, a mobile communication device, such as a cell phone or the like, on which the application under test may be installed or run. Application data from the client device 20 is processed sequentially or in parallel by a plurality of nodes 40-1, 40-2, 40-3, … …, 40-N of the application server.
The test system may further include a client device 10 (hereinafter also referred to as the "first device"; accordingly, the client device 20 of the system under test may be referred to as the "second device") and a test server 30.
In general, the client device 10 may be, for example, a computer. It may be connected to the client device 20 via a data line, such as a USB cable, or via a wireless connection such as Wi-Fi or Bluetooth.
Test platform software (a test tool) may run on the client device 10 to test the application under test installed or running on the client device 20. The client device 10 (first device) controls the client device 20 (second device) to execute test cases for the application under test, monitors buried points to acquire buried point data, and transmits the acquired buried point data to the server.
The client device 10 is communicatively connected to the test server 30, transmits data monitored from the client device 20 to the test server 30, and communicates with the test server 30 to perform test task management and the like.
The test server 30 receives and stores the buried point data transmitted by the first device 10 so as to perform processing to obtain test result data, and transmits the test result data to the first device 10.
The second device 20 and the application server may send the buried point data via a TCP channel, for example.
The first device 10 and the test server 30 may communicate through an HTTP channel, for example, to perform task management, result query, manual verification, statistical form acquisition, and other processes. The first device 10 may be provided with a relevant page for communicating with the test server 30 via the HTTP channel as described above.
The test scheme according to the present disclosure is described below with reference to fig. 2 to 4.
[Test Scheme]
Fig. 2 is a schematic flow diagram of a data testing method according to the present disclosure.
FIG. 3 is a schematic block diagram of a data testing server according to the present disclosure.
FIG. 4 is a schematic block diagram of a data testing apparatus according to the present disclosure.
As shown in fig. 3, the test server 30 may include, for example, a (buried point data) receiving device 110 and a verification device 120. In a preferred embodiment, the test server 30 may further include a coverage statistics apparatus 130 and a consistency analysis device 140.
As shown in fig. 4, the data testing apparatus (the first device 10) may include a measurement and control device 210, a monitoring device 220, and a sending device 230.
The measurement and control device 210 controls the second device 20 to execute test cases for the application under test.
The monitoring device 220 monitors the buried points encountered during test case execution to obtain buried point data.
The sending device 230 sends the obtained buried point data and the corresponding buried point identification information to the test server.
The buried point identification information may include at least one of a buried point ID, the version number of the application under test, a timestamp, a buried point trigger condition, and context information (environment information).
As shown in fig. 2, in step S110, the test server 30 receives the buried point data generated in the test process of the application under test and the corresponding buried point identification information from the first device 10, for example, through the receiving device 110.
Then, in step S120, the test server 30 checks the buried point data based on the buried point rule corresponding to the buried point identification information, for example, by the checking device 120.
Here, the buried point rule may be determined, for example, based on buried point test requirements set by the test requester.
A buried point rule may also be called a "buried point definition" and may include at least one of the trigger condition of the buried point, the format of the buried point data, the value range of the buried point data, and the like.
Based on the buried point identification information accompanying the buried point data, the pre-stored buried point rule of the corresponding buried point can be looked up.
Verifying the buried point data here may include, for example, checking whether the buried point data conforms to the buried point rule of the corresponding buried point, e.g., judging whether the trigger condition, format and value of the buried point data meet the rule. Conformance may cover, for example, one or both of the correctness and the validity of the buried point data, as sketched below.
For example, when the format or value of the buried point data does not match the data format or value range specified by the corresponding buried point rule, the buried point data may be considered invalid. When the trigger condition of the buried point data does not match the trigger condition specified in the corresponding rule, the buried point data may be considered incorrect.
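To make this concrete, here is a minimal sketch in Python (the language the disclosure itself mentions for the client tool). The class name, field names, and trigger-condition encoding are illustrative assumptions, not the patent's actual data model.

    from dataclasses import dataclass
    from typing import Any, Optional, Tuple

    @dataclass
    class BuriedPointRule:
        # Hypothetical buried point rule ("buried point definition").
        buried_point_id: str
        trigger_condition: str          # assumed encoding, e.g. "click:buttonA"
        data_format: type               # e.g. str, int, float
        value_range: Optional[Tuple[float, float]] = None  # None: unconstrained

    def verify(rule: BuriedPointRule, value: Any, observed_trigger: str) -> dict:
        # Correctness: was the buried point triggered by the expected condition?
        correct = observed_trigger == rule.trigger_condition
        # Validity: do the format and value match the rule?
        valid = isinstance(value, rule.data_format)
        if valid and rule.value_range is not None:
            low, high = rule.value_range
            valid = low <= value <= high
        return {"correct": correct, "valid": valid}

    # Example: an integer buried point expected on "click:buttonA", value in 0..100
    rule = BuriedPointRule("bp_0001", "click:buttonA", int, (0, 100))
    print(verify(rule, 42, "click:buttonA"))   # {'correct': True, 'valid': True}
    print(verify(rule, 999, "click:buttonB"))  # {'correct': False, 'valid': False}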
Further, in step S130, the test server 30 (e.g., through the coverage statistics apparatus 130) may calculate the buried point test coverage based on the stored buried point data and the buried point test requirements, for example after the test task is completed. That is, it computes the proportion of the buried points required by the buried point test requirements that were actually tested.
Further, in step S140, the test server 30 (e.g., through the consistency analysis device 140) may obtain the processed buried point data processed by at least one node on the application server side of the application under test, and analyze the consistency between the buried point data and the processed buried point data corresponding to the same buried point identification information.
The consistency analysis may likewise be performed after the test task is completed.
The server may then also send test result statistics to the first device 10, which may include, for example, at least one of:
the proportion of buried point data conforming to the buried point rules, i.e., the accuracy of the buried point data (the ratio of buried point data verified as correct to all buried point data), optionally accompanied by information on the buried points that do not conform;
the buried point test coverage; and
the data consistency described above.
A test method according to one embodiment of the present disclosure is described in detail below with reference to fig. 5.
[ test methods example ]
As shown in fig. 5, the test server may include four modules: a task management module, a data receiving module, a data service module, and a data storage module. It should be understood that this division into modules is not exclusive; other divisions are possible. In addition, these modules may reside on the same server or be distributed across different servers.
1. Test requirement acquisition
For example, a test request collection platform may be provided, on which a test requester (e.g., a developer of the application) can initiate a test request and set the buried point test requirements.
The test server 30 may obtain the buried point test requirements from the test request collection platform, and may then determine a buried point rule for each buried point that needs to be tested based on those requirements.
The buried point rule may include at least one of the trigger condition of the buried point, the format of the buried point data, the value range of the buried point data, and the like.
The test server 30 may store the buried point rule set locally or on another device it can access. The buried point rule set includes the buried point rules of all buried points involved in the buried point test requirements.
2. Task creation
As shown in fig. 5, at step S1, the tester launches the test client software on the first device 10.
In step S2, the first device 10 communicates with, for example, a task management module of the test server 30 to establish a test task.
Then, in step S3, the first device 10 receives the task identifier (task ID) returned by the test server 30.
In this way, during the test task, the obtained buried point data may be transmitted to the test server 30 in association with the task identifier, and a query request for related data may also include the task identifier. This distinguishes test tasks initiated by different test devices, as well as different test tasks initiated by the same device, facilitating later querying and statistics. A sketch of this exchange follows.
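The following is a minimal sketch of this task-creation and reporting exchange, assuming a simple HTTP/JSON API (the document only states that the first device and test server communicate over HTTP; the endpoint paths, field names, and use of the requests library are assumptions for illustration):

    import requests

    SERVER = "http://test-server.example.com"  # hypothetical test server address

    def create_task(app_version: str) -> str:
        # Establish a test task (step S2) and return the server-issued task ID (step S3).
        resp = requests.post(f"{SERVER}/tasks", json={"app_version": app_version})
        resp.raise_for_status()
        return resp.json()["task_id"]  # hypothetical response field

    def report(task_id: str, point: dict) -> None:
        # Upload one buried point record tagged with the task ID (step S7).
        requests.post(f"{SERVER}/tasks/{task_id}/points", json=point)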
3. Data acquisition and upload
In step S4, the first device 10 issues a test case to the second device 20.
In step S5, the first device 10 controls the second device 20 to execute the test case for the application under test.
When the test case is executed on the second device 20, the buried point is triggered to generate buried point data. These buried point data are uploaded to the respective nodes 40-1, 40-2, 40-3, … …, 40-N of the application server.
In step S6, the first device 10 monitors the buried points encountered during execution of the test case to obtain buried point data. For example, the first device 10 may monitor the log of the second device 20 and extract buried point data from it, as in the sketch below.
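A minimal sketch of this log-monitoring step, assuming buried point records appear in the device log as lines with a recognizable prefix followed by JSON; the log format and regular expression are invented for illustration:

    import json
    import re

    # Assumed log line shape: '... BURIED_POINT {"id": "bp_0001", "value": 42, ...}'
    BP_PATTERN = re.compile(r"BURIED_POINT\s+(\{.*\})")

    def extract_buried_points(log_lines):
        # Yield buried point records found in a stream of device log lines.
        for line in log_lines:
            match = BP_PATTERN.search(line)
            if match:
                yield json.loads(match.group(1))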
In step S7, the first device 10 transmits the acquired buried point data and its corresponding buried point identification information to the test server 30.
As described above, the buried point identification information may include, for example, at least one of the buried point ID, the version number of the application under test, a timestamp, the buried point trigger condition, and the like, as well as context information (environment information).
The test server 30 receives the buried point data and its corresponding buried point identification information from the first device 10, for example through its data receiving module, and stores the buried point data in step S8, for example through its data storage module.
The buried point data may be stored in association with the task ID for later querying and statistical analysis.
4. Data check of buried point
At step S9, the data service module of the server 30, for example, acquires the stored buried point data from the data storage module and then performs verification. The verification result may in turn be stored by the data storage module.
Here, the buried point data may be verified based on the buried point rule corresponding to the buried point identification information.
As described above, the buried point rule may be determined based on buried point test requirements.
Based on the buried point identification information, it can be determined which buried point in the test requirements (and hence in the buried point rule set) the buried point data corresponds to, and the buried point rule of that buried point can be found.
As described above, the buried point rule may include at least one of a trigger condition of the buried point, a format of the buried point data, a value range of the buried point data, and the like.
Verifying the buried point data may include, for example, checking whether the buried point data conforms to the buried point rule of the corresponding buried point, i.e., judging whether the trigger condition, format and value of the buried point data meet the rule. Conformance may cover, for example, one or both of the correctness and the validity of the buried point data.
For example, the buried point identification information uploaded along with the buried point data includes the trigger condition of the buried point, which can be compared against the trigger condition in the corresponding rule to verify whether the buried point data is correct. When the trigger condition of the buried point data does not meet the trigger condition specified in the corresponding rule, the buried point data may be considered incorrect. For instance, if button A is clicked (the trigger condition) but a buried point corresponding to button B being clicked occurs, that buried point data is incorrect.
For example, the buried point rule may include the format, value range, and so on of the buried point data, determined according to the nature of the buried point; formats may include character, integer, floating point, and the like. By judging whether the format and value of the buried point data match those in the corresponding rule, the validity of the buried point data can be verified. When the format or value of the buried point data does not match the data format or value range specified by the corresponding rule, the buried point data may be considered invalid.
With the buried point identification information as a bridge, a correspondence between buried point data and buried point rules is established. By comparing the buried point data against the corresponding rule, the correctness and validity of the buried point can be verified automatically, for example as sketched below.
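The "bridge" can be as simple as a lookup keyed by the buried point ID carried in the identification information. A sketch, reusing the BuriedPointRule and verify definitions from the earlier sketch (all names assumed):

    # Buried point rule set, keyed by buried point ID (the "bridge")
    rule_set = {
        "bp_0001": BuriedPointRule("bp_0001", "click:buttonA", int, (0, 100)),
        "bp_0002": BuriedPointRule("bp_0002", "click:buttonB", str),
    }

    def verify_report(record: dict) -> dict:
        # Map the uploaded record to its rule via the identification info, then verify.
        rule = rule_set[record["id"]]
        return verify(rule, record["value"], record["trigger"])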
At step S10, in response to a query request from a user (e.g., a tester) for a specified test task, or periodically (e.g., once every 6 seconds), the first device 10 may, at step S11, send a query request to the test server 30 (e.g., to its data service module) to query the automatic verification results in real time.
The query request may contain a task ID. In this way, the test server 30 can acquire and feed back data corresponding to the task ID.
At step S12, the data service module, for example, may issue a query request to the data storage module. In step S13, the data storage module may then return the verification result of step S9 to the data service module.
At step S14, the data storage module may return the above-described verification result of the buried point data to the first device 10.
The first device 10 receives from the test server 30 and presents to the user (tester) the verification results returned by the test server regarding the buried point data.
The user (tester) reviews the verification result. If an error is found, for example buried point data that violates the buried point rules (e.g., is invalid or incorrect) but was missed by the server, or conforming buried point data that the server wrongly flagged as non-conforming, the user may operate the first device 10 at step S15 to correct it.
In response to the user's operation on the presented verification result, the first device 10 adjusts the verification result and, at step S16, returns the adjusted result to the test server 30.
The test server 30, e.g. its data service module, receives the user-adjusted verification result from the first device 10, and stores the user-adjusted verification result in step S17, e.g. by its data storage module.
In response to detecting that buried point data does not conform to the buried point rules, at step S18 information about the non-conformance is reported to the defect management system, for example by the data service module. The defect management system may be associated with the test request collection platform, or the two may be the same system.
Here, the test server 30 verifies, against the buried point rules (e.g., whether a predefined buried point rule is satisfied), the originally generated buried point data that the test device (the first device 10) collected directly from the device under test (the second device 20). In other words, the data is tested at its source.
If buried point data is found that does not conform to the buried point rules, the client software of the application under test likely has a defect. Information about the non-conforming buried point data may then be sent, for example via the defect management system, to the application's developers for correction.
At step S19, the test task is stopped, either at a scheduled time (e.g., a predetermined time each day), manually by a tester, or once all test cases have been executed.
5. Statistics of coverage
Generally, after the test task ends, in step S20 the buried point test coverage is calculated based on the stored buried point data and the buried point test requirements.
The buried point test requirements specify a set of buried points to be tested, and the buried point data collected by the first device 10 from the second device 20 covers a set of actually measured buried points. The ratio of actually measured buried points to buried points to be tested, i.e., the number of measured buried points divided by the number of buried points to be tested, is the buried point test coverage.
The buried point test coverage quantifies the effect of the executed tests and may reflect, for example, whether the test cases were reasonably designed. Knowing the coverage from these statistics provides important guidance for subsequent test work.
When untested buried points exist, or the buried point test coverage is below a preset threshold, feedback should be given to the testers so that they redesign, improve, or add test cases to raise the coverage.
The statistics of the buried site test coverage may also be stored by a data storage module of the server 30.
In addition, filter conditions may be set by time, version, and so on; the measured buried point data matching the filters is retrieved, and the coverage of the buried points to be tested over a given period is calculated. Uncovered buried points can also be tallied, as in the sketch below.
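A sketch of this coverage computation, under the assumption that both the requirements and the measured data are reduced to sets of buried point IDs (any time/version filtering applied beforehand):

    def coverage(required_ids: set, measured_ids: set):
        # Return (coverage ratio, set of uncovered buried points).
        covered = required_ids & measured_ids
        uncovered = required_ids - measured_ids
        ratio = len(covered) / len(required_ids) if required_ids else 1.0
        return ratio, uncovered

    ratio, missing = coverage({"bp_0001", "bp_0002", "bp_0003"},
                              {"bp_0001", "bp_0003"})
    print(f"coverage: {ratio:.0%}, uncovered: {missing}")
    # coverage: 67%, uncovered: {'bp_0002'}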
6. Consistency verification
In step S21, the test server 30, for example its data service module, obtains the processed buried point data processed by at least one node on the application server side of the application under test.
As described above, the server side of the application under test may include a plurality of nodes capable of generating/outputting logs, such as SLS and ODPS. These nodes may record the buried point data they have processed in their logs, and the data service module of the test server 30 may obtain that processed buried point data from the node logs.
In step S22, the consistency between the buried point data and the processed buried point data corresponding to the same buried point identification information is analyzed.
The consistency analysis may focus on two aspects: quantity and detail.
The quantity aspect checks whether each node on the whole link processes the buried points in time, with no data lost and no message-processing delay.
The detail aspect checks that processing along the whole link does not tamper with the original data.
Here, using the buried point identification information as a bridge, a correspondence is established between the buried point data (test data) that the first device 10 uploaded from the second device 20 to the test server 30 and the buried point data (actual data) processed by the application server side nodes. The consistency between the (actual) buried point data processed by each node in the real application scenario and the test buried point data collected directly by the test device (the first device 10) from the device under test (the second device 20) is then analyzed.
If the processed buried point data of some node is inconsistent with the test buried point data, that node, or a node before it, has a fault.
Thus, the faulty node can be determined from the consistency analysis result, for example as sketched below.
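A sketch of this fault localization, assuming each node's processed data is available as a mapping from buried point ID to value, ordered along the processing link (all structures assumed):

    from typing import Dict, List, Optional

    def locate_fault(test_data: Dict[str, object],
                     nodes_data: List[Dict[str, object]]) -> Optional[int]:
        # Compare the test data against each node's processed data in link order.
        # Return the index of the first inconsistent node (data lost or altered
        # at that node or an earlier one), or None if the whole link is consistent.
        for i, node_data in enumerate(nodes_data):
            for bp_id, value in test_data.items():
                if node_data.get(bp_id) != value:
                    return i
        return None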
If a faulty node is found, in step S23 the node fault check result may be reported, for example through the data service module, to the defect management platform and/or the test request collection platform, or to a separately provided faulty-node management platform, so that the relevant personnel can investigate and remove the fault.
Theoretically, if the processed buried point data recorded in the logs of every log-producing node on the application server side could be compared and analyzed, a very comprehensive consistency judgment would be obtained.
However, considering the computation cost and other factors, only some key or important nodes may be selected, with the comparison performed on the processed buried point data recorded in those nodes' logs.
For example, the at least one node may include the first processing node and/or the last processing node on the application server side processing link. The processed buried point data in the log of the first processing node represents the earliest state of the buried point data uploaded by the application client to the server side; that in the log of the last processing node represents its final state.
Alternatively, the at least one node may include, for example, the SLS and/or ODPS of the application server side.
7. Statistical report
Thus, in step S24, the test result statistics may be sent to the first device 10 by, for example, a data service module of the test server 30.
The first device 10 receives and presents the test result statistics from the test server 30 to the user (tester).
The test result statistics include at least one of:
the proportion of buried point data conforming to the buried point rules (e.g., the accuracy), optionally accompanied by information on the buried points that do not conform;
the buried point test coverage, i.e., the ratio of the number of actually measured buried points to the number of buried points to be tested; and
the consistency judgment result, i.e., the consistency between the test buried point data and the processed buried point data processed by at least one node on the application server side of the application under test.
Based on the test result statistics, testers can improve the test cases or give feedback to the developers of the application under test.
The test scheme according to the present disclosure has been described in detail above. Broadly speaking, it includes the following aspects.
1. A client test tool runs on the PC side of the test equipment, monitors logs on the application client, and sends the monitored buried point logs to the test server.
2. The server verifies the correctness of the buried points against the buried point requirement rules, and provides bug tracking in combination with the defect management system.
3. After the test task stops, the test server compares the buried point data received from the test device (the first device 10) with the buried point data at nodes such as ODPS, checks data consistency across the full link, and produces a report presenting the test results of the task.
4. After the test period ends, a data quality dashboard and a data coverage dashboard are computed from the ODPS data and the buried point requirements, visually evaluating how well this round of testing covered the buried point requirements and the data quality achieved, and suggesting where subsequent test work should focus.
Wherein:
the first aspect described above may be performed by, for example, a PC program, which may be implemented in the python language. The program may execute the following logic: acquiring a system environment and parameters of second equipment; starting a task; acquiring a task id from a server; monitoring a log; reporting a log; and judging whether the task stops or not (if the task stops, the task is ended, and if the task does not stop, circularly monitoring the log-reporting the log).
The test client in this disclosure can also be replaced by programs running independently on each terminal. For example, on AliOS, Android, or iOS, a log monitoring program that starts automatically at boot can be developed to report logs to the test server, replacing the PC-side monitoring and reporting.
In the second aspect, the test server receives the logs, compares them against the buried point rules in real time, and checks the correctness and validity of each service-level/system-level buried point. When a defect (bug) is detected, the defect management system is invoked to report the bug automatically. This may be a cyclic process comprising: receiving a log; acquiring the buried point rule; checking the buried point; and judging whether a bug exists.
In the third aspect, after the task stops, the service logs and the ODPS data source table logs over the period are checked. The process may include: judging whether the task has stopped; once stopped, obtaining the SLS logs; analyzing the consistency of the SLS logs with the test data; obtaining the ODPS data; analyzing the consistency of the ODPS logs with the test data; and outputting a statistical report.
In the fourth aspect, the coverage and accuracy of the buried points in ODPS against the buried point requirements over a period are calculated according to filter conditions such as time, system version, and project number. The process may include: setting the filter conditions; retrieving the ODPS data matching the filters; computing the buried point coverage and accuracy against the buried point requirements; tallying the uncovered and incorrect buried points; and outputting the report.
To sum up, the preferred embodiments of the present disclosure provide a buried point full-link testing scheme, which may consist of two parts: automatic full-link verification of buried point data, and enhanced buried point data testing.
Automatic full link verification of buried point data:
the data acquired by the test server may contain two parts.
The first part is the specific rule information and identifiers (numbers) of the buried points; the server compares the rules against the buried point data uploaded by the test client to obtain the verification result on the correctness and/or validity of the buried points.
The second part is the environment information (or context information), such as an environment number, reported by the test device along with the buried point data. The server side obtains the specific environment information from the configuration center via the environment number. With this configuration, the actual buried point data in the real running environment of the application under test is obtained through ODPS, the datadriver log service, and the like, so that it can be compared against the test buried point data reported by the test device for the consistency judgment.
Enhanced buried point data testing:
After the test period ends, the server side automatically calculates the buried point test coverage and accuracy within the period according to the filter conditions (time, system version, project number, etc.), provides a reference for subsequent buried point test work, and can automatically collect, tally, and track bugs.
Thus, the above embodiments of the present disclosure solve the aforementioned problems of existing data testing schemes and provide a complete solution for automated testing, automated verification and automated analysis across the full link, thereby ensuring data quality and improving test efficiency.
The above description takes the application of the technical idea of the present disclosure in the data testing field as an example.
It should be understood that the technical idea of the present disclosure can also be implemented in scenarios outside the testing field. For example, the data can be checked during the actual running of the application.
Thus, the technical idea of the present disclosure can also be implemented as a data processing method.
First, buried point data generated during execution of an application and its corresponding buried point identification information are received from a first device.
The buried point data is then verified based on the buried point rule corresponding to the buried point identification information.
The manner of verifying the buried point data may be the same as or similar to that described above.
Similarly, the coverage of the buried point data may be calculated, and its consistency analyzed, in the same or a similar manner as described above, which will not be repeated here.
In this way, the buried point data can be checked, for example, during the actual operation of the application.
FIG. 6 is a schematic structural diagram of a computing device that can be used to implement the data testing method according to an embodiment of the present invention.
Referring to fig. 6, computing device 600 includes memory 610 and processor 620.
The processor 620 may be a multi-core processor or may include a plurality of processors. In some embodiments, processor 620 may include a general-purpose host processor and one or more special coprocessors such as a Graphics Processor (GPU), a Digital Signal Processor (DSP), or the like. In some embodiments, processor 620 may be implemented using custom circuits, such as an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA).
The memory 610 may include various types of storage units, such as system memory, read-only memory (ROM), and permanent storage. The ROM may store static data or instructions required by the processor 620 or other modules of the computer. The permanent storage may be a readable and writable storage device, and may be a non-volatile device that does not lose the stored instructions and data even when the computer is powered off. In some embodiments, the permanent storage is a mass storage device (e.g., a magnetic or optical disk, or flash memory). In other embodiments, the permanent storage may be a removable storage device (e.g., a floppy disk or an optical drive). The system memory may be a readable and writable memory device, or a volatile readable and writable memory device, such as dynamic random access memory. The system memory may store the instructions and data that some or all of the processors require at runtime. In addition, the memory 610 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory); magnetic disks and/or optical disks may also be employed. In some embodiments, the memory 610 may include a readable and/or writable removable storage device, such as a compact disc (CD), a read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density disc, a flash memory card (e.g., SD card, mini SD card, Micro-SD card, etc.), a magnetic floppy disk, or the like. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted wirelessly or over wires.
The memory 610 has stored thereon executable code that, when processed by the processor 620, causes the processor 620 to perform the data testing methods described above.
The data testing scheme according to the present invention has been described in detail above with reference to the accompanying drawings.
Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for carrying out the above-mentioned steps defined in the above-mentioned method of the invention.
Alternatively, the invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the steps of the above-described method according to the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The embodiments of the present invention have been described above; the foregoing description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (28)

1. A method of testing, comprising:
receiving, from a first device, buried point data generated during testing of an application under test and corresponding buried point identification information; and
verifying the buried point data based on the buried point rule corresponding to the buried point identification information.
2. The test method according to claim 1,
wherein the buried point rule comprises at least one of a trigger condition of the buried point, a format of the buried point data, and a value range of the buried point data.
3. The test method of claim 1, wherein the step of verifying the buried point data comprises:
and checking whether the data of the buried points accords with the buried point rule.
4. The testing method of claim 3, further comprising:
and reporting information about the data of the buried points not conforming to the buried point rule to a defect management system in response to detecting that the data of the buried points does not conform to the buried point rule.
5. The test method of claim 1, further comprising:
returning a verification result of the buried point data to the first device; and
receiving a user-adjusted verification result from the first device.
6. The test method of claim 1, further comprising:
acquiring buried point test requirements; and
determining the buried point rule for each buried point based on the buried point test requirements.
7. The test method according to claim 6,
wherein the buried point test requirements are obtained from a test request collection platform.
8. The test method of claim 6, further comprising:
and storing a buried point rule set, wherein the buried point rule set comprises buried point rules of all buried points related to the buried point test requirement.
9. The test method of claim 1, further comprising:
storing the buried point data; and
and counting the buried point test coverage rate based on the stored buried point data and the buried point test requirements.
10. The test method of claim 1, further comprising:
acquiring processed buried point data processed by at least one node of an application server of the application to be tested; and
the consistency of buried point data corresponding to the same buried point identification information and the processed buried point data is analyzed.
11. The test method according to claim 10,
wherein the processed buried point data is obtained from a node log of the at least one node.
12. The test method of claim 10, further comprising:
and determining a fault node based on the consistency analysis result.
13. The test method according to claim 10,
the at least one node comprises a first processing node and/or a last processing node on the application server side processing link.
14. The test method according to claim 10,
wherein the at least one node comprises a Simple Log Service (SLS) and/or an Open Data Processing Service (ODPS).
15. The test method of any one of claims 1 to 14, further comprising:
sending test result statistics to the first device, the test result statistics including at least one of:
the proportion of buried point data conforming to the buried point rule;
testing the coverage rate of the buried points; and
the consistency between the buried point data and the processed buried point data processed by at least one node on the application server side of the application under test.
16. The test method according to any one of claims 1 to 14,
wherein the buried point identification information comprises at least one of a buried point identifier, a version number of the application under test, a timestamp, and a buried point trigger condition.
17. A testing method performed on a first device connected to a second device on which an application under test is installed or run, the method comprising:
controlling the second device to execute a test case for the application under test;
monitoring buried points encountered in the test case execution process to obtain buried point data; and
sending the obtained buried point data and the corresponding buried point identification information to a test server.
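Put procedurally, claim 17 is a drive-capture-report loop on the first device. The sketch below assumes a `driver` object for controlling the second device and a plain HTTP endpoint on the test server; both interfaces are hypothetical, not taken from the patent:

```python
import json
import urllib.request

def run_buried_point_test(driver, test_case, server_url: str) -> None:
    """Execute a test case on the second device, monitor the buried points
    it hits, and send data plus identification information to the test
    server (claim 17)."""
    driver.execute(test_case)  # control the second device
    for event in driver.captured_buried_points():  # monitored during the run
        report = {
            "point_id": event.point_id,        # identification information
            "app_version": event.app_version,
            "timestamp": event.timestamp,
            "data": event.payload,             # the buried point data itself
        }
        request = urllib.request.Request(
            server_url,
            data=json.dumps(report).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(request)
```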
18. The test method of claim 17, wherein
the buried point identification information comprises at least one of a buried point identifier, a version number of the application under test, a timestamp, and a buried point trigger condition.
19. The test method of claim 17, further comprising:
in response to a query request of a user for a test task, sending the query request to the test server; and
receiving a verification result regarding the buried point data returned by the test server and presenting the verification result to the user.
20. The test method of claim 19, further comprising:
adjusting the verification result in response to a user's operation on the presented verification result; and
returning the adjusted verification result to the test server.
21. The test method of claim 19, further comprising:
communicating with the test server to establish a test task; and
receiving a task identifier returned by the test server,
wherein the obtained buried point data is sent to the test server in association with the task identifier, and the query request includes the task identifier.
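Claims 19 to 21 together describe a small task protocol: establish a task, receive its identifier, tag every buried point report with it, and query verification results by the same identifier. A sketch with hypothetical endpoint paths:

```python
import json
import urllib.request

def _post(url: str, payload: dict) -> dict:
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

def establish_task(server: str, app_version: str) -> str:
    # Claim 21: create a test task and keep the returned task identifier;
    # all later buried point reports are sent in association with it.
    return _post(f"{server}/tasks", {"app_version": app_version})["task_id"]

def query_results(server: str, task_id: str) -> dict:
    # Claims 19 and 21: forward the user's query, including the task
    # identifier, and receive the verification result for presentation.
    return _post(f"{server}/query", {"task_id": task_id})
```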
22. The test method of claim 17, further comprising:
receiving and presenting test result statistical data from the test server, wherein the test result statistical data comprises at least one of the following items:
a ratio of buried point data conforming to the buried point rule;
a buried point test coverage rate; and
consistency between the buried point data and the processed buried point data processed by at least one node on the application server side of the application under test.
23. A data processing method, comprising:
receiving, from a first device, buried point data generated during execution of an application and corresponding buried point identification information; and
checking the buried point data based on the buried point rule corresponding to the buried point identification information.
24. A test system, comprising:
a first device, connected to a second device on which an application under test is installed or run, and configured to control the second device to execute a test case for the application under test, monitor buried points to obtain buried point data, and send the obtained buried point data to a server; and
the server, configured to receive and store the buried point data sent by the first device, process the buried point data to obtain test result data, and send the test result data to the first device.
25. A test server, comprising:
a receiving device configured to receive, from a first device, buried point data generated during testing of an application under test and corresponding buried point identification information; and
a checking device configured to check the buried point data based on the buried point rule corresponding to the buried point identification information.
26. A device for performing a test, the device being connected to a second device on which an application under test is installed or run, the device comprising:
a control device configured to control the second device to execute a test case for the application under test;
a monitoring device configured to monitor buried points encountered during test case execution to obtain buried point data; and
a sending device configured to send the obtained buried point data and the corresponding buried point identification information to the test server.
27. A computing device, comprising:
a processor; and
a memory having executable code stored thereon, which, when executed by the processor, causes the processor to perform the method of any one of claims 1 to 23.
28. A non-transitory machine-readable storage medium having executable code stored thereon, which, when executed by a processor of an electronic device, causes the processor to perform the method of any one of claims 1 to 23.
CN201910091351.0A 2019-01-30 2019-01-30 Test method, system, device, server and storage medium Active CN111506489B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910091351.0A CN111506489B (en) 2019-01-30 2019-01-30 Test method, system, device, server and storage medium

Publications (2)

Publication Number Publication Date
CN111506489A (en)
CN111506489B (en) 2023-05-30

Family

ID=71877332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910091351.0A Active CN111506489B (en) 2019-01-30 2019-01-30 Test method, system, device, server and storage medium

Country Status (1)

Country Link
CN (1) CN111506489B (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050096861A1 (en) * 2003-10-31 2005-05-05 International Business Machines Corporation Late binding of variables during test case generation for hardware and software design verification
CN104348650A (en) * 2013-08-05 2015-02-11 腾讯科技(深圳)有限公司 Website monitoring method, business device and website monitoring system
WO2017084508A1 (en) * 2015-11-17 2017-05-26 阿里巴巴集团控股有限公司 Method and device for automatically burying points
CN106571949A (en) * 2016-09-23 2017-04-19 北京五八信息技术有限公司 Event tracking point processing method and apparatus
CN107688530A (en) * 2017-04-06 2018-02-13 平安科技(深圳)有限公司 Method for testing software and device
CN107133124A (en) * 2017-04-28 2017-09-05 努比亚技术有限公司 A kind of restorative procedure, data processing equipment and storage medium for not conforming to rule data
CN107196788A (en) * 2017-05-02 2017-09-22 阿里巴巴集团控股有限公司 A kind of processing method for burying point data, device, server and client
CN107870860A (en) * 2017-05-05 2018-04-03 平安科技(深圳)有限公司 Bury a checking system and method
CN107832216A (en) * 2017-11-08 2018-03-23 无线生活(杭州)信息科技有限公司 One kind buries a method of testing and device
CN108319552A (en) * 2018-02-07 2018-07-24 优信数享(北京)信息技术有限公司 One kind burying a test method, apparatus and system
CN108519862A (en) * 2018-03-30 2018-09-11 百度在线网络技术(北京)有限公司 Storage method, device, system and the storage medium of block catenary system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Xu; Zhai Yinglin: "Research on Test and Evaluation of APP Interaction Design Based on a Data Analysis Platform" *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112181821A (en) * 2020-09-24 2021-01-05 浙江大搜车软件技术有限公司 Interface test coverage detection method and device, electronic device and storage medium
CN113448834A (en) * 2020-09-25 2021-09-28 北京新氧科技有限公司 Buried point testing method and device, electronic equipment and storage medium
CN112214407A (en) * 2020-10-10 2021-01-12 广州华多网络科技有限公司 Data verification control and execution method and corresponding device, equipment and medium
CN113850987A (en) * 2020-12-11 2021-12-28 广东朝歌智慧互联科技有限公司 System for detecting product quality
CN112603292A (en) * 2020-12-22 2021-04-06 华南理工大学 Phase selection method of surface electromyographic signals for lower limb actions
CN112925725A (en) * 2021-04-09 2021-06-08 网易(杭州)网络有限公司 Data testing method and device, readable storage medium and electronic equipment
CN112925725B (en) * 2021-04-09 2024-03-15 网易(杭州)网络有限公司 Data testing method and device, readable storage medium and electronic equipment
CN113254335A (en) * 2021-05-20 2021-08-13 北京达佳互联信息技术有限公司 Test data processing method and device, server and storage medium
CN113254335B (en) * 2021-05-20 2024-04-16 北京达佳互联信息技术有限公司 Test data processing method and device, server and storage medium
CN113300912A (en) * 2021-05-21 2021-08-24 湖南快乐阳光互动娱乐传媒有限公司 Equipment testing method and device and electronic equipment
CN113300912B (en) * 2021-05-21 2022-07-26 湖南快乐阳光互动娱乐传媒有限公司 Equipment testing method and device and electronic equipment
CN113645342A (en) * 2021-08-03 2021-11-12 杭银消费金融股份有限公司 Mobile phone mobile terminal testing device and method thereof
CN113642047A (en) * 2021-08-13 2021-11-12 上海哔哩哔哩科技有限公司 Buried point data verification method and system
CN113673997A (en) * 2021-08-27 2021-11-19 深圳鼎盛电脑科技有限公司 Visualized processing method, device, equipment and medium of fund calculation engine

Also Published As

Publication number Publication date
CN111506489B (en) 2023-05-30

Similar Documents

Publication Publication Date Title
CN111506489B (en) Test method, system, device, server and storage medium
CN110046073B (en) Log collection method and device, equipment and storage medium
CN110088744B (en) Database maintenance method and system
CN109995614B (en) Alpha testing method and device
CN103678124B (en) Video surveillance platform auto-test method and device based on continuous integrated environment
CN104850495A (en) Automatic detection method and device
CN110063042B (en) Database fault response method and terminal thereof
CN113448854A (en) Regression testing method and device
CN111651358B (en) Method for generating test case, software test method, device and server
CN112733147A (en) Equipment safety management method and system
CN111666193B (en) Method and system for monitoring and testing terminal function based on real-time log analysis
CN110990289A (en) Method and device for automatically submitting bug, electronic equipment and storage medium
CN105825641A (en) Service alarm method and apparatus
CN111400171B (en) Interface testing method, system and device and readable storage medium
CN112202647A (en) Test method, device and test equipment in block chain network
CN114860619B (en) Database audit program regression testing method and device
CN110769076B (en) DNS (Domain name System) testing method and system
CN109086185B (en) Fault detection method, device and equipment of storage cluster and storage medium
CN116383025A (en) Performance test method, device, equipment and medium based on Jmeter
CN115757138A (en) Method and device for determining script abnormal reason, storage medium and electronic equipment
CN115373984A (en) Code coverage rate determining method and device
CN114490413A (en) Test data preparation method and device, storage medium and electronic equipment
CN114037539A (en) Method and device for detecting single-link failure of insurance
CN107797915B (en) Fault repairing method, device and system
CN113282504A (en) Incremental code coverage rate detection method and service development method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201130

Address after: Room 603, 6/F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, China

Applicant after: Zebra smart travel network (Hong Kong) Ltd.

Address before: P.O. Box 847, 4th Floor, Capital Building, Grand Cayman, Cayman Islands

Applicant before: Alibaba Group Holding Ltd.

GR01 Patent grant