CN111506489B - Test method, system, device, server and storage medium - Google Patents


Info

Publication number: CN111506489B
Authority: CN (China)
Prior art keywords: buried point, test, buried, data, point data
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN201910091351.0A
Other languages: Chinese (zh)
Other versions: CN111506489A
Inventors: 马倩茹, 徐海峰, 苗伟
Current assignee: Banma Zhixing Network Hongkong Co Ltd (the listed assignee may be inaccurate)
Original assignee: Banma Zhixing Network Hongkong Co Ltd
Application CN201910091351.0A filed by Banma Zhixing Network Hongkong Co Ltd; published as CN111506489A; granted and published as CN111506489B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3684: Test management for test design, e.g. generating new test cases

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

A test method, system, device, server, and storage medium are disclosed. A test device is connected to a device under test, on which the application under test is installed or run. The test device controls the device under test to execute test cases for the application under test, monitors the buried points (tracking points) triggered during test case execution to acquire buried point data, and sends the acquired buried point data, together with its corresponding buried point identification information, to a test server. The test server verifies the buried point data against the buried point rule corresponding to the buried point identification information. The buried point rules may be determined from the buried point test requirements. In addition, the buried point test coverage can be computed from the stored buried point data and the buried point test requirements. The method can also acquire processed buried point data that has passed through at least one node of the application server of the application under test, and analyze the consistency between the buried point data and the processed buried point data corresponding to the same buried point identification information. Automated testing can thus be realized conveniently.

Description

Test method, system, device, server and storage medium
Technical Field
The present disclosure relates to the field of application testing, and in particular, to a testing method, system, device, server, and storage medium.
Background
Today the internet, and especially the mobile internet, is developing rapidly, and applications are continuously updated. These applications typically involve both a client and a server.
At the same time, this is the age of the big-data internet, and the accuracy, integrity, and security of the data generated or collected by applications have become particularly important.
As such, there is an increasing need for data testing of various applications.
However, current mobile-side data testing is still performed by manually inspecting logs on the terminal, and the subsequent server-side data comparison is likewise a manual process. This way of testing is inefficient and error-prone, and cannot meet current data testing needs.
Thus, there remains a need for a testing scheme that enables automated testing.
Disclosure of Invention
One technical problem to be solved by the present disclosure is to provide a test solution capable of performing automated testing.
According to one aspect of the present disclosure, there is provided a test method comprising: receiving, from a first device, buried point data generated during testing of an application under test and its corresponding buried point identification information; and verifying the buried point data based on the buried point rule corresponding to the buried point identification information.
Optionally, the buried point rule may include at least one of a trigger condition of the buried point, a format of the buried point data, and a value range of the buried point data.
Optionally, the buried point rule may be determined based on buried point test requirements.
Optionally, the step of verifying the buried point data includes: checking whether the buried point data conforms to the buried point rule.
Optionally, the method may further include: in response to detecting that the buried point data does not conform to the buried point rule, reporting information about the non-conformance to a defect management system.
Optionally, the method may further include: returning a verification result of the buried point data to the first equipment; and receiving the verification result adjusted by the user from the first equipment.
Optionally, the method may further include: acquiring a buried point test requirement; and determining buried point rules for each buried point based on the buried point test requirements.
Optionally, the buried point test requirements may be obtained from a test request collection platform.
Optionally, the method may further include: storing a buried point rule set, wherein the buried point rule set comprises the buried point rules of all buried points involved in the buried point test requirements.
Optionally, the method may further include: storing buried point data; and counting the buried point test coverage based on the stored buried point data and the buried point test requirement.
Optionally, the method may further include: acquiring processed buried point data processed by at least one node of an application server of the application under test; and analyzing the consistency between the buried point data and the processed buried point data corresponding to the same buried point identification information.
Optionally, the processed buried point data may be obtained from a node log of the at least one node.
Optionally, the method may further include: determining a faulty node based on the consistency analysis result.
Optionally, the at least one node may include a first processing node and/or a last processing node on the application server processing link.
Optionally, the at least one node may comprise an SLS (Simple Log Service) and/or an ODPS (Open Data Processing Service).
Optionally, the method may further include: transmitting test result statistics to the first device, the test result statistics including at least one of: the ratio of buried point data conforming to the buried point rules; the buried point test coverage; and the consistency between the buried point data and the processed buried point data processed by at least one node of the application server of the application under test.
Optionally, the buried point identification information may include at least one of a buried point identifier (buried point ID), the version number of the application under test, a timestamp, and a buried point trigger condition.
According to a second aspect of the present disclosure, there is provided a test method performed on a first device, the first device being connected to a second device on which an application under test is installed or run, the method comprising: controlling the second device to execute test cases for the application under test; monitoring the buried points encountered during test case execution to acquire buried point data; and sending the acquired buried point data and its corresponding buried point identification information to a test server.
Optionally, the buried point identification information may include at least one of a buried point identifier (buried point ID), the version number of the application under test, a timestamp, and a buried point trigger condition.
Optionally, the method may further include: in response to a user's query request for a test task, sending the query request to the test server; and receiving and presenting to the user a verification result of the buried point data returned by the test server.
Optionally, the method may further include: in response to the user's operation on the presented verification result, adjusting the verification result; and returning the adjusted verification result to the test server.
Optionally, the method may further include: communicating with the test server to establish a test task; and receiving a task identifier returned by the test server, wherein the acquired buried point data is sent to the server in association with the task identifier, and the query request includes the task identifier.
Optionally, the method may further include: receiving and presenting test result statistics from the test server, the test result statistics including at least one of: the ratio of buried point data conforming to the buried point rules; the buried point test coverage; and the consistency between the buried point data and the processed buried point data processed by at least one node of the application server of the application under test.
According to a third aspect of the present disclosure, there is provided a data processing method comprising: receiving, from a first device, buried point data generated during execution of an application and its corresponding buried point identification information; and verifying the buried point data based on the buried point rule corresponding to the buried point identification information.
According to a fourth aspect of the present disclosure, there is provided a test system comprising: a first device, connected to a second device on which the application under test is installed or run, which controls the second device to execute test cases for the application under test, monitors the buried points to acquire buried point data, and sends the acquired buried point data to a server; and the server, which receives and stores the buried point data sent by the first device, processes it to obtain test result data, and sends the test result data to the first device.
According to a fifth aspect of the present disclosure, there is provided a test server comprising: a receiving device for receiving, from a first device, buried point data generated during testing of an application under test and its corresponding buried point identification information; and a verification device for verifying the buried point data based on the buried point rule corresponding to the buried point identification information.
According to a sixth aspect of the present disclosure, there is provided a device for performing a test, the device being connected to a second device on which an application under test is installed or run, the device comprising: a measurement and control device for controlling the second device to execute test cases for the application under test; a monitoring device for monitoring the buried points encountered during test case execution to acquire buried point data; and a transmitting device for transmitting the acquired buried point data and its corresponding buried point identification information to a test server.
According to a seventh aspect of the present disclosure, there is provided a computing device comprising: a processor; and a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the test methods of the first and second aspects described above.
According to an eighth aspect of the present disclosure, there is provided a non-transitory machine-readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform the test method of any of the first to third aspects described above.
Thus, with the test scheme according to the present disclosure, automated testing can be conveniently realized, improving test efficiency and ensuring data quality.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following more particular descriptions of exemplary embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout exemplary embodiments of the disclosure.
Fig. 1 schematically illustrates an architecture diagram of a test system according to the present disclosure.
Fig. 2 is a schematic flow chart of a data testing method according to the present disclosure.
Fig. 3 is a schematic block diagram of a data testing server according to the present disclosure.
Fig. 4 is a schematic block diagram of a data testing apparatus according to the present disclosure.
Fig. 5 is a schematic flow chart diagram of a data testing method according to an embodiment of the present disclosure.
FIG. 6 illustrates a schematic diagram of a computing device that may be used to implement the data testing method described above, according to one embodiment of the present disclosure.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
[ Terminology ]
Buried point (埋点, i.e., a tracking point): the recording of a certain action or event; a data acquisition method commonly used on the terminal (client) side.
Service-level buried point: a business-related action or event, usually triggered by the user.
System-level buried point: system behavior, such as power on/off, Wi-Fi turned on or off, installing or uninstalling an application, and so on.
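For illustration only (not part of the patent text), a service-level and a system-level buried point record might look like the following Python sketch; all field names are hypothetical:

```python
# Hypothetical buried point records; field names are illustrative only.
service_level_point = {
    "point_id": "btn_pay_click",      # buried point ID
    "app_version": "2.3.1",           # version of the application under test
    "timestamp": 1548752000123,       # when the event was triggered (ms)
    "trigger": "user clicks the Pay button",
    "payload": {"order_amount": 99.0},
}

system_level_point = {
    "point_id": "wifi_state_change",
    "app_version": "2.3.1",
    "timestamp": 1548752003456,
    "trigger": "Wi-Fi switched off",
    "payload": {"wifi_on": False},
}
```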
[ overall architecture ]
Fig. 1 schematically illustrates an architecture diagram of a test system according to the present disclosure.
The system under test may include a client device 20 and an application server, which may comprise a plurality of nodes 40-1, 40-2, 40-3, ..., 40-N (N being a positive integer).
These nodes may include, for example, an SLS (Simple Log Service), DD configurations, DD servers, DD hubs, an ODPS (Open Data Processing Service), and so on. Notably, many of these nodes produce log output in which the buried point data processed by the node is recorded.
Typically, the SLS is the first hop after the application client's buried point data is uploaded to the application server, e.g., node 40-1, and its log stores the buried point data processed by the SLS node.
The ODPS is often the last node of the application server's data processing link, e.g., node 40-N, and stores the buried point data it has processed in a structured manner, e.g., in a database.
Between the SLS node and the ODPS node there may be several other nodes, whose logs record the buried point data they have processed.
The client device 20 may be, for example, a mobile communication device such as a cell phone, on which the application under test may be installed or run. Application data from the client device 20 is processed, sequentially or in parallel, by the nodes 40-1, 40-2, 40-3, ..., 40-N of the application server.
Meanwhile, the test system may include a client device 10 (hereinafter the "first device"; accordingly, the client device 20 of the system under test may be referred to as the "second device") and a test server 30.
In general, the client device 10 may be, for example, a computer. It may be connected to the client device 20 via a data line, such as a USB cable, or via a wireless connection such as Wi-Fi or Bluetooth.
Test platform software (test tools) may be run on client device 10 to monitor the installation or operation of an application under test on client device 20. The client device 10 (first device) controls the client device 20 (second device) to execute a test case for the application under test, monitors the buried point to acquire buried point data, and transmits the acquired buried point data to the server.
The client device 10 is communicatively connected to the test server 30, transmits data monitored from the client device 20 to the test server 30, and communicates with the test server 30 for test task management or the like.
The test server 30 receives and stores buried point data transmitted from the first device 10 for processing to obtain test result data, and transmits the test result data to the first device 10.
Between the second device 20 and the application server, buried point data may be sent, for example, through a TCP channel.
The first device 10 and the test server 30 may communicate via an HTTP channel, for example, to perform task management, result query, manual verification, and statistical report acquisition. The relevant page may be set on the first device 10 to communicate with the test server 30 via the HTTP channel described above.
Test protocols according to the present disclosure are described below with reference to fig. 2 to 4.
[ test protocol ]
Fig. 2 is a schematic flow chart of a data testing method according to the present disclosure.
Fig. 3 is a schematic block diagram of a data testing server according to the present disclosure.
Fig. 4 is a schematic block diagram of a data testing apparatus according to the present disclosure.
As shown in fig. 3, the test server 30 may include, for example, a (buried point data) receiving device 110 and a verifying device 120. In a preferred embodiment, the test server 30 may further comprise coverage statistics means 130 and consistency analysis means 140.
As shown in fig. 4, the data testing device (first device 10) may include a measurement and control means 210, a listening means 220, and a transmitting means 230.
The measurement and control device 210 controls the second device 20 to execute test cases for the application under test.
The monitoring device 220 monitors the buried points encountered in the test case execution process to acquire buried point data.
The transmitting device 230 transmits the acquired buried point data and the corresponding buried point identification information thereof to the test server.
The buried point identification information may include, for example, at least one of a buried point ID, the version number of the application under test, a timestamp, a buried point trigger condition, and similar context information (environment information).
As shown in fig. 2, in step S110, the test server 30 receives, for example, the buried point data generated during the test of the application under test and the corresponding buried point identification information from the first device 10 via the receiving apparatus 110.
Then, in step S120, the test server 30 verifies the buried point data based on the buried point rule corresponding to the buried point identification information, for example by means of the verification device 120.
Here, the buried point rule may be determined based on a buried point test requirement set by the test requester, for example.
The buried point rule may be referred to as "buried point definition", and may include at least one of a trigger condition of a buried point, a format of buried point data, a range of values of the buried point data, and the like.
Based on the buried point identification information, the pre-stored buried point rule of the corresponding buried point can be found.
Here, verifying the buried point data may include, for example, checking whether the buried point data conforms to the buried point rule of the buried point to which it corresponds, i.e., whether the trigger condition, format, and value of the buried point data conform to the corresponding rule. Conformance may cover one or both of the correctness and the legality of the buried point data.
For example, if the format or value of the buried point data does not match the data format or value range specified by the corresponding rule, the buried point data may be considered illegal. If the trigger condition of the buried point data does not satisfy the trigger condition specified in the corresponding rule, the buried point data may be considered incorrect.
Further, in step S130, the test server 30 may, for example by means of the coverage statistics device 130 and for example after the test task has completed, compute the buried point test coverage based on the stored buried point data and the buried point test requirements, that is, the proportion of buried points actually tested among the buried points required to be tested.
Further, in step S140, the test server 30 may, for example through the consistency analysis device 140, obtain the processed buried point data that has been processed by at least one node of the application server of the application under test, and analyze the consistency between the buried point data and the processed buried point data corresponding to the same buried point identification information.
The consistency analysis may, for example, be performed once the test task has completed.
The server may then also send test result statistics to the first device 10, which may include, for example, at least one of the following (an illustrative sketch follows this list):
the ratio of buried point data conforming to the buried point rules, e.g., the accuracy rate of the buried point data, that is, the proportion of buried point data verified as correct among all buried point data, optionally accompanied by information on the buried points that do not conform;
the buried point test coverage; and
the data consistency described above.
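As a purely illustrative sketch of such a statistics payload (field names and values are hypothetical, not from the patent):

```python
# Hypothetical shape of the test result statistics returned to the first device.
test_result_statistics = {
    "task_id": "task-20190130-001",
    "rule_conformance_ratio": 0.97,     # share of buried point data passing rule checks
    "non_conforming_points": ["btn_pay_click"],
    "coverage": 0.85,                   # tested points / points required by the requirements
    "consistency": {"sls": True, "odps": False},  # per-node consistency with test data
}
```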
A test method according to one embodiment of the present disclosure is described in detail below with reference to fig. 5.
[ test method example ]
As shown in fig. 5, the test server may include four modules: a task management module, a data receiving module, a data service module, and a data storage module. It should be appreciated that this division into modules is not the only one possible; other divisions are feasible. In addition, these modules may be provided on the same server or on different servers.
1. Test requirement acquisition
For example, a test request collection platform may be provided, on which a test requester, such as an application developer or application vendor, may initiate a test request and set the buried point test requirements.
The test server 30 may obtain the buried point test requirements from the test request collection platform. The buried point rules for each buried point to be tested may then be determined based on the buried point test requirements.
A buried point rule may include at least one of a trigger condition of the buried point, a format of the buried point data, a value range of the buried point data, and so on.
The test server 30 may store the buried point rule set locally or on another device it can access. The buried point rule set includes the buried point rules of all buried points involved in the buried point test requirements.
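A minimal sketch of what such a buried point rule set might look like, assuming rules are keyed by buried point ID and derived from the test requirements; all names are hypothetical:

```python
# A hypothetical buried point rule set: one rule per buried point ID,
# covering the trigger condition, data format, and value range.
buried_point_rules = {
    "btn_pay_click": {
        "trigger": "user clicks the Pay button",     # expected trigger condition
        "format": {"order_amount": float},           # expected field types
        "value_range": {"order_amount": (0.0, 1e6)}, # allowed numeric range
    },
    "wifi_state_change": {
        "trigger": "Wi-Fi switched on or off",
        "format": {"wifi_on": bool},
        "value_range": {},                           # no numeric constraint
    },
}
```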
2. Task creation
As shown in fig. 5, at step S1, a tester starts test client software on the first device 10.
In step S2, the first device 10 communicates with, for example, a task management module of the test server 30 to establish a test task.
Then, in step S3, the first device 10 receives a task identifier (task ID) returned from the test server 30.
In this way, during the test task, the acquired buried point data may be sent to the test server 30 in association with the task identifier, and a query request for related data may also include the task identifier. Test tasks initiated by different test devices, and different test tasks initiated by the same test device, can thus be distinguished, which facilitates later querying and statistics.
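A hedged sketch of task creation as described above; the server address, endpoint path, and payload are assumptions for illustration, not the patent's actual protocol:

```python
import requests  # third-party HTTP client (pip install requests)

TEST_SERVER = "http://test-server.example.com"  # hypothetical address

def create_test_task(app_version: str) -> str:
    """Ask the task management module to create a test task and return the
    task ID. The endpoint path and payload are illustrative assumptions."""
    resp = requests.post(f"{TEST_SERVER}/tasks", json={"app_version": app_version})
    resp.raise_for_status()
    return resp.json()["task_id"]

# All buried point data uploaded during the task carries this ID, letting the
# server distinguish tasks from different devices and different runs.
task_id = create_test_task("2.3.1")
```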
3. Data acquisition and upload
In step S4, the first device 10 issues a test case to the second device 20.
In step S5, the first device 10 controls the second device 20 to execute a test case for the application under test.
When the test cases are executed on the second device 20, the buried points are triggered and buried point data is generated. This buried point data is uploaded to the nodes 40-1, 40-2, 40-3, ..., 40-N of the application server.
In step S6, the first device 10 monitors the buried point encountered in the test case execution process to acquire the buried point data. For example, the first device 10 may monitor the log of the second device 20 to discover buried point data therefrom.
In step S7, the first device 10 transmits the acquired buried point data and the corresponding buried point identification information thereof to the test server 30.
As described above, the buried point identification information may include, for example, at least one of the buried point ID, the version number of the application under test, a timestamp, a buried point trigger condition, and similar context information (environment information).
The test server 30 receives the buried point data and its corresponding buried point identification information from the first device 10, for example, through its data receiving module, and stores the buried point data, for example, through its data storing module, at step S8.
The buried point data may be stored in association with the task ID for later querying and statistical analysis.
4. Buried point data verification
In step S9, the data service module of the test server 30, for example, acquires the stored buried point data from the data storage module and then performs verification. The verification result may in turn be stored by the data storage module.
Here, the buried point data may be verified based on the buried point rule corresponding to the buried point identification information.
As described above, the buried point rule may be determined based on buried point test requirements.
Based on the buried point identification information, it is possible to determine which buried point in the test requirements the data corresponds to, and to find the buried point rule of that buried point in the buried point rule set.
As described above, the buried point rule may include at least one of a trigger condition of the buried point, a format of buried point data, a range of values of the buried point data, and the like.
Verifying the buried point data may include, for example, checking whether the buried point data conforms to the buried point rule of the corresponding buried point, i.e., whether the trigger condition, format, and value of the buried point data conform to the rule. Conformance may cover one or both of the correctness and the legality of the buried point data.
For example, if the trigger condition of the buried point is included in the buried point identification information uploaded with the buried point data, it can be compared with the trigger condition in the corresponding rule to check whether the buried point data is correct. If the trigger condition of the buried point data does not satisfy the trigger condition specified in the corresponding rule, the buried point data may be considered incorrect. For instance, if button A is clicked (the trigger condition) but a buried point that corresponds to button B being clicked appears as a result, that buried point data is incorrect.
For another example, a buried point rule may specify the format, value range, and so on of the buried point data, which may be determined according to the nature of the buried point. The format may be, for example, character, integer, or floating point. Checking whether the buried point data conforms to the data format and value range specified in the corresponding rule determines whether it is legal: if the format or value does not conform, the buried point data may be considered illegal.
With the buried point identification information as a bridge, a correspondence is established between buried point data and buried point rules, and the correctness and legality of the buried points can then be verified automatically by comparing the buried point data against the corresponding rules.
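The following sketch illustrates this automatic verification, assuming the hypothetical record and rule shapes from the earlier sketches; it is not the patent's actual implementation:

```python
def verify_buried_point(data: dict, rules: dict) -> dict:
    """Verify one buried point record against the rule found via the buried
    point ID in its identification information. 'correct' checks the trigger
    condition; 'legal' checks format (field types) and value ranges."""
    rule = rules.get(data["point_id"])
    if rule is None:  # no rule: the point is not covered by the requirements
        return {"known": False, "correct": False, "legal": False}

    correct = data.get("trigger") == rule["trigger"]

    legal = True
    for field, expected_type in rule["format"].items():
        value = data.get("payload", {}).get(field)
        if not isinstance(value, expected_type):
            legal = False
            continue
        bounds = rule.get("value_range", {}).get(field)
        if bounds is not None:
            lo, hi = bounds
            legal = legal and lo <= value <= hi

    return {"known": True, "correct": correct, "legal": legal}

rules = {"btn_pay_click": {
    "trigger": "user clicks the Pay button",
    "format": {"order_amount": float},
    "value_range": {"order_amount": (0.0, 1e6)},
}}
record = {"point_id": "btn_pay_click",
          "trigger": "user clicks the Pay button",
          "payload": {"order_amount": 99.0}}
print(verify_buried_point(record, rules))
# -> {'known': True, 'correct': True, 'legal': True}
```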
In step S10, in response to a query request by a user (e.g., a tester) for a specified test task, or alternatively at fixed intervals (e.g., once every 6 seconds), the first device 10 may, in step S11, send a query request to the test server 30 (e.g., to its data service module), requesting a real-time query of the automatic verification results.
The query request may contain a task ID. In this way, the test server 30 can acquire and feed back data corresponding to the task ID.
In step S12, the data service module may, for example, issue a query request to the data storage module. In step S13, the data storage module may return the verification result of step S9 to the data service module.
In step S14, the test server 30 returns the verification result of the buried point data to the first device 10.
The first device 10 receives the verification result of the buried point data returned by the test server 30 and presents it to the user (tester).
The user (tester) reviews the verification result. If an error is found, for example if buried point data that does not conform to the buried point rule (e.g., illegal or incorrect data) was not flagged by the server, or if conforming buried point data was wrongly flagged as non-conforming, the user may operate on the first device 10 to make a correction in step S15.
In response to the user's operation on the presented verification result, the first device 10 adjusts the verification result and, in step S16, returns the adjusted result to the test server 30.
In step S17, the test server 30, e.g., its data service module, receives the user-adjusted verification result from the first device 10 and stores it, e.g., by means of its data storage module.
In response to detecting that buried point data does not conform to its buried point rule, in step S18 information about the non-conformance is reported, e.g., by the data service module, to the defect management system. The defect management system may be associated with the test request collection platform, or they may be the same system.
Here, the test server 30 verifies, against the buried point rules, the initially generated buried point data collected by the test device (the first device 10) directly from the device under test (the second device 20), e.g., whether it meets the predefined buried point rules. In other words, the data source itself is tested.
If a buried point value that does not conform to its buried point rule is found, the client software of the application under test likely has a defect. Information about the non-conformance may be passed, e.g., via the defect management system, to the application developer or vendor for a corresponding fix.
In step S19, the test task is stopped at a scheduled time (e.g., a predetermined time each day), manually by a tester, or once all test cases have been executed.
5. Coverage statistics
Generally, after the test task is finished, in step S20, the buried point test coverage is counted based on the stored buried point data and the buried point test requirement.
The buried point test requirements specify a number of buried points to be tested, and the buried point data collected by the first device 10 from the second device 20 relates to a number of actually tested buried points. The ratio of the number of actually tested buried points to the number of buried points to be tested is the buried point test coverage rate.
The buried point test coverage rate quantifies the effect of the executed tests and can reflect, for example, whether the test cases are reasonably designed. Knowing the coverage rate through statistics therefore provides important guidance for subsequent test work.
When some buried points have not been tested, or the buried point test coverage is below a preset threshold, this should be fed back to the tester, so that the tester can redesign, improve, or arrange more test cases to raise the coverage.
Statistics of the buried point test coverage may also be stored by the data storage module of the server 30.
In addition, screening conditions can be set by time, version, and so on; the actually measured buried point data meeting the screening conditions can then be retrieved, and the coverage of the buried points to be tested over a given period computed. Uncovered buried points can also be enumerated.
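An illustrative sketch of the coverage statistic and the enumeration of uncovered buried points, using the hypothetical record shape from the earlier sketches:

```python
def coverage_rate(required_ids: set, tested_records: list) -> float:
    """Buried point test coverage: actually tested points divided by the
    points required by the buried point test requirements."""
    tested_ids = {r["point_id"] for r in tested_records}
    if not required_ids:
        return 1.0
    return len(required_ids & tested_ids) / len(required_ids)

required = {"btn_pay_click", "wifi_state_change", "app_uninstall"}
records = [{"point_id": "btn_pay_click"}, {"point_id": "wifi_state_change"}]
print(round(coverage_rate(required, records), 2))   # 0.67: below a threshold,
                                                    # so more test cases are needed
print(required - {r["point_id"] for r in records})  # uncovered: {'app_uninstall'}
```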
6. Consistency verification
In step S21, the test server 30, for example its data service module, acquires the processed buried point data processed by at least one node of the application server of the application under test.
As described above, the server side of the application under test may include a plurality of nodes capable of generating log output, such as the SLS and the ODPS. These nodes record the buried point data they process in their logs, from which the data service module of the test server 30 may obtain it.
In step S22, consistency of the buried point data and the processed buried point data corresponding to the same buried point identification information is analyzed.
The consistency analysis may consider both quantity and detail.
The quantity aspect checks whether every node on the whole link has processed the buried points in time, with no data loss and no lagging message processing.
The detail aspect checks how the data is handled along the whole link: the original data must not have been tampered with.
Here, with the buried point identification information as a bridge, a correspondence is established between the buried point data (test data) that the first device 10 collected from the second device 20 and uploaded to the test server 30, and the buried point data (actual data) processed by the application server nodes. The consistency between the actual buried point data processed by each node in the real application scenario and the test buried point data is then analyzed.
If the processed buried point data of some node is inconsistent with the test buried point data, that node, or a node before it, is faulty.
Thus, the faulty node can be determined from the above consistency analysis result.
If a faulty node is found, in step S23 the node fault check result may be reported, for example through the data service module, to the above-mentioned defect management system and/or test request collection platform, or to a separately provided faulty node management platform, so that the relevant personnel can troubleshoot.
In theory, if the processed buried point data recorded in the logs of every log-producing node on the application server could be compared and analyzed, a very comprehensive consistency judgment would be obtained.
In practice, considering the computational workload, only some key or important nodes may be selected, and only the processed buried point data recorded in their node logs compared and analyzed.
The at least one node may include, for example, the first processing node and/or the last processing node on the application server's processing link. The processed buried point data in the log of the first processing node represents the earliest state of the buried point data uploaded by the application client to the server, while that in the log of the last processing node represents its final state.
Alternatively, the at least one node may include, for example, the SLS and/or the ODPS of the application server.
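A minimal sketch of the consistency check and fault-node localization described above, assuming per-node processed records keyed by (point ID, timestamp); the record shapes and node names are hypothetical:

```python
def find_fault_node(test_records: list, node_records: dict, node_order: list):
    """Compare the test buried point data against the processed data recorded
    in each node's log, in link order (e.g., SLS first, ODPS last). Any
    mismatch covers both the quantity aspect (lost or lagging data) and the
    detail aspect (tampered keys); returns the first disagreeing node, or
    None if all nodes are consistent."""
    def key_set(records):
        return {(r["point_id"], r["timestamp"]) for r in records}

    expected = key_set(test_records)
    for node in node_order:
        processed = key_set(node_records.get(node, []))
        if processed != expected:  # this node, or one before it, is faulty
            return node
    return None

test = [{"point_id": "btn_pay_click", "timestamp": 1}]
nodes = {"sls": [{"point_id": "btn_pay_click", "timestamp": 1}], "odps": []}
print(find_fault_node(test, nodes, ["sls", "odps"]))  # -> 'odps'
```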
7. Statistical report
Thus, at step S24, test result statistics may be sent to the first device 10 by, for example, a data service module of the test server 30.
The first device 10 receives and presents test result statistics from the test server 30 to the user (tester).
The test result statistics may include at least one of:
the ratio of buried point data conforming to the buried point rules (e.g., an accuracy rate), optionally accompanied by information on the buried points that do not conform;
the buried point test coverage rate, i.e., the ratio of the number of actually tested buried points to the number of buried points to be tested; and
the consistency judgment result, i.e., the consistency between the test buried point data and the processed buried point data processed by at least one node of the application server of the application under test.
Based on the test result statistics, the tester can improve the test cases or provide feedback to the developer or vendor of the application under test.
The test protocols according to the present disclosure are described in detail above. Broadly, this includes the following aspects.
1. A client test tool runs on the PC side of the test equipment, monitors the log on the application client, and sends the monitored buried point log to the test server.
2. The server verifies the correctness of the buried points against the buried point requirement rules, and integrates with the defect management system to provide bug tracking.
3. After the test task is stopped, the test server compares the buried point data received from the test device (the first device 10) with the buried point data at nodes such as the ODPS, verifies the data consistency of the whole link, and produces a report presenting the test results of the task.
4. After the test period ends, an overall data quality view and data coverage view are computed from the ODPS data and the buried point requirements, the coverage and quality of the testing against the buried point requirements are evaluated visually, and suggestions are given for the emphasis of subsequent test work.
Wherein:
The first aspect described above may be performed by, for example, a PC program, which may be implemented in the Python language. The program may execute the following logic: acquire the system environment and parameters of the second device; start a task; obtain a task ID from the server; monitor the log; report the log; and judge whether the task has stopped (end if stopped; otherwise keep looping over monitoring and reporting the log).
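A hedged Python sketch of that PC-side loop; the endpoints, log marker, and stop check are assumptions, not the patent's actual interfaces:

```python
import time
import requests  # third-party HTTP client (pip install requests)

TEST_SERVER = "http://test-server.example.com"  # hypothetical address

def task_stopped(task_id: str) -> bool:
    """Ask the server whether the task has been stopped (assumed endpoint)."""
    return requests.get(f"{TEST_SERVER}/tasks/{task_id}").json().get("stopped", False)

def run_client(device_log_lines, app_version: str) -> None:
    """Start a task, obtain a task ID, then loop: monitor the device log,
    report buried point lines, and stop when the task is stopped."""
    resp = requests.post(f"{TEST_SERVER}/tasks", json={"app_version": app_version})
    task_id = resp.json()["task_id"]
    for line in device_log_lines:        # e.g., lines streamed from the second device
        if "BURIED_POINT" in line:       # hypothetical marker for buried point logs
            requests.post(f"{TEST_SERVER}/tasks/{task_id}/points", json={"raw": line})
        if task_stopped(task_id):
            break
        time.sleep(0.1)                  # avoid busy-waiting between log reads
```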
The test client in the present disclosure may also be replaced by a program running independently on each end, e.g., AliOS or Android: a log monitoring program started at boot can be developed for the end and report logs to the test server, replacing the PC-side monitoring and reporting.
In the second aspect, the test server receives the log, compares it against the buried point rules in time, and verifies the correctness and legality of each service-level/system-level buried point. When a defect (bug) is detected, the defect management system is invoked to report the bug automatically. A cyclic process may be included: receive the log; acquire the buried point rules; verify the buried points; judge whether there is a bug.
In the third aspect, after the task is stopped, the server checks the SLS logs and the ODPS data source table over a period of time. The following process may be included: judge whether the task has stopped; once stopped, start acquiring the SLS log; analyze the consistency of the SLS log with the test data; obtain the ODPS data; analyze the consistency of the ODPS data with the test data; and output a statistical report.
In the fourth aspect, the coverage rate and accuracy rate of the ODPS buried points against the buried point requirements over a period of time are computed according to filter conditions such as time, system version, and project number. The following process may be included: set the filter conditions; find the ODPS data matching the filter conditions; compute the buried point coverage rate and accuracy rate against the buried point requirements; count the uncovered and incorrect buried points; and output a report.
In summary, the preferred embodiments of the present disclosure disclose a buried point full-link test scheme, which may consist of two parts: automatic full-link verification of buried point data, and buried point test enhancement.
Automatic full-link verification of buried point data:
The data acquired by the test server may comprise two parts.
The first part is the specific buried point rule information and numbers; the server compares the buried point data uploaded by the test client against these rules to obtain the verification result for the correctness and/or legality of the buried points.
The second part is the environment information (context information), such as the environment number, that the test device reports together with the buried point data. The server obtains the specific environment information from the configuration center via the environment number. Through this configuration, the log services (ODPS, Datadriver, and so on) provide the actual buried point data from the real running environment of the application under test, which is compared with the test buried point data reported by the test device to judge consistency.
Buried point data test enhancement:
After the test period ends, the server automatically computes the buried point test coverage rate and accuracy rate for the period according to screening conditions (time, system version, project number, and so on), providing a reference for subsequent buried point test work; statistics can be collected and bugs tracked automatically.
Therefore, the embodiments of the present disclosure address the problems of existing data testing schemes and provide a complete solution for full-link automated testing, automated verification, and automated analysis, so that data quality can be ensured and test efficiency improved.
The above description has been made taking, as an example, the application of the technical idea of the present disclosure in the field of data testing.
It should be appreciated that the technical concepts of the present disclosure may also be implemented in scenarios other than testing. For example, data may be verified during the actual running of an application.
Thus, the technical idea of the present disclosure may also be implemented as a data processing method.
First, buried point data generated during execution of an application and corresponding buried point identification information thereof are received from a first device.
Then, the buried point data is verified based on the buried point rule corresponding to the buried point identification information.
The verification of the buried point data may be the same or similar to that described above.
Likewise, the coverage of the buried point data may be computed, and the consistency of the buried point data analyzed, in the same or a similar manner as described above; details are not repeated here.
Thus, the verification of the buried point data can be implemented, for example, during the actual running of the application.
FIG. 6 is a schematic diagram of a computing device that may be used to implement the data testing method described above according to one embodiment of the invention.
Referring to fig. 6, a computing device 600 includes a memory 610 and a processor 620.
Processor 620 may be a multi-core processor or may include multiple processors. In some embodiments, processor 620 may include a general-purpose host processor and one or more special coprocessors, such as a graphics processing unit (GPU) or a digital signal processor (DSP). In some embodiments, processor 620 may be implemented using custom circuitry, for example an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA).
Memory 610 may include various types of storage units, such as system memory, read-only memory (ROM), and persistent storage. The ROM may store static data or instructions required by processor 620 or other modules of the computer. The persistent storage may be a readable and writable storage device, and may be non-volatile, so that stored instructions and data are not lost even after the computer is powered down. In some embodiments, the persistent storage employs a mass storage device (e.g., a magnetic or optical disk, or flash memory). In other embodiments, the persistent storage may be a removable storage device (e.g., a diskette or an optical drive). The system memory may be a readable and writable volatile memory device, such as dynamic random access memory, and may store the instructions and data needed by some or all of the processors at runtime. Furthermore, memory 610 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory), magnetic disks, and/or optical disks. In some implementations, memory 610 may include readable and/or writable removable storage devices, such as compact discs (CDs), digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), read-only Blu-ray discs, super-density discs, flash memory cards (e.g., SD cards, mini SD cards, micro-SD cards), magnetic floppy disks, and the like. Computer-readable storage media do not contain carrier waves or transient electronic signals transmitted wirelessly or over wires.
The memory 610 has stored thereon executable code that, when processed by the processor 620, causes the processor 620 to perform the data testing method described above.
The data testing scheme according to the present invention has been described in detail above with reference to the accompanying drawings.
Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for performing the steps defined in the above-mentioned method of the invention.
Alternatively, the invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the steps of the above-described method according to the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of embodiments of the invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (23)

1. A test method, comprising:
receiving, from a first device, buried point data generated during testing of an application under test and its corresponding buried point identification information; and
verifying the buried point data based on the buried point rule corresponding to the buried point identification information, the buried point rule comprising at least one of a trigger condition of the buried point, a format of the buried point data, and a value range of the buried point data;
the method further comprising: acquiring processed buried point data processed by at least one node of an application server of the application under test; and
analyzing the consistency between the buried point data and the processed buried point data corresponding to the same buried point identification information.
2. The test method of claim 1, wherein the step of verifying the buried point data comprises:
checking whether the buried point data conforms to the buried point rule.
3. The test method of claim 2, further comprising:
in response to detecting that the buried point data does not conform to the buried point rule, reporting information that the buried point data does not conform to the buried point rule to a defect management system.
4. The test method of claim 1, further comprising:
returning a verification result of the buried point data to the first device; and
receiving, from the first device, the verification result as adjusted by a user.
5. The test method of claim 1, further comprising:
acquiring buried point test requirements; and
determining the buried point rule of each buried point based on the buried point test requirements.
6. The test method of claim 5, wherein
the buried point test requirements are obtained from a test request collection platform.
7. The test method of claim 5, further comprising:
storing a buried point rule set, wherein the buried point rule set comprises the buried point rules of all buried points involved in the buried point test requirements.
8. The test method of claim 1, further comprising:
storing the buried point data; and
counting a buried point test coverage rate based on the stored buried point data and buried point test requirements.
9. The test method of claim 1, wherein
the processed buried point data is obtained from a node log of the at least one node.
10. The test method of claim 1, further comprising:
determining a faulty node based on the consistency analysis result.
11. The test method of claim 1, wherein
the at least one node comprises a first processing node and/or a last processing node on the processing link of the application server.
12. The test method of claim 1, wherein
the at least one node comprises a Simple Log Service (SLS) and/or an Open Data Processing Service (ODPS).
13. The test method according to any one of claims 1 to 12, further comprising:
transmitting test result statistics to the first device, the test result statistics including at least one of:
the ratio of the buried point data conforming to the buried point rule;
buried point test coverage; and
the consistency between the buried point data and the processed buried point data processed by at least one node of the application server of the application under test.
14. The test method according to any one of claims 1 to 12, wherein
the buried point identification information comprises at least one of a buried point identifier, a version number of the application under test, a timestamp, and a buried point trigger condition.
15. A test method performed on a first device, the first device being connected to a second device, the second device having an application under test installed or running thereon, the method comprising:
controlling the second device to execute a test case for the tested application;
monitoring buried points encountered in the test case execution process to obtain buried point data; and
the obtained buried point data and the corresponding buried point identification information thereof are sent to a test server;
responding to a query request of a user for a test task, and sending the query request to a test server; and
receiving and presenting a verification result about the buried point data returned by the test server to a user; the test server is used for checking the buried point data based on the buried point rule corresponding to the buried point identification information; acquiring processed buried data processed by at least one node of an application server of the tested application; and analyzing consistency of the buried point data and the processed buried point data corresponding to the same buried point identification information; the buried point rule comprises at least one of a triggering condition of a buried point, a format of buried point data and a value range of the buried point data;
the method further comprising: communicating with the test server to establish a test task; and
receiving a task identifier returned by the test server,
wherein the obtained buried point data is sent to the test server in association with the task identifier, and the query request includes the task identifier.
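The task lifecycle of claim 15 (establish a task, receive a task identifier, report buried point data under that identifier, and query results with it) could look roughly as follows on the first device. The HTTP transport, endpoint paths, and payload fields are hypothetical; the claims specify none of them.

```python
import requests  # the transport is an assumption; the patent names no protocol

TEST_SERVER = "http://test-server.example"  # hypothetical address


def establish_task(app_version):
    """Establish a test task; the test server returns a task identifier."""
    resp = requests.post(f"{TEST_SERVER}/tasks", json={"app_version": app_version})
    resp.raise_for_status()
    return resp.json()["task_id"]


def report_buried_point(task_id, point_id, data):
    """Send captured buried point data in association with the task identifier."""
    resp = requests.post(
        f"{TEST_SERVER}/tasks/{task_id}/points",
        json={"point_id": point_id, "data": data},
    )
    resp.raise_for_status()


def query_results(task_id):
    """Query verification results; the query request carries the task identifier."""
    resp = requests.get(f"{TEST_SERVER}/tasks/{task_id}/results")
    resp.raise_for_status()
    return resp.json()
```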
16. The test method according to claim 15, wherein,
the buried point identification information comprises at least one of a buried point identifier, a version number of the tested application, a timestamp, and a buried point trigger condition.
17. The method of testing of claim 15, further comprising:
adjusting the verification result in response to a user's operation on the presented verification result; and
returning the adjusted verification result to the test server.
18. The method of testing of claim 15, further comprising:
receiving test result statistics from the test server and presenting them, wherein the test result statistics comprise at least one of the following:
the ratio of the buried point data conforming to the buried point rule;
buried point test coverage; and
the consistency of the buried point data with the processed buried point data processed by at least one node of the application server of the tested application.
19. A method of data processing, comprising:
receiving, from a first device, buried point data generated during execution of an application and its corresponding buried point identification information; and
verifying the buried point data based on the buried point rule corresponding to the buried point identification information, wherein the buried point rule comprises at least one of a trigger condition of a buried point, a format of the buried point data, and a value range of the buried point data;
the method further comprising: acquiring processed buried point data processed by at least one node of an application server of the application under test; and
analyzing the consistency of the buried point data and the processed buried point data corresponding to the same buried point identification information.
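The consistency analysis of claim 19 pairs device-side records with records recovered from a node log by their buried point identification information and compares them. A minimal sketch, assuming the identification information reduces to a single key and that payload equality is the consistency criterion:

```python
# Sketch of the consistency analysis: group device-side and node-side
# records by buried point identification info and compare payloads.
# Keying on "point_id" alone and comparing with == are simplifications.
def analyze_consistency(device_records, node_records):
    node_by_id = {r["point_id"]: r for r in node_records}
    report = {}
    for record in device_records:
        node_record = node_by_id.get(record["point_id"])
        if node_record is None:
            report[record["point_id"]] = "missing at node"  # lost on the link
        elif node_record["data"] != record["data"]:
            report[record["point_id"]] = "data mismatch"    # altered in flight
        else:
            report[record["point_id"]] = "consistent"
    return report


print(analyze_consistency(
    device_records=[{"point_id": "click_home", "data": {"v": 1}}],
    node_records=[{"point_id": "click_home", "data": {"v": 1}}],
))  # {'click_home': 'consistent'}
```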
20. A test server, comprising:
the receiving device is used for receiving, from a first device, buried point data generated in the test process of a tested application and its corresponding buried point identification information;
the verification device is used for verifying the buried point data based on the buried point rule corresponding to the buried point identification information, wherein the buried point rule comprises at least one of a trigger condition of a buried point, a format of the buried point data, and a value range of the buried point data; and
the consistency analysis device is used for acquiring processed buried point data processed by at least one node of the application server of the tested application, and
for analyzing the consistency of the buried point data and the processed buried point data corresponding to the same buried point identification information.
21. A device for performing a test, the device being connected to a second device on which an application under test is installed or run, the device comprising:
the measurement and control device is used for controlling the second device to execute a test case for the tested application;
the monitoring device is used for monitoring buried points encountered in the execution process of the test case to acquire buried point data; and
the transmitting device is used for transmitting the acquired buried point data and its corresponding buried point identification information to the test server, and is further used for sending, in response to a user's query request for a test task, the query request to the test server, and for
receiving a verification result about the buried point data returned by the test server and presenting it to the user, wherein the test server is used for verifying the buried point data based on the buried point rule corresponding to the buried point identification information, for acquiring processed buried point data processed by at least one node of the application server of the tested application, and for analyzing the consistency of the buried point data and the processed buried point data corresponding to the same buried point identification information, the buried point rule comprising at least one of a trigger condition of a buried point, a format of the buried point data, and a value range of the buried point data;
the device is further used for communicating with the test server to establish a test task and for
receiving a task identifier returned by the test server,
wherein the acquired buried point data is sent to the test server in association with the task identifier, and the query request includes the task identifier.
22. A computing device, comprising:
a processor; and
a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the method of any one of claims 1 to 19.
23. A non-transitory machine-readable storage medium having stored thereon executable code, which when executed by a processor of an electronic device, causes the processor to perform the method of any of claims 1 to 19.
CN201910091351.0A 2019-01-30 2019-01-30 Test method, system, device, server and storage medium Active CN111506489B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910091351.0A CN111506489B (en) 2019-01-30 2019-01-30 Test method, system, device, server and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910091351.0A CN111506489B (en) 2019-01-30 2019-01-30 Test method, system, device, server and storage medium

Publications (2)

Publication Number Publication Date
CN111506489A CN111506489A (en) 2020-08-07
CN111506489B (en) 2023-05-30

Family

ID=71877332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910091351.0A Active CN111506489B (en) 2019-01-30 2019-01-30 Test method, system, device, server and storage medium

Country Status (1)

Country Link
CN (1) CN111506489B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112181821A (en) * 2020-09-24 2021-01-05 浙江大搜车软件技术有限公司 Interface test coverage detection method and device, electronic device and storage medium
CN113448834A (en) * 2020-09-25 2021-09-28 北京新氧科技有限公司 Buried point testing method and device, electronic equipment and storage medium
CN112214407B (en) * 2020-10-10 2024-08-06 广州方硅信息技术有限公司 Data verification control method, data verification execution method, corresponding device, equipment and medium
CN113850987B (en) * 2020-12-11 2023-04-28 广东朝歌智慧互联科技有限公司 System for detecting product quality
CN112603292A (en) * 2020-12-22 2021-04-06 华南理工大学 Phase selection method of surface electromyographic signals for lower limb actions
CN112925725B (en) * 2021-04-09 2024-03-15 网易(杭州)网络有限公司 Data testing method and device, readable storage medium and electronic equipment
CN113254335B (en) * 2021-05-20 2024-04-16 北京达佳互联信息技术有限公司 Test data processing method and device, server and storage medium
CN113300912B (en) * 2021-05-21 2022-07-26 湖南快乐阳光互动娱乐传媒有限公司 Equipment testing method and device and electronic equipment
CN113645342B (en) * 2021-08-03 2022-05-17 杭银消费金融股份有限公司 Mobile phone mobile terminal testing device and method thereof
CN113642047A (en) * 2021-08-13 2021-11-12 上海哔哩哔哩科技有限公司 Buried point data verification method and system
CN113673997A (en) * 2021-08-27 2021-11-19 深圳鼎盛电脑科技有限公司 Visualized processing method, device, equipment and medium of fund calculation engine
CN114064504A (en) * 2021-11-25 2022-02-18 杭州网易云音乐科技有限公司 Detection method, device, medium and computing device for full-link stress test data isolation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7003420B2 (en) * 2003-10-31 2006-02-21 International Business Machines Corporation Late binding of variables during test case generation for hardware and software design verification

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104348650A (en) * 2013-08-05 2015-02-11 腾讯科技(深圳)有限公司 Website monitoring method, business device and website monitoring system
WO2017084508A1 (en) * 2015-11-17 2017-05-26 阿里巴巴集团控股有限公司 Method and device for automatically burying points
CN106571949A (en) * 2016-09-23 2017-04-19 北京五八信息技术有限公司 Event tracking point processing method and apparatus
CN107688530A (en) * 2017-04-06 2018-02-13 平安科技(深圳)有限公司 Software testing method and device
CN107133124A (en) * 2017-04-28 2017-09-05 努比亚技术有限公司 Repair method, data processing device and storage medium for data not conforming to rules
CN107196788A (en) * 2017-05-02 2017-09-22 阿里巴巴集团控股有限公司 Buried point data processing method, device, server and client
CN107870860A (en) * 2017-05-05 2018-04-03 平安科技(深圳)有限公司 Buried point verification system and method
CN107832216A (en) * 2017-11-08 2018-03-23 无线生活(杭州)信息科技有限公司 Buried point test method and device
CN108319552A (en) * 2018-02-07 2018-07-24 优信数享(北京)信息技术有限公司 Buried point test method, device and system
CN108519862A (en) * 2018-03-30 2018-09-11 百度在线网络技术(北京)有限公司 Storage method, device, system and storage medium for a blockchain system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Xu; Zhai Yinglin. Research on test and evaluation of APP interaction design based on a data analysis platform. Packaging Engineering. 2018, (02), full text. *

Also Published As

Publication number Publication date
CN111506489A (en) 2020-08-07

Similar Documents

Publication Publication Date Title
CN111506489B (en) Test method, system, device, server and storage medium
CN110046073B (en) Log collection method and device, equipment and storage medium
CN109995614B (en) Alpha testing method and device
CN110088744A (en) A kind of database maintenance method and its system
CN105068935B (en) Method and device for processing software test result
CN112163198B (en) Host login security detection method, system, device and storage medium
CN111651358A (en) Method for generating test case, software testing method, device and server
KR20150025106A (en) Verification apparatus, terminal device, system, method and computer-readable medium for monitoring of application verification result
CN111666193B (en) Method and system for monitoring and testing terminal function based on real-time log analysis
CN110990289A (en) Method and device for automatically submitting bug, electronic equipment and storage medium
CN107102938B (en) Test script updating method and device
CN116909800A (en) Method and device for locating crash information and storage medium
CN114860619B (en) Database audit program regression testing method and device
CN113538725B (en) Method for testing hardware products and related equipment
CN115373984A (en) Code coverage rate determining method and device
CN115757138A (en) Method and device for determining script abnormal reason, storage medium and electronic equipment
CN115168217A (en) Defect discovery method and device for source code file
CN107797915B (en) Fault repairing method, device and system
CN105701002A (en) Test based execution path recording method and apparatus
CN111061687B (en) Abnormal data positioning method, device and system
CN112506749B (en) On-site distinguishing method and system for error reporting information of hard disk
CN116633664B (en) Evaluation system for network security monitoring
CN116132121B (en) Feature recognition performance analysis method
CN116991724A (en) Interface testing method and device based on monitoring log, electronic equipment and storage medium
CN118152167A (en) Method and device for generating fault maintenance function, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201130

Address after: Room 603, 6/F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, China

Applicant after: Zebra smart travel network (Hong Kong) Ltd.

Address before: A four-storey 847 mailbox in Grand Cayman Capital Building, British Cayman Islands

Applicant before: Alibaba Group Holding Ltd.

GR01 Patent grant