CN110908888A - Server testing method and device

Server testing method and device

Info

Publication number
CN110908888A
CN110908888A
Authority
CN
China
Prior art keywords
test
server
return value
query statement
tested
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811080414.4A
Other languages
Chinese (zh)
Other versions
CN110908888B (en)
Inventor
叶思
欧阳灿
熊伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201811080414.4A
Publication of CN110908888A
Application granted
Publication of CN110908888B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/07 Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F11/0703 Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
    • G06F11/0766 Error or fault reporting or storing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/07 Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F11/0703 Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
    • G06F11/079 Root cause analysis, i.e. error or fault diagnosis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3692 Test management for test results analysis
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention provides a server testing method and apparatus, wherein the method includes: sending test data to at least one test device corresponding to a server to be tested, the test data including a test case set, verification specifications corresponding to the return values of the test cases in the set, and destination location information; sending a test instruction to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications; and, when the destination location information matches the apparatus's own location information, acquiring the corresponding error log for analysis. Test data can thus be deployed to each test device automatically and the error logs collected for analysis automatically, which improves server testing efficiency and reduces testing cost.

Description

Server testing method and device
Technical Field
The invention relates to the technical field of data processing, in particular to a server testing method and device.
Background
At present, a server is tested by manually sorting out the different input parameters of each server interface, generating test cases from those parameters to exercise the interface, manually collecting the test logs, verifying the logs, and judging whether the interface is faulty. In addition, the SQL statements in the server program are checked manually to determine whether they are erroneous. Because this testing method is largely manual, it has high labor cost, long testing time, and low testing efficiency.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, a first object of the present invention is to provide a server testing method, which solves the problems of high testing cost and low testing efficiency.
A second object of the present invention is to provide a server testing apparatus.
A third object of the present invention is to provide another server testing apparatus.
A fourth object of the invention is to propose a non-transitory computer-readable storage medium.
A fifth object of the invention is to propose a computer program product.
In order to achieve the above object, an embodiment of a first aspect of the present invention provides a server testing method, including:
sending test data to at least one test device corresponding to a server to be tested; the test data includes: a test case set, verification specifications corresponding to the return values of the test cases in the test case set, and destination location information;
sending a test instruction to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications;
and when the destination location information matches the location information of the apparatus itself, acquiring the corresponding error log so as to analyze the error log.
Further, the verification specification corresponding to the return value defines a preset condition that each field in the return value needs to satisfy;
correspondingly, the error log is generated as follows:
judging, for the return value of each test case, whether the return value contains any field that does not satisfy its corresponding preset condition;
if such a field exists, determining the return value to be an error return value;
and generating the error log according to the error return values.
Further, the test case set includes: a test case subset for each interface of the server to be tested;
each test case subset includes: test cases generated by combining the values of the parameters of the corresponding interface.
Further, the method further comprises the following steps:
acquiring a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested;
for each query statement in the query statement set, acquiring the verification specification corresponding to the query statement;
and comparing each query statement with its corresponding verification specification to obtain the query statements that do not meet the specification, so that those query statements can be analyzed.
Further, before sending the test data to the at least one test device corresponding to the server to be tested, the method further includes:
acquiring current environment information of the server to be tested; the environment information indicates any one of the following environments: a test environment, a sandbox environment, and an online environment;
correspondingly, the sending test data to at least one test device corresponding to the server to be tested includes:
sending the test data corresponding to the current environment information to the at least one test device corresponding to the server to be tested.
Further, the data format of the return value is the JSON format, and the verification specification corresponding to the return value is a schema verification specification.
In the server testing method of the embodiment of the present invention, test data is sent to at least one test device corresponding to a server to be tested, the test data including a test case set, verification specifications corresponding to the return values of the test cases in the set, and destination location information; a test instruction is sent to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications; and, when the destination location information matches the apparatus's own location information, the corresponding error log is acquired for analysis. Test data can thus be deployed to each test device automatically and the error logs collected for analysis automatically, which improves server testing efficiency and reduces testing cost.
In order to achieve the above object, a second embodiment of the present invention provides a server testing apparatus, including:
a deployment module, configured to send test data to at least one test device corresponding to a server to be tested; the test data includes: a test case set, verification specifications corresponding to the return values of the test cases in the test case set, and destination location information;
a test module, configured to send a test instruction to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications;
and a log regression module, configured to acquire the corresponding error log when the destination location information matches the location information of the apparatus itself, so as to analyze the error log.
Further, the verification specification corresponding to the return value defines a preset condition that each field in the return value needs to satisfy;
correspondingly, the error log is generated as follows:
judging, for the return value of each test case, whether the return value contains any field that does not satisfy its corresponding preset condition;
if such a field exists, determining the return value to be an error return value;
and generating the error log according to the error return values.
Further, the test case set includes: a test case subset for each interface of the server to be tested;
each test case subset includes: test cases generated by combining the values of the parameters of the corresponding interface.
Further, the apparatus further includes: a first acquisition module and a comparison module;
the first acquisition module is used for acquiring the query statement set corresponding to the server to be tested; the query statement set comprises: each query statement in the executive program of the server to be tested;
the first obtaining module is further configured to obtain, for each query statement in the query statement set, a verification specification corresponding to the query statement;
and the comparison module is used for comparing the query statement with the corresponding check specification to obtain the query statement which does not meet the check specification so as to analyze the query statement which does not meet the check specification.
Further, the device further comprises: a second acquisition module;
the second obtaining module is used for obtaining the current environmental information of the server to be tested; the environment information is any one of the following environments: testing environment, sandbox environment and online environment;
correspondingly, the deployment module is specifically configured to send test data corresponding to the current environment information to at least one test device corresponding to the server to be tested.
Further, the data format of the return value is the JSON format, and the verification specification corresponding to the return value is a schema verification specification.
In the server testing apparatus of the embodiment of the present invention, test data is sent to at least one test device corresponding to a server to be tested, the test data including a test case set, verification specifications corresponding to the return values of the test cases in the set, and destination location information; a test instruction is sent to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications; and, when the destination location information matches the apparatus's own location information, the corresponding error log is acquired for analysis. Test data can thus be deployed to each test device automatically and the error logs collected for analysis automatically, which improves server testing efficiency and reduces testing cost.
In order to achieve the above object, a third embodiment of the present invention provides another server testing apparatus, including: memory, processor and computer program stored on the memory and executable on the processor, characterized in that the processor implements the server testing method as described above when executing the program.
In order to achieve the above object, a fourth aspect of the present invention provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the server testing method as described above.
In order to achieve the above object, a fifth-aspect embodiment of the present invention provides a computer program product, wherein when instructions in the computer program product are executed by a processor, the server testing method as described above is implemented.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flowchart of a server testing method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of another server testing method according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a server testing apparatus according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of another server testing apparatus according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of another server testing apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of another server testing apparatus according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
A server test method and apparatus according to an embodiment of the present invention will be described below with reference to the drawings.
Fig. 1 is a schematic flowchart of a server testing method according to an embodiment of the present invention. As shown in fig. 1, the server testing method includes the following steps:
s101, sending test data to at least one test device corresponding to a server to be tested; the test data includes: the test case set, the verification specifications corresponding to the return values of the test cases in the test case set, and the destination position information.
The execution subject of the server testing method provided by the present invention is a server testing apparatus, which may be a hardware device such as a terminal device, a server, or an automated testing platform, or software installed on such a hardware device. In this embodiment, the server testing apparatus is described by taking an automated testing platform as an example.
In this embodiment, at least one test device refers to a device that executes a test case to call a server interface to obtain an interface return value. The testing device may be a mobile terminal or a server. It should be noted that the automated testing platform may be a server cluster, the testing device may be integrated in the automated testing platform, and the automated testing platform deploys the test cases to the testing device and controls the testing device to perform testing.
In this embodiment, the test case set may include: a test case subset for all interfaces, or for some interfaces, of the server to be tested; each test case subset includes test cases generated by combining the values of the parameters of the corresponding interface. The automated testing platform may deploy a test case subset covering all interfaces of the server to be tested on every test device; alternatively, the automated testing platform may partition the interfaces of the server to be tested and deploy the test case subsets of different interfaces to different test devices. A minimal sketch of such parameter combination is shown below.
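As an illustration only, combining all values of all parameters of an interface into a test case subset could be sketched as follows; the interface path and parameter values are hypothetical placeholders, not taken from this disclosure.

```python
# Hypothetical sketch: build one test case per combination of parameter values.
from itertools import product

def build_test_cases(interface, param_values):
    """param_values maps each parameter name to the list of values to cover."""
    names = list(param_values)
    return [
        {"interface": interface, "params": dict(zip(names, combo))}
        for combo in product(*(param_values[n] for n in names))
    ]

# Example: an assumed /user/query interface with two parameters.
subset = build_test_cases("/user/query", {
    "uid": [0, 1, 99999999],   # boundary and typical values
    "detail": [True, False],
})
assert len(subset) == 3 * 2    # one test case per value combination
```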
In this embodiment, the data format of the return value of a test case may be the JSON format, and the verification specification corresponding to the return value may be a schema verification specification. A schema is a means of constraining a JSON document; schema verification is particularly effective when an interface return value contains many fields but each field takes only a limited set of types. The verification specification corresponding to the return value may also be another kind of specification, set according to actual needs. A sketch of schema-based verification follows.
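As an illustration only, checking a JSON return value against a schema verification specification might look like the following sketch, using the third-party jsonschema package; the schema and return value here are assumptions, not taken from this disclosure.

```python
# Hypothetical sketch: validate a JSON return value against a schema.
from jsonschema import Draft7Validator

schema = {
    "type": "object",
    "properties": {
        "errno": {"type": "integer", "enum": [0]},  # preset condition: errno must be 0
        "data": {"type": "object"},
    },
    "required": ["errno", "data"],
}

return_value = {"errno": 1, "data": {}}

# Collect every field that violates its preset condition.
for err in Draft7Validator(schema).iter_errors(return_value):
    print(list(err.absolute_path), err.message)  # e.g. ['errno'] 1 is not one of [0]
```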
In this embodiment, the destination location information is the location information of the destination device on which the error logs are to be analyzed. The location information of the destination device may be, for example, an identifier or an IP address of the destination device. The destination device may be a device used by a specific tester; by setting the destination location information, the error logs produced on the test devices can be gathered to that tester for analysis, so that the tester no longer collects the error logs manually, which further improves server testing efficiency and further reduces testing cost.
In addition, it should be noted that the test data sent by the automated testing platform to each test device may omit the verification specifications, with the destination location information set to the location information of the automated testing platform itself. In that case, each test device sends its test results, including the return values, directly to the automated testing platform; the platform generates the test logs from the test results and the verification specifications, and distributes the logs to the testers for analysis.
S102, sending a test instruction to at least one test device to enable the at least one test device to test each interface of the server to be tested according to the test case set, obtaining a return value, and generating an error log according to the return value and a corresponding verification specification.
In this embodiment, the verification specification corresponding to the return value defines a preset condition that each field in the return value needs to satisfy. Correspondingly, the error log is generated as follows: for the return value of each test case, judging whether the return value contains any field that does not satisfy its corresponding preset condition; if such a field exists, determining the return value to be an error return value; and generating the error log according to the error return values.
The error log may include: the error return value, together with the corresponding interface and test case. The error log may further include: the erroneous fields in the error return value, and the like. In addition, if every field in a return value satisfies its corresponding preset condition, the return value is determined to be a correct return value. A sketch of this flow is given below.
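As an illustration only, the judge-classify-log flow described above could be sketched as follows; the field names, preset conditions, and record layout are assumptions, not taken from this disclosure.

```python
# Hypothetical sketch: mark a return value as erroneous when any field
# fails its preset condition, and append a record to the error log.
preset_conditions = {                 # field -> predicate it must satisfy
    "errno": lambda v: v == 0,
    "msg":   lambda v: isinstance(v, str),
}

def check_return_value(interface, case_id, return_value, error_log):
    bad_fields = [f for f, cond in preset_conditions.items()
                  if not cond(return_value.get(f))]
    if bad_fields:                    # at least one field fails: error return value
        error_log.append({
            "interface": interface,
            "test_case": case_id,
            "return_value": return_value,
            "error_fields": bad_fields,
        })
        return False
    return True                       # all fields satisfied: correct return value

error_log = []
check_return_value("/user/query", "case-001", {"errno": 2, "msg": None}, error_log)
```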
S103, when the destination location information matches the location information of the apparatus itself, acquiring the corresponding error log so as to analyze the error log.
In this embodiment, the apparatus's own location information is the location information of the server testing apparatus. Taking the server testing apparatus being an automated testing platform as an example, its own location information is the location information of the automated testing platform. When the destination location information matches the location information of the automated testing platform, the platform itself is the destination device on which the error logs are to be analyzed, so the platform acquires the corresponding error logs and analyzes them. When the destination location information does not match the platform's own location information, the device on which the error logs are to be analyzed is not the automated testing platform but some other device, and that other device acquires the corresponding error logs and analyzes them.
In this embodiment, by setting the destination location information, the error logs generated on each test device can be returned to one or more testers for analysis, so that the testers no longer collect the error logs manually; a sketch of this step follows.
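As an illustration only, the match-then-fetch step of S103 might be sketched as follows, treating the location information as an IP address; logs_for_device and the record layout are assumptions, not taken from this disclosure.

```python
# Hypothetical sketch: a device keeps only the error logs whose destination
# location (an IP address here) matches its own address.
import socket

def logs_for_device(all_error_logs):
    own_ip = socket.gethostbyname(socket.gethostname())  # this device's location
    return [log for log in all_error_logs
            if log.get("destination_ip") == own_ip]
```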
Further, on the basis of the foregoing embodiment, before step 101, the method may further include: acquiring current environment information of the server to be tested, the environment information indicating any one of the following environments: a test environment, a sandbox environment, and an online environment. Correspondingly, step 101 may specifically be: sending the test data corresponding to the current environment information to the at least one test device corresponding to the server to be tested.
The purpose of testing the server to be tested differs between environments such as the test environment, the sandbox environment, and the online environment, and the corresponding test cases differ accordingly; therefore, different test data needs to be deployed to the at least one test device corresponding to the server to be tested according to the environment information, as the sketch below illustrates.
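As an illustration only, environment-specific selection of test data might be a simple lookup like the following; the file names are hypothetical placeholders.

```python
# Hypothetical sketch: pick the test data that matches the current environment.
TEST_DATA_BY_ENV = {
    "test":    {"case_set": "cases_test.json",    "schemas": "schemas_test.json"},
    "sandbox": {"case_set": "cases_sandbox.json", "schemas": "schemas_sandbox.json"},
    "online":  {"case_set": "cases_online.json",  "schemas": "schemas_online.json"},
}

def select_test_data(environment: str) -> dict:
    return TEST_DATA_BY_ENV[environment]  # raises KeyError for unknown environments
```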
In the server testing method of the embodiment of the present invention, test data is sent to at least one test device corresponding to a server to be tested, the test data including a test case set, verification specifications corresponding to the return values of the test cases in the set, and destination location information; a test instruction is sent to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications; and, when the destination location information matches the apparatus's own location information, the corresponding error log is acquired for analysis. Test data can thus be deployed to each test device automatically and the error logs collected for analysis automatically, which improves server testing efficiency and reduces testing cost.
Fig. 2 is a schematic flowchart of another server testing method according to an embodiment of the present invention. As shown in fig. 2, based on the embodiment shown in fig. 1, the method may further include the following steps:
s104, acquiring a query statement set corresponding to a server to be tested; the query statement set comprises: and each query statement in the executive program of the server to be tested.
The query statement set may be extracted from the execution program by a tester; alternatively, key fields of query statements may be obtained in advance, the execution program searched for statements containing those key fields, and the statements containing the key fields determined to be the query statements. A query statement may be, for example, a Structured Query Language (SQL) statement. A sketch of such key-field scanning follows.
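As an illustration only, collecting the query statement set by scanning the program source for key fields might look like the following sketch; the key-field list and quoting convention are assumptions, not taken from this disclosure.

```python
# Hypothetical sketch: treat quoted strings that start with an SQL key field
# as the query statements of the execution program.
import re

SQL_KEY_FIELDS = re.compile(r'"\s*(SELECT|INSERT|UPDATE|DELETE)\b[^"]*"', re.IGNORECASE)

def collect_query_statements(source_path):
    with open(source_path, encoding="utf-8") as f:
        source = f.read()
    return [match.group(0).strip('"') for match in SQL_KEY_FIELDS.finditer(source)]
```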
S105, for each query statement in the query statement set, acquiring the verification specification corresponding to the query statement.
The verification specification corresponding to a query statement defines the conditions that each field of the query statement needs to satisfy.
S106, comparing each query statement with its corresponding verification specification to obtain the query statements that do not meet the specification, so that those query statements can be analyzed.
In this embodiment, by acquiring the query statement set from the execution program of the server to be tested and judging whether each query statement in the set meets its corresponding verification specification, the query statements that do not meet their specifications can be detected and handled. This resolves potential hazards such as slow queries and improves the execution efficiency of the execution program of the server to be tested; the sketch below shows one possible form of such a check.
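As an illustration only, comparing query statements against a verification specification might reduce to rule checks like the following; the two rules stand in for whatever conditions a real specification would define.

```python
# Hypothetical sketch: flag query statements that violate simple rules
# associated with slow-query or safety hazards.
import re

RULES = [
    (re.compile(r"\bSELECT\s+\*", re.IGNORECASE),
     "SELECT * fetches unneeded columns (slow-query risk)"),
    (re.compile(r"\b(UPDATE|DELETE)\b(?!.*\bWHERE\b)", re.IGNORECASE | re.DOTALL),
     "UPDATE/DELETE without WHERE affects every row"),
]

def check_queries(statements):
    return [(stmt, message) for stmt in statements
            for pattern, message in RULES if pattern.search(stmt)]

violations = check_queries(["SELECT * FROM users", "DELETE FROM logs"])
```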
Fig. 3 is a schematic structural diagram of a server testing apparatus according to an embodiment of the present invention. As shown in fig. 3, includes: a deployment module 31, a testing module 32 and a log regression module 33.
The deployment module 31 is configured to send test data to at least one test device corresponding to a server to be tested; the test data includes: a test case set, verification specifications corresponding to the return values of the test cases in the test case set, and destination location information;
the test module 32 is configured to send a test instruction to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications;
and the log regression module 33 is configured to acquire the corresponding error log when the destination location information matches the location information of the apparatus itself, so as to analyze the error log.
The server testing device provided by the invention can be hardware equipment such as terminal equipment, a server and an automatic testing platform, or software installed on the hardware equipment. In this embodiment, a server testing apparatus is taken as an example of an automated testing platform for description.
In this embodiment, at least one test device refers to a device that executes a test case to call a server interface to obtain an interface return value. The testing device may be a mobile terminal or a server. It should be noted that the automated testing platform may be a server cluster, the testing device may be integrated in the automated testing platform, and the automated testing platform deploys the test cases to the testing device and controls the testing device to perform testing.
In this embodiment, the test case set may include: a test case subset for all interfaces, or for some interfaces, of the server to be tested; each test case subset includes test cases generated by combining the values of the parameters of the corresponding interface. The automated testing platform may deploy a test case subset covering all interfaces of the server to be tested on every test device; alternatively, the automated testing platform may partition the interfaces of the server to be tested and deploy the test case subsets of different interfaces to different test devices.
In this embodiment, the data format of the return value of a test case may be the JSON format, and the verification specification corresponding to the return value may be a schema verification specification. A schema is a means of constraining a JSON document; schema verification is particularly effective when an interface return value contains many fields but each field takes only a limited set of types. The verification specification corresponding to the return value may also be another kind of specification, set according to actual needs.
In this embodiment, the destination location information is the location information of the destination device on which the error logs are to be analyzed. The location information of the destination device may be, for example, an identifier or an IP address of the destination device. The destination device may be a device used by a specific tester; by setting the destination location information, the error logs produced on the test devices can be gathered to that tester for analysis, so that the tester no longer collects the error logs manually, which further improves server testing efficiency and further reduces testing cost.
In this embodiment, the apparatus's own location information is the location information of the server testing apparatus. Taking the server testing apparatus being an automated testing platform as an example, its own location information is the location information of the automated testing platform. When the destination location information matches the location information of the automated testing platform, the platform itself is the destination device on which the error logs are to be analyzed, so the platform acquires the corresponding error logs and analyzes them. When the destination location information does not match the platform's own location information, the device on which the error logs are to be analyzed is not the automated testing platform but some other device, and that other device acquires the corresponding error logs and analyzes them.
In addition, it should be noted that the test data sent by the automated testing platform to each test device may omit the verification specifications, with the destination location information set to the location information of the automated testing platform itself. In that case, each test device sends its test results, including the return values, directly to the automated testing platform; the platform generates the test logs from the test results and the verification specifications, and distributes the logs to the testers for analysis.
Further, on the basis of the above embodiment, the verification specification corresponding to the return value defines a preset condition that each field in the return value needs to satisfy. Correspondingly, the error log is generated as follows: for the return value of each test case, judging whether the return value contains any field that does not satisfy its corresponding preset condition; if such a field exists, determining the return value to be an error return value; and generating the error log according to the error return values.
The error log may include: the error return value, together with the corresponding interface and test case. The error log may further include: the erroneous fields in the error return value, and the like. In addition, if every field in a return value satisfies its corresponding preset condition, the return value is determined to be a correct return value.
Further, with reference to fig. 4, on the basis of the embodiment shown in fig. 3, the apparatus may further include: a second acquisition module 34, configured to acquire current environment information of the server to be tested; the environment information indicates any one of the following environments: a test environment, a sandbox environment, and an online environment;
correspondingly, the deployment module 31 is specifically configured to send test data corresponding to the current environment information to at least one test device corresponding to the server to be tested.
The purpose of testing the server to be tested differs between environments such as the test environment, the sandbox environment, and the online environment, and the corresponding test cases differ accordingly; therefore, different test data needs to be deployed to the at least one test device corresponding to the server to be tested according to the environment information.
In the server testing apparatus of the embodiment of the present invention, test data is sent to at least one test device corresponding to a server to be tested, the test data including a test case set, verification specifications corresponding to the return values of the test cases in the set, and destination location information; a test instruction is sent to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications; and, when the destination location information matches the apparatus's own location information, the corresponding error log is acquired for analysis. Test data can thus be deployed to each test device automatically and the error logs collected for analysis automatically, which improves server testing efficiency and reduces testing cost.
Further, with reference to fig. 5, on the basis of the embodiment shown in fig. 3, the apparatus may further include: a first acquisition module 35 and a comparison module 36.
The first acquisition module 35 is configured to acquire a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested;
the first acquisition module 35 is further configured to acquire, for each query statement in the query statement set, the verification specification corresponding to the query statement;
and the comparison module 36 is configured to compare each query statement with its corresponding verification specification to obtain the query statements that do not meet the specification, so that those query statements can be analyzed.
The query statement set may be extracted from the execution program by a tester; alternatively, key fields of query statements may be obtained in advance, the execution program searched for statements containing those key fields, and the statements containing the key fields determined to be the query statements. A query statement may be, for example, a Structured Query Language (SQL) statement.
In this embodiment, by acquiring the query statement set from the execution program of the server to be tested and judging whether each query statement in the set meets its corresponding verification specification, the query statements that do not meet their specifications can be detected and handled. This resolves potential hazards such as slow queries and improves the execution efficiency of the execution program of the server to be tested.
Fig. 6 is a schematic structural diagram of another server testing apparatus according to an embodiment of the present invention. The server testing device comprises:
memory 1001, processor 1002, and computer programs stored on memory 1001 and executable on processor 1002.
The processor 1002, when executing the program, implements the server testing method provided in the above-described embodiments.
Further, the server testing device further comprises:
a communication interface 1003 for communicating between the memory 1001 and the processor 1002.
A memory 1001 for storing computer programs that may be run on the processor 1002.
Memory 1001 may include high-speed RAM memory and may also include non-volatile memory (e.g., at least one disk memory).
The processor 1002 is configured to implement the server testing method according to the foregoing embodiment when executing the program.
If the memory 1001, the processor 1002, and the communication interface 1003 are implemented independently, the communication interface 1003, the memory 1001, and the processor 1002 may be connected to each other through a bus and communicate with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in FIG. 6, but this does not mean there is only one bus or one type of bus.
Optionally, in a specific implementation, if the memory 1001, the processor 1002, and the communication interface 1003 are integrated on one chip, the memory 1001, the processor 1002, and the communication interface 1003 may complete communication with each other through an internal interface.
The processor 1002 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention.
The invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a server testing method as described above.
The invention also provides a computer program product, wherein when instructions in the computer program product are executed by a processor, the server testing method as described above is implemented.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (15)

1. A server testing method is characterized by comprising the following steps:
sending test data to at least one test device corresponding to a server to be tested; the test data includes: a test case set, verification specifications corresponding to the return values of the test cases in the test case set, and destination location information;
sending a test instruction to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications;
and when the destination location information matches the location information of the apparatus itself, acquiring the corresponding error log so as to analyze the error log.
2. The method according to claim 1, wherein the verification specification corresponding to the return value defines a preset condition that each field in the return value needs to satisfy;
correspondingly, the error log is generated as follows:
judging, for the return value of each test case, whether the return value contains any field that does not satisfy its corresponding preset condition;
if such a field exists, determining the return value to be an error return value;
and generating the error log according to the error return values.
3. The method of claim 1, wherein the test case set includes: a test case subset for each interface of the server to be tested;
each test case subset includes: test cases generated by combining the values of the parameters of the corresponding interface.
4. The method of claim 1, further comprising:
acquiring a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested;
for each query statement in the query statement set, acquiring the verification specification corresponding to the query statement;
and comparing each query statement with its corresponding verification specification to obtain the query statements that do not meet the specification, so that those query statements can be analyzed.
5. The method of claim 1, wherein before sending the test data to the at least one test device corresponding to the server to be tested, the method further comprises:
acquiring current environment information of the server to be tested; the environment information indicates any one of the following environments: a test environment, a sandbox environment, and an online environment;
correspondingly, the sending test data to at least one test device corresponding to the server to be tested includes:
sending the test data corresponding to the current environment information to the at least one test device corresponding to the server to be tested.
6. The method of claim 1, wherein the data format of the return value is the JSON format, and the verification specification corresponding to the return value is a schema verification specification.
7. A server testing apparatus, comprising:
a deployment module, configured to send test data to at least one test device corresponding to a server to be tested; the test data includes: a test case set, verification specifications corresponding to the return values of the test cases in the test case set, and destination location information;
a test module, configured to send a test instruction to the at least one test device, so that the at least one test device tests each interface of the server to be tested according to the test case set, acquires return values, and generates an error log according to the return values and the corresponding verification specifications;
and a log regression module, configured to acquire the corresponding error log when the destination location information matches the location information of the apparatus itself, so as to analyze the error log.
8. The apparatus according to claim 7, wherein the verification specification corresponding to the return value defines a preset condition that each field in the return value needs to satisfy;
correspondingly, the error log is generated as follows:
judging, for the return value of each test case, whether the return value contains any field that does not satisfy its corresponding preset condition;
if such a field exists, determining the return value to be an error return value;
and generating the error log according to the error return values.
9. The apparatus of claim 7, wherein the test case set includes: a test case subset for each interface of the server to be tested;
each test case subset includes: test cases generated by combining the values of the parameters of the corresponding interface.
10. The apparatus of claim 7, further comprising: a first acquisition module and a comparison module;
the first acquisition module is configured to acquire a query statement set corresponding to the server to be tested; the query statement set includes: each query statement in the execution program of the server to be tested;
the first acquisition module is further configured to acquire, for each query statement in the query statement set, the verification specification corresponding to the query statement;
and the comparison module is configured to compare each query statement with its corresponding verification specification to obtain the query statements that do not meet the specification, so that those query statements can be analyzed.
11. The apparatus of claim 7, further comprising: a second acquisition module;
the second obtaining module is used for obtaining the current environmental information of the server to be tested; the environment information is any one of the following environments: testing environment, sandbox environment and online environment;
correspondingly, the deployment module is specifically configured to send test data corresponding to the current environment information to at least one test device corresponding to the server to be tested.
12. The apparatus of claim 7, wherein the data format of the return value is the JSON format, and the verification specification corresponding to the return value is a schema verification specification.
13. A server testing apparatus, comprising:
memory, processor and computer program stored on the memory and executable on the processor, characterized in that the processor implements the server testing method according to any of claims 1-6 when executing the program.
14. A non-transitory computer-readable storage medium having stored thereon a computer program, wherein the program, when executed by a processor, implements the server testing method of any one of claims 1-6.
15. A computer program product, wherein when instructions in the computer program product are executed by a processor, the server testing method of any one of claims 1-6 is implemented.
CN201811080414.4A 2018-09-17 2018-09-17 Server testing method and device Active CN110908888B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811080414.4A CN110908888B (en) 2018-09-17 2018-09-17 Server testing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811080414.4A CN110908888B (en) 2018-09-17 2018-09-17 Server testing method and device

Publications (2)

Publication Number Publication Date
CN110908888A true CN110908888A (en) 2020-03-24
CN110908888B CN110908888B (en) 2023-06-30

Family

ID=69812621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811080414.4A Active CN110908888B (en) 2018-09-17 2018-09-17 Server testing method and device

Country Status (1)

Country Link
CN (1) CN110908888B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111475409A (en) * 2020-03-30 2020-07-31 深圳追一科技有限公司 System test method, device, electronic equipment and storage medium
CN111679989A (en) * 2020-06-16 2020-09-18 贝壳技术有限公司 Interface robustness testing method and device, electronic equipment and storage medium
CN111694734A (en) * 2020-05-26 2020-09-22 五八有限公司 Software interface checking method and device and computer equipment
CN111885051A (en) * 2020-07-22 2020-11-03 微医云(杭州)控股有限公司 Data verification method and device and electronic equipment
CN113904954A (en) * 2021-08-27 2022-01-07 深圳市有方科技股份有限公司 System for testing wireless communication module
CN114281613A (en) * 2021-11-19 2022-04-05 苏州浪潮智能科技有限公司 Server testing method and device, computer equipment and storage medium

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6044398A (en) * 1997-11-21 2000-03-28 International Business Machines Corporation Virtual dynamic browsing system and method for automated web server and testing
CN104111885A (en) * 2013-04-22 2014-10-22 腾讯科技(深圳)有限公司 Method and device for verifying interface test results
US20150370691A1 (en) * 2014-06-18 2015-12-24 EINFOCHIPS Inc System testing of software programs executing on modular frameworks
CN105373469A (en) * 2014-08-25 2016-03-02 广东金赋信息科技有限公司 Interface based software automation test method
CN104360920A (en) * 2014-12-02 2015-02-18 微梦创科网络科技(中国)有限公司 Automatic testing method and device for interface
CN104468275A (en) * 2014-12-18 2015-03-25 辽宁生产力促进中心 Industrial cluster creative platform testing device and method
CN106815138A (en) * 2015-12-01 2017-06-09 北京奇虎科技有限公司 A kind of method and apparatus for generating interface testing use-case
CN105550113A (en) * 2015-12-18 2016-05-04 网易(杭州)网络有限公司 Web test method and test machine
CN107193681A (en) * 2016-03-15 2017-09-22 阿里巴巴集团控股有限公司 Data verification method and device
CN106095673A (en) * 2016-06-07 2016-11-09 深圳市泰久信息系统股份有限公司 Automated testing method based on WEB interface and system
CN106776307A (en) * 2016-12-05 2017-05-31 广州唯品会信息科技有限公司 Method for testing software and system
CN106874192A (en) * 2017-01-03 2017-06-20 中国科学院自动化研究所 The method of testing and test system of a kind of standard compliance towards digital publishing
CN107294808A (en) * 2017-07-05 2017-10-24 网易(杭州)网络有限公司 The methods, devices and systems of interface testing
CN107643981A (en) * 2017-08-29 2018-01-30 顺丰科技有限公司 A kind of automatic test platform and operation method of polynary operation flow
CN107729243A (en) * 2017-10-12 2018-02-23 上海携程金融信息服务有限公司 API automated testing method, system, equipment and storage medium
CN107741911A (en) * 2017-11-01 2018-02-27 广州爱九游信息技术有限公司 Interface test method, device, client and computer-readable recording medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JINGFAN TANG et al.: "Towards adaptive framework of keyword driven automation testing", 2008 IEEE International Conference on Automation and Logistics, 30 September 2008, pages 1631-1636, XP031329907 *
张江: "Design and Implementation of a Software API Automated Testing Tool" (某种软件API自动化测试工具的设计与实现), China Master's Theses Full-text Database, Information Science and Technology (中国优秀硕士学位论文全文数据库 信息科技辑), 15 February 2012, pages 138-1331 *
明德祥 et al.: "Test Server Technology and Research" (测试服务器技术与研究), Computer Measurement & Control (计算机测量与控制), vol. 10, no. 11, 31 December 2004, pages 710-713 *
苗东亮 et al.: "Design of a Multi-Interface Test System" (一种多接口测试系统设计), Electronics World (电子世界), no. 24, 16 January 2018, pages 126-127 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111475409A (en) * 2020-03-30 2020-07-31 深圳追一科技有限公司 System test method, device, electronic equipment and storage medium
CN111694734A (en) * 2020-05-26 2020-09-22 五八有限公司 Software interface checking method and device and computer equipment
CN111679989A (en) * 2020-06-16 2020-09-18 贝壳技术有限公司 Interface robustness testing method and device, electronic equipment and storage medium
CN111885051A (en) * 2020-07-22 2020-11-03 微医云(杭州)控股有限公司 Data verification method and device and electronic equipment
CN111885051B (en) * 2020-07-22 2022-10-25 微医云(杭州)控股有限公司 Data verification method and device and electronic equipment
CN113904954A (en) * 2021-08-27 2022-01-07 深圳市有方科技股份有限公司 System for testing wireless communication module
CN113904954B (en) * 2021-08-27 2023-09-01 深圳市有方科技股份有限公司 System for testing wireless communication module
CN114281613A (en) * 2021-11-19 2022-04-05 苏州浪潮智能科技有限公司 Server testing method and device, computer equipment and storage medium
CN114281613B (en) * 2021-11-19 2024-01-09 苏州浪潮智能科技有限公司 Server testing method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN110908888B (en) 2023-06-30

Similar Documents

Publication Publication Date Title
CN110908888B (en) Server testing method and device
CN108563214B (en) Vehicle diagnosis method, device and equipment
CN112906008A (en) Kernel vulnerability repairing method, device, server and system
CN108009085B (en) Channel package testing method
CN111693089A (en) Product quality control method, device, equipment and storage medium for assembly line
CN110888804B (en) Interface test method and interface test platform
CN105468507B (en) Branch standard reaching detection method and device
CN109669436B (en) Test case generation method and device based on functional requirements of electric automobile
CN111309602A (en) Software testing method, device and system
CN107992420B (en) Management method and system for test item
CN109947715B (en) Log alarm method and device
CN110008074B (en) Method, device and equipment for automatically testing and inquiring upper-layer interface of hardware information
CN113535538A (en) Application full-link automatic testing method and device, electronic equipment and storage medium
CN116257437A (en) ADAS system defect verification method and device based on real vehicle data reinjection
CN110971478A (en) Pressure measurement method and device for cloud platform service performance and computing equipment
CN115686608A (en) Software version management method and device for vehicle, server and storage medium
CN110442370B (en) Test case query method and device
CN114326677A (en) Vehicle machine testing method and device
CN112506760A (en) Vehicle controller software flash test method, system, device and storage medium
CN111984527A (en) Software performance testing method, device, equipment and medium
CN111198774A (en) Unmanned vehicle simulation abnormity tracking method, device, equipment and computer readable medium
CN106992873B (en) Protection group processing method and device
CN107102938B (en) Test script updating method and device
CN116957424B (en) Product testing method, device, electronic equipment and readable storage medium
CN113821431A (en) Method and device for acquiring test result, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant