CN109669867B - Test apparatus, automated test method, and computer-readable storage medium - Google Patents


Info

Publication number
CN109669867B
Authority
CN
China
Prior art keywords
server
result
test program
connection
running
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811513480.6A
Other languages
Chinese (zh)
Other versions
CN109669867A (en)
Inventor
贾茜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN201811513480.6A
Publication of CN109669867A
Application granted
Publication of CN109669867B
Legal status: Active (current)


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Abstract

The invention relates to functional testing technology and discloses a test apparatus, an automated test method, and a computer-readable storage medium. The method includes: running a predetermined server connection test program, obtaining and analyzing its running result, and issuing prompt information when the result does not meet a first preset condition; running a predetermined application test program at regular intervals, obtaining and analyzing its running result, and issuing prompt information when the result does not meet a second preset condition; and running a predetermined database running test program, obtaining and analyzing its running result, and issuing prompt information when the result does not meet a third preset condition. Compared with the prior art, the invention achieves automated testing of each server and database in the system.

Description

Test apparatus, automated test method, and computer-readable storage medium
Technical Field
The present invention relates to the field of computer and network technologies, and in particular, to a testing apparatus, an automated testing method, and a computer readable storage medium.
Background
Currently, as informatization construction keeps expanding in intensity, breadth, depth, and frequency, enterprises and government bodies need to provide society with integrated business services covering the full life cycle and the full business flow of public services, and a large number of public platform systems have therefore emerged.
Because it integrates many functions, a public platform system typically includes multiple devices (for example, several servers and databases) that are complex to deploy, and the operating condition of each device can strongly affect the operating condition of the overall system. At present, however, there is still no method for automatically monitoring or testing the operating states of the devices in such a system, so maintenance personnel cannot discover system errors promptly.
How to implement automated testing of such a system is therefore a problem to be solved.
Disclosure of Invention
The main object of the present invention is to provide a testing device, an automated testing method and a computer readable storage medium, aiming at realizing the automated testing of a system.
In order to achieve the above object, the present invention provides a testing device, which includes a memory and a processor, wherein an automated testing program is stored in the memory, and the automated testing program when executed by the processor implements the following steps:
a first test step: running a predetermined server connection test program, acquiring and analyzing the running result of the server connection test program, and sending out prompt information when the running result of the server connection test program does not meet a first preset condition;
and a second testing step: after communication connection is established with a server, a predetermined application test program is operated at regular time, an operation result of the application test program is obtained and analyzed, and when the operation result of the application test program does not meet a second preset condition, prompt information is sent out;
and a third testing step: and establishing communication connection with a database, running a predetermined database running test program, acquiring and analyzing the running result of the database running test program, and sending out prompt information when the running result of the database running test program does not meet a third preset condition.
Preferably, the processor executes the automated test program, and further implements the steps of:
detecting whether newly added server log information exists in a server in real time or at regular time, and extracting key fields from the server log information when the newly added server log information exists in the server;
determining an error type according to the extracted key field;
inquiring the preset scripts corresponding to each error type in all the preset scripts according to the mapping relation between the predetermined error type and the preset scripts;
and running all the queried preset scripts.
Preferably, the first testing step includes:
the simulation client sends a plurality of communication connection requests to the server, and obtains connection results of the communication connection requests, wherein the connection results comprise connection success and connection failure;
calculating a communication connection success rate according to the connection result, and taking the communication connection success rate as an operation result of the server connection test program; or, according to the connection result, calculating a communication connection failure rate, and taking the communication connection failure rate as an operation result of the server connection test program;
judging whether the operation result of the server connection test program meets a first preset condition or not, if not, sending out prompt information, wherein the first preset condition is that the communication connection success rate is larger than a first preset threshold value when the operation result of the server connection test program is the communication connection success rate, and the first preset condition is that the communication connection failure rate is smaller than a second preset threshold value when the operation result of the server connection test program is the communication connection failure rate.
Preferably, the second testing step includes:
after communication connection is established with the server, the simulation client calls a login interface of the server and sends a user login request carrying login information to the server;
receiving a login result returned by a server, wherein the login result comprises login success and login failure;
and when the login result is login failure, sending out prompt information.
Preferably, the third testing step includes:
establishing communication connection with a database, calling a query interface of the database, and sending a query request to the database;
receiving a query result returned by the database;
judging whether the query result is the same as a preset result;
and sending out prompt information when the query result is different from the preset result.
In addition, in order to achieve the above object, the present invention also provides an automated testing method, which is suitable for a testing device, and the method includes:
a first test step: running a predetermined server connection test program, acquiring and analyzing the running result of the server connection test program, and sending out prompt information when the running result of the server connection test program does not meet a first preset condition;
and a second testing step: after communication connection is established with a server, a predetermined application test program is operated at regular time, an operation result of the application test program is obtained and analyzed, and when the operation result of the application test program does not meet a second preset condition, prompt information is sent out;
and a third testing step: and establishing communication connection with a database, running a predetermined database running test program, acquiring and analyzing the running result of the database running test program, and sending out prompt information when the running result of the database running test program does not meet a third preset condition.
Preferably, the method further comprises:
detecting whether newly added server log information exists in a server in real time or at regular time, and extracting key fields from the server log information when the newly added server log information exists in the server;
determining an error type according to the extracted key field;
inquiring the preset scripts corresponding to each error type in all the preset scripts according to the mapping relation between the predetermined error type and the preset scripts;
and running all the queried preset scripts.
Preferably, the first testing step includes:
the simulation client sends a plurality of communication connection requests to the server, and obtains connection results of the communication connection requests, wherein the connection results comprise connection success and connection failure;
calculating a communication connection success rate according to the connection result, and taking the communication connection success rate as an operation result of the server connection test program; or, according to the connection result, calculating a communication connection failure rate, and taking the communication connection failure rate as an operation result of the server connection test program;
judging whether the operation result of the server connection test program meets a first preset condition or not, if not, sending out prompt information, wherein the first preset condition is that the communication connection success rate is larger than a first preset threshold value when the operation result of the server connection test program is the communication connection success rate, and the first preset condition is that the communication connection failure rate is smaller than a second preset threshold value when the operation result of the server connection test program is the communication connection failure rate.
Preferably, the second testing step includes:
after communication connection is established with the server, the simulation client calls a login interface of the server and sends a user login request carrying login information to the server;
receiving a login result returned by a server, wherein the login result comprises login success and login failure;
when the login result is login failure, sending out prompt information;
the third testing step includes:
establishing communication connection with a database, calling a query interface of the database, and sending a query request to the database;
receiving a query result returned by the database;
judging whether the query result is the same as a preset result;
and sending out prompt information when the query result is different from the preset result.
Furthermore, to achieve the above object, the present invention also proposes a computer-readable storage medium storing an automated test program executable by at least one processor to cause the at least one processor to perform the steps of the automated test method according to any one of the above.
The invention runs a predetermined server connection test program, obtains and analyzes its running result, and issues prompt information when the result does not meet a first preset condition; runs a predetermined application test program at regular intervals, obtains and analyzes its running result, and issues prompt information when the result does not meet a second preset condition; and runs a predetermined database running test program, obtains and analyzes its running result, and issues prompt information when the result does not meet a third preset condition. Compared with the prior art, by running the server connection test program, the application test program, and the database running test program in real time or at regular intervals, the invention achieves automated testing of each server and database in the system. In addition, when the running result of a test program does not meet the corresponding preset condition, the invention issues prompt information, which helps system maintenance personnel discover and handle system errors in time.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to the structures shown in these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic view of an alternative application environment according to various embodiments of the present invention;
FIG. 2 is a schematic diagram of the operating environment of an embodiment of the automated test program of the present invention;
FIG. 3 is a block diagram of an embodiment of the automated test program of the present invention;
FIG. 4 is a flow chart of an embodiment of an automated testing method of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The principles and features of the present invention are described below with reference to the drawings; the examples are provided only to illustrate the invention and are not to be construed as limiting its scope.
Referring to fig. 1, an alternative application environment is shown according to various embodiments of the present invention.
In the present embodiment, the present invention is applicable to application environments including, but not limited to, the test device 1, the server 2, and the database 3. The test device 1, the server 2, and the database 3 may be connected through network communication. The network may be a wired or wireless network, such as an intranet, the Internet, GSM (Global System for Mobile Communications), WCDMA (Wideband Code Division Multiple Access), a 4G or 5G network, Bluetooth, or Wi-Fi.
In this application environment, the number of the test devices 1, the servers 2 and the databases 3 may be one or more, which is not limited in this embodiment.
In the following, various embodiments of the present invention will be presented based on the above-described application environment and related devices.
The invention provides an automated test program.
Referring to FIG. 2, a schematic diagram of an operating environment of an embodiment of an automated test program 10 according to the present invention is shown.
In the present embodiment, the automated test program 10 is installed and runs in the test device 1. The test device 1 may be a computing device such as a desktop computer, a notebook computer, or a palmtop computer. The test device 1 may include, but is not limited to, a memory 11, a processor 12, and a display 13. FIG. 2 shows only a test device 1 with the components 11 to 13, but it should be understood that not all of the illustrated components are required, and that more or fewer components may be implemented instead.
The memory 11 may, in some embodiments, be an internal storage unit of the test device 1, such as a hard disk or internal memory of the test device 1. In other embodiments, the memory 11 may also be an external storage device of the test device 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the test device 1. Further, the memory 11 may include both an internal storage unit and an external storage device of the test device 1. The memory 11 is used to store application software installed on the test device 1 and various data, such as the program code of the automated test program 10. The memory 11 may also be used to temporarily store data that has been output or is to be output.
The processor 12 may, in some embodiments, be a central processing unit (CPU), a microprocessor, or another data processing chip for running program code or processing data stored in the memory 11, for example to execute the automated test program 10.
The display 13 may, in some embodiments, be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch display, or the like. The display 13 is used to display information processed in the test device 1 and to present a visual user interface. The components 11 to 13 of the test device 1 communicate with one another via a bus.
Referring to FIG. 3, a block diagram of an embodiment of the automated test program 10 according to the present invention is shown. In this embodiment, the automated test program 10 may be divided into one or more modules, which are stored in the memory 11 and executed by one or more processors (in this embodiment, the processor 12) to carry out the present invention. For example, in FIG. 3, the automated test program 10 may be divided into a first test module 101, a second test module 102, and a third test module 103. A module, as referred to in the present invention, is a series of computer program instruction segments capable of performing a specific function, and is more suitable than a whole program for describing how the automated test program 10 executes in the test device 1. In this embodiment:
the first test module 101 is configured to run a predetermined server connection test program, obtain and parse an operation result of the server connection test program, and send out a prompt message when the operation result of the server connection test program does not meet a first preset condition.
For example, the first test module 101 simulates a client sending a plurality of communication connection requests to the server 2, and obtains connection results (including connection success and connection failure) of the respective communication connection requests.
Then, according to the connection result, calculating a communication connection success rate, and taking the communication connection success rate as an operation result of the server connection test program; or, according to the connection result, calculating the communication connection failure rate, and taking the communication connection failure rate as the running result of the server connection test program.
Wherein, the step of calculating the success rate of communication connection comprises the following steps:
and counting the number of connection results of all connection success, counting the total number of the connection results, and calculating the ratio of the number of the connection results of the connection success to the total number of the connection results, wherein the ratio is the success rate of communication connection.
The step of calculating the communication connection failure rate includes:
and counting the number of connection results of all connection failures, counting the total number of the connection results, and calculating the ratio of the number of the connection results of the connection failures to the total number of the connection results, wherein the ratio is the communication connection failure rate.
And finally, judging whether the running result of the server connection test program meets a first preset condition, if not, determining that the test fails, sending out prompt information, and if so, determining that the test is successful.
When the running result of the server connection test program is the communication connection success rate, the first preset condition is that the communication connection success rate is larger than a first preset threshold value, and when the running result of the server connection test program is the communication connection failure rate, the first preset condition is that the communication connection failure rate is smaller than a second preset threshold value.
The prompt information may be sent by e-mail, SMS, instant message (including text and voice instant messages), or the like.
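By way of illustration only, the following Python sketch shows one possible form of the flow implemented by the first test module 101, in which a plain TCP reachability probe stands in for the server connection test program; the host, port, number of attempts, and threshold value are assumptions introduced for the example and are not taken from the patent.

```python
# A minimal sketch of the first test, assuming a plain TCP probe stands in for
# the "server connection test program"; host, port, attempt count and the
# first preset threshold are illustrative assumptions.
import socket

def connection_success_rate(host: str, port: int, attempts: int = 20,
                            timeout: float = 2.0) -> float:
    """Simulate a client opening several connections and return the success rate."""
    successes = 0
    for _ in range(attempts):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                successes += 1          # connection result: success
        except OSError:
            pass                        # connection result: failure
    return successes / attempts         # ratio of successful results to all results

FIRST_PRESET_THRESHOLD = 0.95           # assumed value of the first preset threshold

rate = connection_success_rate("server.example.internal", 8080)
if rate <= FIRST_PRESET_THRESHOLD:      # first preset condition not met
    print(f"Prompt: connection success rate {rate:.0%} is not above the threshold")
```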
And the second test module 102 is configured to operate a predetermined application test program at regular time after communication connection is established with the server 2, obtain and parse an operation result of the application test program, and send out a prompt message when the operation result of the application test program does not meet a second preset condition.
In this embodiment, the application test program is used to test the application program of the server 2.
For example, after establishing a communication connection with the server 2, the second test module 102 simulates a client invoking a login interface of the server 2 and sends a user login request carrying login information to the server 2.
Then, the login result returned by the server 2 is received. The login result comprises login success and login failure. When the login result is login failure, test failure is determined, and prompt information (for example, the prompt information is sent in a mode of e-mail, short message, instant message and the like) is sent. And when the login result is login success, determining that the test is successful.
In this embodiment, the second testing module 102 may also simulate that the client calls other types of interfaces to test other functions of the server 2.
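As a sketch of the second test module's login test, the snippet below assumes the server exposes an HTTP JSON login interface; the URL path, field names, credentials, and success flag are assumptions made only for illustration.

```python
# A minimal sketch of the login test against an assumed HTTP login interface;
# the endpoint, payload fields and the "success" flag are illustrative.
import json
import urllib.request

def login_test(base_url: str, username: str, password: str) -> bool:
    """Call the (assumed) login interface and return True on login success."""
    payload = json.dumps({"username": username, "password": password}).encode()
    req = urllib.request.Request(f"{base_url}/login", data=payload,
                                 headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            result = json.load(resp)               # login result returned by the server
        return bool(result.get("success"))
    except (OSError, ValueError):
        return False                               # treat transport/parse errors as login failure

if not login_test("http://server.example.internal:8080", "test_user", "test_pass"):
    print("Prompt: the user login test failed")    # e-mail/SMS/IM alert in practice
```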
And the third test module 103 is configured to establish a communication connection with the database 3, run a predetermined database running test program, obtain and parse the running result of the database running test program, and send out a prompt message when the running result of the database running test program does not meet a third preset condition.
For example, the third test module 103 establishes a communication connection with the database 3, invokes a query interface of the database 3, and sends a query request (which may carry a query condition) to the database 3.
Then, the query result returned by the database 3 is received.
And then judging whether the query result is the same as a preset result.
When the query result is different from the preset result, determining that the test fails, and sending out prompt information (for example, sending in a mode of e-mail, short message, instant message and the like). And when the query result is the same as the preset result, determining that the test is successful.
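A minimal sketch of the third test module's database test is shown below against a local SQLite file; the data source name, probe query, and expected preset result are assumptions, since the patent does not name a particular database or query interface.

```python
# A minimal sketch of the database test using SQLite; the DSN, probe query and
# the preset result are illustrative assumptions.
import sqlite3

EXPECTED_RESULT = [(1,)]                     # assumed "preset result" for the probe query

def database_test(dsn: str) -> bool:
    """Run a fixed probe query and compare its result with the preset result."""
    try:
        conn = sqlite3.connect(dsn, timeout=5)           # communication connection with the database
        try:
            rows = conn.execute("SELECT 1").fetchall()   # query request
        finally:
            conn.close()
    except sqlite3.Error:
        return False                         # connection or query error counts as failure
    return rows == EXPECTED_RESULT           # same as the preset result -> test success

if not database_test("monitoring.db"):
    print("Prompt: the database query result differs from the preset result")
```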
Compared with the prior art, the invention realizes the automatic test of each server 2 and database 3 in the system by running the server connection test program, the application test program and the database running test program in real time or at regular time. In addition, when the running result of the test program does not meet the preset condition, the invention sends out prompt information, which is helpful for system maintenance personnel to discover and process system errors in time.
Further, in this embodiment, the automated test program 10 further includes a detection module, a determination module, and an operation module (not shown in the drawings), wherein:
and the detection module is used for detecting whether the newly added server log information exists in the server 2 in real time or at regular time, and extracting key fields from the server log information when the newly added server log information exists.
The server log information includes access records, error records (for recording error information at runtime), and the like. The error records in the server log information may record not only the internal run-time error information of the server 2, but also the running errors of the device or apparatus (e.g., database 3) interacting with the server 2.
And the determining module is used for determining the error type according to the extracted key field.
The manner of determining the error type at least comprises the following two schemes.
Scheme one:
and determining the error type corresponding to the extracted key field according to the mapping relation between the predetermined key field and the error type.
The mapping relationship between key fields and error types may be as shown in Table 1.
Table 1:
The above examples are only intended to aid understanding; the user may set the keywords, and the mapping between keywords and error types, according to the specific application scenario.
Scheme II:
and judging whether a second preset key field (for example, "start") is extracted within a preset time period (for example, ten minutes) after the first preset key field (for example, "force shutdown successfully") is extracted, and if not, determining that the error type is abnormal STARTING of the server application program.
And the operation module is used for inquiring the preset scripts corresponding to each error type in all the preset scripts according to the mapping relation between the predetermined error type and the preset scripts.
The mapping relationship between error types and preset scripts may be as shown in Table 2.
Table 2:

Error type | Preset script
Data anomaly | None
Insufficient disk space | clearWlsStg.sh script
Database anomaly | Database restart script
Server application startup exception | reset.sh script
The clearWlsStg.sh script is used to clean up disk space.
The database restart script is used to restart the database 3.
The reset.sh script is used to restart the application program of the server 2; after the server 2 restarts, a preset third key field (for example, "Server state changed to STARTING") is queried in the updated server log information, and if it is found, information indicating that the application program of the server 2 is running normally is fed back.
The operation module is further used to run all of the queried preset scripts.
In this embodiment, the error type (that is, the fault type) can be identified automatically by detecting the server log information, and some errors can be repaired automatically by running the preset scripts, which reduces the consumption of human resources and improves the availability and stability of the system.
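To make the detection, determination, and operation modules more concrete, the sketch below strings the three together in Python; the key fields, the key-field-to-error-type mapping (Table 1 is not reproduced in the text), and the database restart script name are assumptions, while the error-type-to-script mapping follows Table 2.

```python
# A minimal sketch of the log-driven repair flow: extract key fields from new
# log lines, map them to an error type, and run the matching preset script.
# The key fields, the Table-1 style mapping and "restart_database.sh" are assumed.
import subprocess

KEY_FIELD_TO_ERROR = {                                  # assumed key field -> error type
    "OutOfMemoryError": "data anomaly",
    "No space left on device": "insufficient disk space",
    "ORA-": "database anomaly",
}

ERROR_TO_SCRIPT = {                                     # error type -> preset script (Table 2)
    "data anomaly": None,                               # no preset script
    "insufficient disk space": "clearWlsStg.sh",        # cleans up disk space
    "database anomaly": "restart_database.sh",          # assumed name of the database restart script
    "server application startup exception": "reset.sh", # restarts the server application
}

def handle_new_log_lines(lines):
    """Extract key fields from newly added log lines and run the matching preset scripts."""
    for line in lines:
        for key_field, error_type in KEY_FIELD_TO_ERROR.items():
            if key_field in line:                       # key field extracted from the log line
                script = ERROR_TO_SCRIPT.get(error_type)
                if script:                              # some error types have no preset script
                    print(f"Detected {error_type}; running {script}")
                    subprocess.run(["sh", script], check=False)

handle_new_log_lines(["2018-12-11 ERROR: No space left on device"])
```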
In addition, the invention provides an automated test method which is suitable for a test device.
Fig. 4 is a flow chart of an embodiment of the automated testing method of the present invention.
In this embodiment, the method includes:
step S10, running a predetermined server connection test program, acquiring and analyzing the running result of the server connection test program, and sending out prompt information when the running result of the server connection test program does not meet a first preset condition.
For example, the test apparatus 1 simulates a client transmitting a plurality of communication connection requests to the server 2, and acquires connection results (the connection results include connection success and connection failure) of the respective communication connection requests.
Then, according to the connection result, calculating a communication connection success rate, and taking the communication connection success rate as an operation result of the server connection test program; or, according to the connection result, calculating the communication connection failure rate, and taking the communication connection failure rate as the running result of the server connection test program.
Wherein, the step of calculating the success rate of communication connection comprises the following steps:
and counting the number of connection results of all connection success, counting the total number of the connection results, and calculating the ratio of the number of the connection results of the connection success to the total number of the connection results, wherein the ratio is the success rate of communication connection.
The step of calculating the communication connection failure rate includes:
and counting the number of connection results of all connection failures, counting the total number of the connection results, and calculating the ratio of the number of the connection results of the connection failures to the total number of the connection results, wherein the ratio is the communication connection failure rate.
And finally, judging whether the running result of the server connection test program meets a first preset condition, if not, determining that the test fails, sending out prompt information, and if so, determining that the test is successful.
When the running result of the server connection test program is the communication connection success rate, the first preset condition is that the communication connection success rate is larger than a first preset threshold value, and when the running result of the server connection test program is the communication connection failure rate, the first preset condition is that the communication connection failure rate is smaller than a second preset threshold value.
The prompt information may be sent by e-mail, SMS, instant message (including text and voice instant messages), or the like.
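As one concrete way of sending the prompt information by e-mail, the sketch below uses Python's standard smtplib; the SMTP host, port, and addresses are placeholder assumptions.

```python
# A minimal sketch of the e-mail prompt channel; the SMTP host and addresses
# are placeholders, not values from the patent.
import smtplib
from email.message import EmailMessage

def send_prompt_email(subject: str, body: str) -> None:
    """Send prompt information to the maintenance mailbox."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "monitor@example.internal"
    msg["To"] = "maintainer@example.internal"
    msg.set_content(body)
    with smtplib.SMTP("smtp.example.internal", 25, timeout=10) as smtp:
        smtp.send_message(msg)

send_prompt_email("Automated test prompt",
                  "The server connection test did not meet the first preset condition.")
```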
Step S20, after communication connection is established with the server 2, a predetermined application test program is operated at regular time, the operation result of the application test program is obtained and analyzed, and when the operation result of the application test program does not meet a second preset condition, prompt information is sent out.
In this embodiment, the application test program is used to test the application program of the server 2.
For example, after establishing a communication connection with the server 2, the test apparatus 1 simulates a client calling a login interface of the server 2 and transmits a user login request carrying login information to the server 2.
Then, the login result returned by the server 2 is received. The login result comprises login success and login failure. When the login result is login failure, test failure is determined, and prompt information (for example, the prompt information is sent in a mode of e-mail, short message, instant message and the like) is sent. And when the login result is login success, determining that the test is successful.
In this embodiment, the testing apparatus 1 may also simulate that the client calls other types of interfaces to test other functions of the server 2.
Step S30, communication connection is established with the database 3, a predetermined database operation test program is operated, the operation result of the database operation test program is obtained and analyzed, and when the operation result of the database operation test program does not meet a third preset condition, prompt information is sent out.
For example, the test device 1 establishes a communication connection with the database 3, invokes a query interface of the database 3, and sends a query request (which may carry a query condition) to the database 3.
Then, the query result returned by the database 3 is received.
And then judging whether the query result is the same as a preset result.
When the query result is different from the preset result, determining that the test fails, and sending out prompt information (for example, sending in a mode of e-mail, short message, instant message and the like). And when the query result is the same as the preset result, determining that the test is successful.
Compared with the prior art, the invention realizes the automatic test of each server 2 and database 3 in the system by running the server connection test program, the application test program and the database running test program in real time or at regular time. In addition, when the running result of the test program does not meet the preset condition, the invention sends out prompt information, which is helpful for system maintenance personnel to discover and process system errors in time.
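The "in real time or at regular time" behaviour can be sketched as a simple timer loop; the check names, the placeholder callables, and the interval are assumptions, and in practice the callables would wrap the connection, login, and database probes of steps S10 to S30.

```python
# A minimal sketch of running the three checks on a fixed timer; the interval
# and the placeholder callables are illustrative assumptions.
import time
from typing import Callable, Dict

def run_on_timer(checks: Dict[str, Callable[[], bool]],
                 interval_s: int = 300, rounds: int = 1) -> None:
    """Run each named check on a fixed timer and emit a prompt when one fails."""
    for _ in range(rounds):
        for name, check in checks.items():
            if not check():
                print(f"Prompt: {name} did not meet its preset condition")
        time.sleep(interval_s)

# Usage with placeholder callables; real probes would replace the lambdas.
run_on_timer({
    "server connection test": lambda: True,
    "application test": lambda: True,
    "database running test": lambda: True,
}, interval_s=1, rounds=1)
```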
Further, in this embodiment, the method further includes steps S40 to S70 (not shown in the drawings), wherein:
step S40, detecting whether the newly added server log information exists in the server 2 in real time or at regular time, and extracting key fields from the server log information when the newly added server log information exists.
The server log information includes access records, error records (for recording error information at runtime), and the like. The error records in the server log information may record not only the internal run-time error information of the server 2, but also the running errors of the device or apparatus (e.g., database 3) interacting with the server 2.
And S50, determining the error type according to the extracted key field.
The manner of determining the error type at least comprises the following two schemes.
Scheme one:
and determining the error type corresponding to the extracted key field according to the mapping relation between the predetermined key field and the error type.
The mapping relationship between key fields and error types may be as shown in Table 1.
Table 1:
The above examples are only intended to aid understanding; the user may set the keywords, and the mapping between keywords and error types, according to the specific application scenario.
Scheme II:
and judging whether a second preset key field (for example, "start") is extracted within a preset time period (for example, ten minutes) after the first preset key field (for example, "force shutdown successfully") is extracted, and if not, determining that the error type is abnormal STARTING of the server application program.
Step S60, inquiring the preset scripts corresponding to each error type in all the preset scripts according to the mapping relation between the predetermined error type and the preset scripts.
The mapping relationship between error types and preset scripts may be as shown in Table 2.
Table 2:

Error type | Preset script
Data anomaly | None
Insufficient disk space | clearWlsStg.sh script
Database anomaly | Database restart script
Server application startup exception | reset.sh script
The clearWlsStg.sh script is used to clean up disk space.
The database restart script is used to restart the database 3.
The reset.sh script is used to restart the application program of the server 2; after the server 2 restarts, a preset third key field (for example, "Server state changed to STARTING") is queried in the updated server log information, and if it is found, information indicating that the application program of the server 2 is running normally is fed back.
And step S70, running all the queried preset scripts.
In this embodiment, the error type (that is, the fault type) can be identified automatically by detecting the server log information, and some errors can be repaired automatically by running the preset scripts, which reduces the consumption of human resources and improves the availability and stability of the system.
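Scheme II above can also be sketched as a time-window check over the log; the timestamp format, the example key fields, and the sample log entries are assumptions made for illustration.

```python
# A minimal sketch of Scheme II: if the second preset key field does not appear
# within the preset window after the first preset key field, treat it as a
# server application startup exception. Timestamp format and entries are assumed.
from datetime import datetime, timedelta

FIRST_KEY_FIELD = "force shutdown successfully"
SECOND_KEY_FIELD = "start"
WINDOW = timedelta(minutes=10)              # the "preset time period" from the text

def startup_exception(log_entries, now):
    """log_entries: (timestamp string, message) tuples, oldest first."""
    shutdown_at = None
    for ts, msg in log_entries:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        if FIRST_KEY_FIELD in msg:
            shutdown_at = t                 # first preset key field extracted
        elif shutdown_at and SECOND_KEY_FIELD in msg and t - shutdown_at <= WINDOW:
            shutdown_at = None              # server started again within the window
    return shutdown_at is not None and now - shutdown_at > WINDOW

entries = [("2018-12-11 10:00:00", "force shutdown successfully")]
if startup_exception(entries, now=datetime(2018, 12, 11, 10, 20, 0)):
    print("Error type: server application startup exception")
```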
Further, the present invention also proposes a computer readable storage medium storing an automated test program executable by at least one processor to cause the at least one processor to perform the automated test method of any of the embodiments described above.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the invention, and all equivalent structural changes made by the description of the present invention and the accompanying drawings or direct/indirect application in other related technical fields are included in the scope of the invention.

Claims (7)

1. A test device comprising a memory and a processor, wherein an automated test program is stored on the memory, the automated test program when executed by the processor performing the steps of:
a first test step: running a predetermined server connection test program, acquiring and analyzing the running result of the server connection test program, and sending out prompt information when the running result of the server connection test program does not meet a first preset condition;
and a second testing step: after communication connection is established with a server, a predetermined application test program is operated at regular time, an operation result of the application test program is obtained and analyzed, and when the operation result of the application test program does not meet a second preset condition, prompt information is sent out;
and a third testing step: establishing communication connection with a database, running a predetermined database running test program, acquiring and analyzing the running result of the database running test program, and sending out prompt information when the running result of the database running test program does not meet a third preset condition;
wherein the first testing step comprises: the simulation client sends a plurality of communication connection requests to the server, and obtains connection results of the communication connection requests, wherein the connection results comprise connection success and connection failure; calculating a communication connection success rate according to the connection result, and taking the communication connection success rate as an operation result of the server connection test program; or, according to the connection result, calculating a communication connection failure rate, and taking the communication connection failure rate as an operation result of the server connection test program; judging whether the operation result of the server connection test program meets a first preset condition or not, if not, sending prompt information, wherein the first preset condition is that the communication connection success rate is larger than a first preset threshold value when the operation result of the server connection test program is the communication connection success rate, and the first preset condition is that the communication connection failure rate is smaller than a second preset threshold value when the operation result of the server connection test program is the communication connection failure rate;
the second testing step includes: after communication connection is established with the server, the simulation client calls a login interface of the server and sends a user login request carrying login information to the server; receiving a login result returned by a server, wherein the login result comprises login success and login failure; and when the login result is login failure, sending out prompt information.
2. The test apparatus of claim 1, wherein the processor executing the automated test program further performs the steps of:
detecting whether newly added server log information exists in a server in real time or at regular time, and extracting key fields from the server log information when the newly added server log information exists in the server;
determining an error type according to the extracted key field;
inquiring the preset scripts corresponding to each error type in all the preset scripts according to the mapping relation between the predetermined error type and the preset scripts;
and running all the queried preset scripts.
3. The test device according to claim 1 or 2, wherein the third test step comprises:
establishing communication connection with a database, calling a query interface of the database, and sending a query request to the database;
receiving a query result returned by the database;
judging whether the query result is the same as a preset result;
and sending out prompt information when the query result is different from the preset result.
4. An automated testing method suitable for use in a testing apparatus, the method comprising:
a first test step: running a predetermined server connection test program, acquiring and analyzing the running result of the server connection test program, and sending out prompt information when the running result of the server connection test program does not meet a first preset condition;
and a second testing step: after communication connection is established with a server, a predetermined application test program is operated at regular time, an operation result of the application test program is obtained and analyzed, and when the operation result of the application test program does not meet a second preset condition, prompt information is sent out;
and a third testing step: establishing communication connection with a database, running a predetermined database running test program, acquiring and analyzing the running result of the database running test program, and sending out prompt information when the running result of the database running test program does not meet a third preset condition;
wherein the first testing step comprises: the simulation client sends a plurality of communication connection requests to the server, and obtains connection results of the communication connection requests, wherein the connection results comprise connection success and connection failure; calculating a communication connection success rate according to the connection result, and taking the communication connection success rate as an operation result of the server connection test program; or, according to the connection result, calculating a communication connection failure rate, and taking the communication connection failure rate as an operation result of the server connection test program; judging whether the operation result of the server connection test program meets a first preset condition or not, if not, sending prompt information, wherein the first preset condition is that the communication connection success rate is larger than a first preset threshold value when the operation result of the server connection test program is the communication connection success rate, and the first preset condition is that the communication connection failure rate is smaller than a second preset threshold value when the operation result of the server connection test program is the communication connection failure rate;
the second testing step includes: after communication connection is established with the server, the simulation client calls a login interface of the server and sends a user login request carrying login information to the server; receiving a login result returned by a server, wherein the login result comprises login success and login failure; and when the login result is login failure, sending out prompt information.
5. The automated test method of claim 4, further comprising:
detecting whether newly added server log information exists in a server in real time or at regular time, and extracting key fields from the server log information when the newly added server log information exists in the server;
determining an error type according to the extracted key field;
inquiring the preset scripts corresponding to each error type in all the preset scripts according to the mapping relation between the predetermined error type and the preset scripts;
and running all the queried preset scripts.
6. The automated test method of claim 4 or claim 5, wherein the third test step comprises:
establishing communication connection with a database, calling a query interface of the database, and sending a query request to the database;
receiving a query result returned by the database;
judging whether the query result is the same as a preset result;
and sending out prompt information when the query result is different from the preset result.
7. A computer readable storage medium, characterized in that the computer readable storage medium stores an automated test program executable by at least one processor to cause the at least one processor to perform the steps of the automated test method according to any of claims 4-6.
CN201811513480.6A 2018-12-11 2018-12-11 Test apparatus, automated test method, and computer-readable storage medium Active CN109669867B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201811513480.6A (CN109669867B) | 2018-12-11 | 2018-12-11 | Test apparatus, automated test method, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201811513480.6A (CN109669867B) | 2018-12-11 | 2018-12-11 | Test apparatus, automated test method, and computer-readable storage medium

Publications (2)

Publication Number | Publication Date
CN109669867A (en) | 2019-04-23
CN109669867B (en) | 2024-03-12

Family

ID=66143694

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201811513480.6A (CN109669867B, Active) | Test apparatus, automated test method, and computer-readable storage medium | 2018-12-11 | 2018-12-11

Country Status (1)

Country Link
CN (1) CN109669867B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN112783677A (en) * | 2019-11-04 | 2021-05-11 | 北京京东尚科信息技术有限公司 | Method and device for monitoring service abnormity

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
KR20090122665A * | 2008-05-26 | 2009-12-01 | 대진대학교 산학협력단 | A test system for handset applications using test libraries and the method thereof
CN103902429A * | 2012-12-26 | 2014-07-02 | 北京新媒传信科技有限公司 | Early warning method, sever and system in automated test
CN105718369A * | 2016-01-19 | 2016-06-29 | 国家电网公司 | Computer software test abnormity processing system and test method
CN106549824A * | 2016-10-09 | 2017-03-29 | 武汉斗鱼网络科技有限公司 | A kind of system and method for test software and server connective stability
CN107704398A * | 2017-11-01 | 2018-02-16 | 网易(杭州)网络有限公司 | Automated testing method and device, storage medium, electronic equipment
CN108009087A * | 2017-11-29 | 2018-05-08 | 广州品唯软件有限公司 | Data library test method, device and computer-readable recording medium
CN108052451A * | 2017-12-26 | 2018-05-18 | 网易(杭州)网络有限公司 | Test method, system, test server, test terminal and storage medium
CN108519943A * | 2018-03-06 | 2018-09-11 | 平安科技(深圳)有限公司 | Testing and control and test execution device, method and computer storage media

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于脚本的流程自适应自动化测试研究 (Research on script-based process-adaptive automated testing); 王游; 冯曙明; 何金陵; 方泉; 计算机与现代化 (Computer and Modernization); 2015-02-15 (02); 98-103 *

Also Published As

Publication number | Publication date
CN109669867A (en) | 2019-04-23

Similar Documents

Publication Publication Date Title
US9824002B2 (en) Tracking of code base and defect diagnostic coupling with automated triage
US11544137B2 (en) Data processing platform monitoring
CN107870860B (en) Buried point verification system and method
WO2019071891A1 (en) Code coverage analysis method and application server
JP7289334B2 (en) Methods and apparatus, electronic devices, storage media and computer programs for testing code
CN112286806B (en) Automatic test method and device, storage medium and electronic equipment
CN111198769A (en) Information processing method and system, computer system and computer readable medium
CN109783324B (en) System operation early warning method and device
US20170192840A1 (en) Computer device error instructions
CN110647471A (en) Interface test case generation method, electronic device and storage medium
CN108255735B (en) Associated environment testing method, electronic device and computer readable storage medium
CN110740071B (en) Method, device and system for monitoring network interface
CN105955838A (en) System halt reason check method and device
CN111309743A (en) Report pushing method and device
CN109669867B (en) Test apparatus, automated test method, and computer-readable storage medium
US20160085664A1 (en) Generating a fingerprint representing a response of an application to a simulation of a fault of an external service
CN107908525B (en) Alarm processing method, equipment and readable storage medium
CN116431731A (en) Data asynchronous export method, device, equipment and storage medium thereof
US20230066698A1 (en) Compute instance warmup operations
CN113378180A (en) Vulnerability detection method and device, computer equipment and readable storage medium
CN113326421B (en) Data identification method and device for record carrier, electronic equipment and storage medium
CN113377719B (en) System abnormal shutdown time acquisition method and system
CN116483566A (en) Resource processing method and device for server, electronic equipment and storage medium
CN116414718A (en) Adjustable interface method embedded in integrated development environment and computer equipment
CN113849291A (en) Container cluster-based task processing method, device, equipment, medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant