CN117493203A - Method, device, equipment and storage medium for testing server software

Info

Publication number
CN117493203A
Authority
CN
China
Prior art keywords: test, user, platform, software, task
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311528339.4A
Other languages
Chinese (zh)
Inventor
郝思敏
秦晓宁
陈颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nettrix Information Industry Beijing Co Ltd
Original Assignee
Nettrix Information Industry Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nettrix Information Industry Beijing Co Ltd
Priority to CN202311528339.4A
Publication of CN117493203A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3692 Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a server software testing method, apparatus, device, and storage medium. The method includes: after detecting that a user has logged in to an automated test platform, acquiring a test task created by the user in the platform, where the test task includes software to be tested, a server corresponding to the software to be tested, and a test plan; acquiring the test items corresponding to the test plan and the test scripts corresponding to the test items, and executing the test scripts at the test trigger time configured by the user; and counting the execution parameters corresponding to the test task according to the execution results of the test scripts, and generating a test report for the test task from those parameters. The technical solution of the embodiments of the invention provides a way to test server software automatically, reduces the human and time resources consumed by the server software testing process, and improves both the testing efficiency of the server software and the accuracy of the test results.

Description

Method, device, equipment and storage medium for testing server software
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for testing server software.
Background
With the wide application of server software, it is particularly important to test the quality and stability of the server software periodically.
In existing server software testing methods, the server software code is generally tested manually, which is prone to omissions and errors; an automated test platform can greatly improve the coverage and accuracy of the testing process.
Many automated test platforms exist for desktop, mobile, and Web applications, such as the automated test tools Selenium, JMeter, and LoadRunner. However, because the test items and test methods for desktop, mobile, and Web applications differ significantly from those for server software, these tools cannot directly meet the automated testing requirements of server software.
Disclosure of Invention
The invention provides a method, an apparatus, a device, and a storage medium for testing server software, which provide a way to test server software automatically, reduce the human and time resources consumed by the server software testing process, and improve both the testing efficiency of the server software and the accuracy of the test results.
According to an aspect of the present invention, there is provided a server software testing method, including:
after detecting that a user logs in an automatic test platform, acquiring a test task created in the platform by the user;
the test task comprises software to be tested, at least one server corresponding to the software to be tested, and a test plan matched with the software to be tested and the server;
acquiring at least one test item corresponding to the test plan and a test script corresponding to each test item, and executing each test script according to the test trigger time configured by a user;
and counting the execution parameters corresponding to the test tasks according to the execution results corresponding to the test scripts, and generating a test report corresponding to the test tasks according to the execution parameters.
Optionally, according to the execution result corresponding to each test script, counting the execution parameters corresponding to the test task, including:
after the execution of all the test scripts is finished, counting the total number of the test scripts, the number of the test scripts successfully executed, the total number of the test items and the number of the completed test items;
the generating a test report corresponding to the test task according to the execution parameters comprises the following steps:
and generating a test report corresponding to the test task according to the total number of test scripts, the number of test scripts successfully executed, the total number of test items and the number of completed test items.
Optionally, the test plan includes test cases matched with the software to be tested and the server;
obtaining at least one test item corresponding to the test plan, including:
acquiring at least one test item corresponding to the software to be tested according to the test case in the test plan by using a case management tool deployed in an automatic test platform;
the test types corresponding to the test items comprise a functional test, a performance test and a safety test.
Optionally, while executing each test script, the method further includes:
displaying the testing progress corresponding to the testing task to a user through a visual interface;
after generating the test report corresponding to the test task according to the execution parameter, the method further comprises:
and displaying the test report corresponding to the test task to a user through a visual interface.
Optionally, after detecting that the user logs in the automated test platform, acquiring a test task created by the user includes:
after detecting that a user logs in an automatic test platform, identifying the identity of the user according to login information of the user;
and if the user belongs to the tester, acquiring the test task created by the user.
Optionally, after detecting that the user logs in the automated test platform, before acquiring the test task created by the user, the method further includes:
after detecting that an administrator logs in an automatic test platform, responding to personnel authority configuration information triggered by the administrator, and dividing users corresponding to the automatic test platform into test personnel and non-test personnel according to the personnel authority configuration information;
responding to the version updating information of the testing tool triggered by the administrator, and updating the testing tool deployed in the automatic testing platform according to the version updating information of the testing tool.
Optionally, the front-end framework of the automated test platform is built on the React framework and the JavaScript programming language, and the back-end framework of the platform is built on the Python language and the Django framework;
the automated test platform integrates the code and document management tool GitLab and the code version management tool SVN for managing the versions of the test tools deployed in the platform.
According to another aspect of the present invention, there is provided a server software testing apparatus, the apparatus comprising:
the task creation module is used for acquiring a test task created by a user in the automatic test platform after detecting that the user logs in the automatic test platform;
the test task comprises software to be tested, at least one server corresponding to the software to be tested, and a test plan matched with the software to be tested and the server;
the script execution module is used for acquiring at least one test item corresponding to the test plan and a test script corresponding to each test item, and executing each test script according to the test trigger time configured by a user;
and the report generation module is used for counting the execution parameters corresponding to the test tasks according to the execution results corresponding to the test scripts and generating a test report corresponding to the test tasks according to the execution parameters.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the server software testing method of any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute a method for testing server software according to any one of the embodiments of the present invention.
According to the technical solution provided by this embodiment of the invention, after it is detected that a user has logged in to the automated test platform, the test task the user created in the platform is acquired; at least one test item corresponding to the test plan in the task, and the test script for each item, are obtained; each script is executed at the test trigger time configured by the user; the execution parameters of the task are counted from each script's execution result; and a test report is generated from those parameters. This provides an automated way to test server software, reduces the human and time resources the testing process consumes, and improves both testing efficiency and the accuracy of the results. It also lets users quickly find defects during server software development, improves software quality, and shortens delivery and release times; it improves the coverage of the server software testing process; it lets users operate the platform conveniently and quickly; it lets users intuitively view the test progress and test report of a task; and it keeps the platform's operation secure.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for describing the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a method for testing server software according to an embodiment of the present invention;
FIG. 2 is a flow chart of another method for testing server software provided in accordance with an embodiment of the present invention;
FIG. 3 is a flow chart of another method for testing server software provided in accordance with an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a server software testing device according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device implementing a server software testing method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a flowchart of a server software testing method according to an embodiment of the present invention. The method may be performed by a server software testing apparatus, which may be implemented in hardware and/or software and configured in an electronic device. As shown in Fig. 1, the method includes:
and 110, after detecting that the user logs in the automatic test platform, acquiring a test task created in the platform by the user.
In this embodiment, the automated test platform may be a Web application developed for server software testing. The automatic test platform is provided with a simple and easy-to-use man-machine interaction interface and a rich and complete test case library, so that the coverage rate of the server software test process can be enhanced, and the test efficiency and the accuracy of the test result are greatly improved. Meanwhile, the automatic test platform also supports a plurality of different test environments, including different operating systems, browsers, servers and the like, and has good universality and expansibility.
In a specific embodiment, if the user has a testing requirement for server software, the user may log in to the automated test platform through a web page on the computer device used to develop the software, and create test tasks in the platform.
In one implementation manner of this embodiment, the test task includes software to be tested, at least one server corresponding to the software to be tested, and a test plan matched with the software to be tested and the server.
Specifically, the user can add, delete, or modify the servers corresponding to the software to be tested through a host management module deployed in the automated test platform, and monitor the running state of each server. In addition, the user can create a test plan, for example by setting test environment parameters and test data, through a batch execution module deployed in the platform.
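For illustration only (this code is not part of the patent), a minimal Python sketch of the data such a test task might carry, based on the fields this embodiment names — software under test, target servers, test plan; all class and field names are assumptions:

```python
from dataclasses import dataclass, field


@dataclass
class TestPlan:
    test_cases: list[str]                                      # case IDs matched to the software and servers
    environment: dict[str, str] = field(default_factory=dict)  # test environment parameters
    test_data: dict[str, str] = field(default_factory=dict)    # test data sets


@dataclass
class TestTask:
    software_under_test: str    # the server software build to test
    servers: list[str]          # one or more target servers
    plan: TestPlan              # plan matched with the software and servers


task = TestTask(
    software_under_test="server-sw-2.1",
    servers=["10.0.0.11", "10.0.0.12"],
    plan=TestPlan(test_cases=["TC-001", "TC-002"]),
)
```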
Step 120, obtaining at least one test item corresponding to the test plan and test scripts corresponding to the test items, and executing the test scripts according to the test trigger time configured by the user.
In this embodiment, the test scripts corresponding to the different test items are pre-stored in the automated test platform. After the test items for the current test plan are obtained, the test script matching each item can be determined from the mapping between test items and test scripts pre-stored in the platform, and each script can then be executed at the test trigger time configured by the user.
The advantage of this arrangement is that the scripts matching a test task can be found quickly through the pre-stored mapping, which saves lookup time and improves the efficiency of server software testing.
In a specific embodiment, the test trigger time configured by the user may be a one-time timed trigger (e.g., at 16:00 on May 28, 2023), a daily timed trigger within a fixed period (e.g., at 16:00 every day from May 28, 2023 to June 15, 2023), or a trigger at a preset interval (e.g., once every 2 hours).
The advantage of this arrangement is that executing multiple test scripts at the user-configured trigger time enables parallel testing of batches of servers, saves the resources the testing process consumes, and makes the testing method more flexible, as sketched below.
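A purely illustrative Python sketch of this step: the mapping contents, script paths, server addresses, and helper names are assumptions, not identifiers from the platform.

```python
import subprocess
import time
from concurrent.futures import ThreadPoolExecutor
from datetime import datetime

# Pre-stored mapping from test item to test script (contents assumed).
ITEM_TO_SCRIPT = {
    "cpu_stress": "scripts/cpu_stress.py",
    "nic_throughput": "scripts/nic_throughput.py",
}
SERVERS = ["10.0.0.11", "10.0.0.12"]   # servers named in the test task


def run_script(script: str, server: str) -> bool:
    """Run one test script against one server; True means it passed."""
    result = subprocess.run(["python", script, "--server", server])
    return result.returncode == 0


def wait_until(trigger: datetime) -> None:
    """Block until a one-time timed trigger configured by the user."""
    delay = (trigger - datetime.now()).total_seconds()
    if delay > 0:
        time.sleep(delay)


wait_until(datetime(2023, 5, 28, 16, 0))   # e.g. fire at 16:00 on 2023-05-28
with ThreadPoolExecutor() as pool:         # test the batch of servers in parallel
    futures = {
        pool.submit(run_script, script, server): (item, server)
        for item, script in ITEM_TO_SCRIPT.items()
        for server in SERVERS
    }
    for fut, (item, server) in futures.items():
        print(f"{item} on {server}: {'pass' if fut.result() else 'fail'}")
```

An interval trigger (e.g., once every 2 hours) could wrap the same loop in a scheduler instead of the one-shot wait.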
Step 130, counting the execution parameters corresponding to the test task according to the execution results of the test scripts, and generating a test report for the task from those parameters.
In this embodiment, after each test script has finished executing, the number of scripts executed successfully, the number that failed, and so on (i.e., the execution parameters) can be counted from each script's execution result, and a test report can then be generated from those parameters to help the user quickly locate problems in the server software and optimize its performance.
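A hedged sketch of this counting step in Python; the result tuples and the definition of a "completed" item are assumptions for illustration:

```python
# Each entry: (test item, script path, whether the script passed).
results = [
    ("cpu_stress", "scripts/cpu_stress.py", True),
    ("cpu_stress", "scripts/cpu_stress_long.py", True),
    ("nic_throughput", "scripts/nic_throughput.py", False),
]

all_items = {item for item, _, _ in results}
failed_items = {item for item, _, ok in results if not ok}

execution_params = {
    "scripts_total": len(results),
    "scripts_passed": sum(ok for _, _, ok in results),
    "items_total": len(all_items),
    # here an item counts as completed once all of its scripts have passed
    "items_completed": len(all_items - failed_items),
}
print(execution_params)   # these counts feed the test report
```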
According to the technical solution provided by this embodiment of the invention, after the user logs in to the automated test platform, the test task created by the user in the platform is obtained; at least one test item corresponding to the test plan in the task, and the test scripts corresponding to those items, are obtained; each script is executed at the test trigger time configured by the user; the execution parameters of the task are counted from each script's execution result; and a test report for the task is generated from those parameters.
Fig. 2 is a flowchart of another method for testing server software according to an embodiment of the present invention, as shown in fig. 2, where the method includes:
step 210, after detecting that a user logs in an automatic test platform, obtaining a test task created in the platform by the user.
In this embodiment, the test task includes software to be tested, at least one server corresponding to the software to be tested, and a test plan matched with the software to be tested and the server. The test plan comprises test cases, test tools and the like which are matched with the software to be tested and the server.
In one implementation of this embodiment, the front-end framework of the automated test platform is built on the React framework and the JavaScript programming language, and the back-end framework is built on the Python language and the Django framework. The code and document management tools GitLab and SVN are integrated in the platform for managing the versions of the test tools deployed in it.
React is a JavaScript library for building user interfaces; by separating the platform's running state from its user interface (UI) components, it makes complex interactive interfaces easier to create, so users can operate the platform conveniently and quickly. Django is a powerful Web development framework that enables complex Web applications to be built rapidly. Using Django as the platform's back-end framework makes it easy to process data such as user information, test tasks, test results, and test cases, and to store them in a MySQL database.
In addition, the platform integrates GitLab, SVN, and other script repositories to manage the versions of the test tools deployed in it. This keeps the test tool versions in the platform consistent, avoids inconsistent test variables caused by version mismatches, and thereby improves the accuracy of the server software test results.
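As an illustrative sketch only — it needs a configured Django project to run, and the model and field names are assumptions rather than the platform's actual schema — the back end might persist tasks and results roughly like this:

```python
from django.db import models


class TestTask(models.Model):
    owner = models.CharField(max_length=64)        # login account of the tester
    software = models.CharField(max_length=128)    # software under test
    servers = models.JSONField(default=list)       # target server addresses
    trigger_time = models.DateTimeField()          # user-configured trigger
    created_at = models.DateTimeField(auto_now_add=True)


class TestResult(models.Model):
    task = models.ForeignKey(TestTask, on_delete=models.CASCADE)
    script = models.CharField(max_length=256)      # path of the executed script
    passed = models.BooleanField()
    finished_at = models.DateTimeField(auto_now=True)
```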
Step 220, acquiring at least one test item corresponding to the software to be tested, according to the test cases in the test plan, through a case management tool deployed in the automated test platform.
In this embodiment, after the test cases included in the test plan are obtained, at least one test item matching each case can be obtained through the case management tool pre-deployed in the platform, where the test types corresponding to the test items include functional, performance, and security tests.
The advantage of this arrangement is that it improves the coverage of the server software testing process, so users can find defects in the software in time during development and improve software quality quickly. A lookup sketch follows.
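A minimal sketch of this lookup, assuming a simple in-memory case table; the case IDs, item names, and helper function are illustrative:

```python
# Assumed case table: test case -> [(test item, test type), ...]
CASE_TO_ITEMS = {
    "TC-001": [("boot_check", "functional"), ("cpu_stress", "performance")],
    "TC-002": [("port_scan", "security")],
}


def items_for_plan(test_cases: list[str]) -> list[tuple[str, str]]:
    """Collect the (item, test type) pairs for every case in the plan."""
    items: list[tuple[str, str]] = []
    for case in test_cases:
        items.extend(CASE_TO_ITEMS.get(case, []))
    return items


print(items_for_plan(["TC-001", "TC-002"]))
```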
Step 230, obtaining test scripts corresponding to the test items, and executing the test scripts according to the test trigger time configured by the user.
Step 240, after execution of all test scripts is finished, counting the total number of test scripts, the number of test scripts successfully executed, the total number of test items and the number of completed test items.
Step 250, generating a test report corresponding to the test task according to the total number of test scripts, the number of test scripts successfully executed, the total number of test items and the number of completed test items.
In this embodiment, optionally, a test report in Excel or TXT format may be generated from the total number of test scripts, the number of scripts executed successfully, the total number of test items, and the number of completed test items, so that the user can quickly find defects during server software development, improve software quality, and shorten the software's delivery and release times.
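A sketch of the TXT variant with an assumed layout (an Excel report could be produced the same way with a spreadsheet library):

```python
def write_txt_report(path: str, params: dict[str, int]) -> None:
    """Write the counted execution parameters as a plain-text test report."""
    lines = [
        f"Total test scripts:   {params['scripts_total']}",
        f"Scripts passed:       {params['scripts_passed']}",
        f"Total test items:     {params['items_total']}",
        f"Items completed:      {params['items_completed']}",
    ]
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")


write_txt_report("test_report.txt", {
    "scripts_total": 12, "scripts_passed": 11,
    "items_total": 4, "items_completed": 4,
})
```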
According to the technical solution provided by this embodiment of the invention, after the user logs in to the automated test platform, the test task created by the user in the platform is obtained; at least one test item corresponding to the software to be tested is obtained from the test cases through the case management tool deployed in the platform; the test scripts corresponding to the items are obtained and executed at the test trigger time configured by the user; after all scripts have finished, the total number of scripts, the number executed successfully, the total number of test items, and the number of completed items are counted; and a test report is generated from these counts. This reduces the human resources consumed by server software testing, improves testing efficiency and the accuracy of the results, and shortens the software's delivery and release times.
Fig. 3 is a flowchart of another server software testing method according to an embodiment of the present invention. As shown in Fig. 3, the method includes:
step 310, after detecting that a user logs in an automatic test platform, identifying the identity of the user according to the login information of the user.
In this embodiment, the identity of the user may be identified by a system management module in the automated test platform. Specifically, the user's login account can be compared with the user information stored in the database to obtain the identification result; the database pre-stores the mappings between login accounts and identity information.
Step 320, if the user belongs to a tester, acquiring a test task created by the user.
The advantage of this arrangement is that identifying the user's identity after login is detected helps keep the operation of the automated test platform safe and stable.
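For illustration, assuming a pre-stored account-to-role mapping (the accounts, roles, and fetch_tasks helper are hypothetical):

```python
ACCOUNT_ROLES = {"alice": "tester", "bob": "non-tester"}   # assumed mapping


def fetch_tasks(account: str) -> list[str]:
    """Stand-in for the database query that loads the user's test tasks."""
    return [f"task created by {account}"]


def tasks_for_login(account: str) -> list[str]:
    """Only users identified as testers get their test tasks back."""
    if ACCOUNT_ROLES.get(account) != "tester":
        return []
    return fetch_tasks(account)


print(tasks_for_login("alice"))   # ['task created by alice']
print(tasks_for_login("bob"))     # []
```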
In one implementation of this embodiment, before detecting that the user logs in to the automated test platform, the method further includes: after detecting that an administrator has logged in to the platform, in response to personnel-permission configuration information triggered by the administrator, dividing the platform's users into testers and non-testers according to that information; and, in response to test tool version update information triggered by the administrator, updating the test tools deployed in the platform according to that information.
In this embodiment, specifically, an administrator can configure information such as user permissions, access control, and test tool versions through the system management module in the platform. The advantage of this arrangement is that it keeps the platform's operation secure and compliant.
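A sketch of how the permission-configuration information might be applied, with an assumed input format:

```python
def apply_permissions(config: dict[str, bool]) -> dict[str, list[str]]:
    """config maps each platform user to whether the admin granted tester rights."""
    roles: dict[str, list[str]] = {"testers": [], "non_testers": []}
    for user, is_tester in config.items():
        roles["testers" if is_tester else "non_testers"].append(user)
    return roles


print(apply_permissions({"alice": True, "bob": False}))
# {'testers': ['alice'], 'non_testers': ['bob']}
```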
Step 330, obtaining at least one test item corresponding to the test plan and test scripts corresponding to the test items, executing the test scripts according to the test trigger time configured by the user, and displaying the test progress corresponding to the test task to the user through a visual interface.
In this embodiment, while each test script is executing, the platform's current running state and the test progress can be displayed on a visual interface through a Dashboard module in the automated test platform.
Step 340, counting the execution parameters corresponding to the test task according to the execution results of the test scripts, and generating a test report for the task from those parameters.
Step 350, displaying the test report corresponding to the test task to the user through a visual interface.
In this step, specifically, the test report can be displayed on a visual interface through the Dashboard module in the automated test platform.
The advantage of this arrangement is that the user can intuitively view the test progress and test report of a task and thereby quickly resolve defects found during server software development. A minimal progress sketch follows.
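As a minimal sketch, the progress figure such a module might show is simply completed scripts over total scripts; the counters here are assumed inputs:

```python
def progress_percent(scripts_done: int, scripts_total: int) -> float:
    """Progress of a test task as a percentage of its scripts that finished."""
    if scripts_total == 0:
        return 0.0
    return round(100.0 * scripts_done / scripts_total, 1)


print(f"Test progress: {progress_percent(7, 12)}%")   # Test progress: 58.3%
```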
In a specific embodiment, after the test report corresponding to the test task is generated from the execution parameters, the report may also be uploaded to a preset storage location. Keeping the server software test reports lets users download them later for work handover, communication, and reference.
According to the technical solution provided by this embodiment of the invention, after it is detected that a user has logged in to the automated test platform, the user's identity is identified from the login information; if the user is a tester, the test task the user created, the test items, and the corresponding test scripts are obtained; each script is executed at the test trigger time configured by the user while the test progress is displayed on a visual interface; the execution parameters are counted from each script's execution result; and a test report is generated from those parameters and displayed to the user on a visual interface.
Fig. 4 is a schematic structural diagram of a server software testing apparatus according to an embodiment of the present invention; the apparatus is applied to an electronic device. As shown in Fig. 4, the apparatus includes: a task creation module 410, a script execution module 420, and a report generation module 430.
The task creation module 410 is configured to obtain a test task created by a user in an automated test platform after detecting that the user logs in the platform;
the test task comprises software to be tested, at least one server corresponding to the software to be tested, and a test plan matched with the software to be tested and the server;
the script execution module 420 is configured to obtain at least one test item corresponding to the test plan and a test script corresponding to each test item, and execute each test script according to a test trigger time configured by a user;
and the report generating module 430 is configured to count execution parameters corresponding to the test tasks according to execution results corresponding to the test scripts, and generate a test report corresponding to the test tasks according to the execution parameters.
According to the technical solution provided by this embodiment of the invention, after the user logs in to the automated test platform, the test task created by the user in the platform is obtained; at least one test item corresponding to the test plan in the task, and the test scripts corresponding to those items, are obtained; the scripts are executed at the test trigger time configured by the user; the execution parameters of the task are counted from each script's execution result; and a test report for the task is generated from those parameters.
On the basis of the above embodiment, the test plan includes test cases matched with the software to be tested and the server. The front-end framework of the automated test platform is built on the React framework and the JavaScript programming language; the back-end framework is built on the Python language and the Django framework. GitLab and SVN are integrated in the platform for managing the versions of the test tools deployed in it.
The apparatus further comprises:
the role division module is used for responding to personnel authority configuration information triggered by an administrator after detecting that the administrator logs in the automatic test platform, and dividing users corresponding to the automatic test platform into test personnel and non-test personnel according to the personnel authority configuration information;
and the test tool management module is used for responding to the version updating information of the test tool triggered by the administrator and updating the test tool deployed in the automatic test platform according to the version updating information of the test tool.
The task creation module 410 includes:
the identity identification unit is used for identifying the identity of the user according to the login information of the user after detecting that the user logs in the automatic test platform;
and the task acquisition unit is used for acquiring the test task created by the user if the user belongs to the tester.
The script execution module 420 includes:
the test item acquisition unit is used for acquiring at least one test item corresponding to the software to be tested according to the test cases in the test plan through a case management tool deployed in the automated test platform, where the test types corresponding to the test items comprise a functional test, a performance test, and a security test;
and the test progress display unit is used for displaying the test progress corresponding to the test task to a user through a visual interface while executing each test script.
The report generation module 430 includes:
the parameter statistics unit is used for counting the total number of test scripts, the number of test scripts successfully executed, the total number of test items and the number of completed test items after the execution of all the test scripts is finished;
the parameter processing unit is used for generating a test report corresponding to the test task according to the total number of test scripts, the number of test scripts successfully executed, the total number of test items and the number of completed test items;
and the report display unit is used for displaying the test report corresponding to the test task to a user through a visual interface.
The device can execute the method provided by all the embodiments of the invention, and has the corresponding functional modules and beneficial effects of executing the method. Technical details not described in detail in the embodiments of the present invention can be found in the methods provided in all the foregoing embodiments of the present invention.
Fig. 5 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, or microcontroller. The processor 11 performs the various methods and processes described above, such as the server software testing method.
In some embodiments, the server software testing method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the server software testing method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the server software testing method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special purpose or general purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so that the defects of high management difficulty and weak service expansibility in the traditional physical hosts and VPS service are overcome.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A method for testing server software, the method comprising:
after detecting that a user logs in an automatic test platform, acquiring a test task created in the platform by the user;
the test task comprises software to be tested, at least one server corresponding to the software to be tested, and a test plan matched with the software to be tested and the server;
acquiring at least one test item corresponding to the test plan and a test script corresponding to each test item, and executing each test script according to the test trigger time configured by a user;
and counting the execution parameters corresponding to the test tasks according to the execution results corresponding to the test scripts, and generating a test report corresponding to the test tasks according to the execution parameters.
2. The method according to claim 1, wherein counting the execution parameters corresponding to the test tasks according to the execution results corresponding to the test scripts comprises:
after the execution of all the test scripts is finished, counting the total number of the test scripts, the number of the test scripts successfully executed, the total number of the test items and the number of the completed test items;
the generating a test report corresponding to the test task according to the execution parameters comprises the following steps:
and generating a test report corresponding to the test task according to the total number of test scripts, the number of test scripts successfully executed, the total number of test items and the number of completed test items.
3. The method of claim 1, wherein the test plan includes test cases matching the software under test and a server;
obtaining at least one test item corresponding to the test plan, including:
acquiring at least one test item corresponding to the software to be tested according to the test case in the test plan by using a case management tool deployed in an automatic test platform;
the test types corresponding to the test items comprise a functional test, a performance test, and a security test.
4. The method of claim 1, wherein concurrently with executing each of the test scripts, further comprising:
displaying the testing progress corresponding to the testing task to a user through a visual interface;
after generating the test report corresponding to the test task according to the execution parameter, the method further comprises:
and displaying the test report corresponding to the test task to a user through a visual interface.
5. The method of claim 1, wherein after detecting that the user logs into the automated test platform, obtaining the user-created test task comprises:
after detecting that a user logs in an automatic test platform, identifying the identity of the user according to login information of the user;
and if the user belongs to the tester, acquiring the test task created by the user.
6. The method of claim 1, wherein after detecting that the user is logged into the automated test platform, prior to obtaining the user-created test task, further comprising:
after detecting that an administrator logs in an automatic test platform, responding to personnel authority configuration information triggered by the administrator, and dividing users corresponding to the automatic test platform into test personnel and non-test personnel according to the personnel authority configuration information;
responding to the version updating information of the testing tool triggered by the administrator, and updating the testing tool deployed in the automatic testing platform according to the version updating information of the testing tool.
7. The method of claim 1, wherein the front-end framework of the automated test platform is built based on the React framework and the JavaScript programming language; the back-end framework of the automated test platform is built based on the Python language and the Django framework;
the automatic test platform integrates a code and document management tool Gitlab and a code version management tool SVN, and is used for managing the version of the test tool deployed in the automatic test platform.
8. A server software testing apparatus, the apparatus comprising:
the task creation module is used for acquiring a test task created by a user in the automatic test platform after detecting that the user logs in the automatic test platform;
the test task comprises software to be tested, at least one server corresponding to the software to be tested, and a test plan matched with the software to be tested and the server;
the script execution module is used for acquiring at least one test item corresponding to the test plan and a test script corresponding to each test item, and executing each test script according to the test trigger time configured by a user;
and the report generation module is used for counting the execution parameters corresponding to the test tasks according to the execution results corresponding to the test scripts and generating a test report corresponding to the test tasks according to the execution parameters.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the server software testing method of any one of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to perform the server software testing method of any one of claims 1-7.
CN202311528339.4A 2023-11-16 2023-11-16 Method, device, equipment and storage medium for testing server software Pending CN117493203A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311528339.4A CN117493203A (en) 2023-11-16 2023-11-16 Method, device, equipment and storage medium for testing server software

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311528339.4A CN117493203A (en) 2023-11-16 2023-11-16 Method, device, equipment and storage medium for testing server software

Publications (1)

Publication Number Publication Date
CN117493203A 2024-02-02

Family

ID=89672452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311528339.4A Pending CN117493203A (en) 2023-11-16 2023-11-16 Method, device, equipment and storage medium for testing server software

Country Status (1)

Country Link
CN (1) CN117493203A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination