CN111177005A - Service application testing method, device, server and storage medium - Google Patents

Service application testing method, device, server and storage medium

Info

Publication number
CN111177005A
Authority
CN
China
Prior art keywords
test
response message
preset
service application
plug
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911409550.8A
Other languages
Chinese (zh)
Other versions
CN111177005B (en)
Inventor
朱晓峰
王宇超
陈志�
陈小秀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bank of China Ltd
Original Assignee
Bank of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bank of China Ltd filed Critical Bank of China Ltd
Priority to CN201911409550.8A priority Critical patent/CN111177005B/en
Priority claimed from CN201911409550.8A external-priority patent/CN111177005B/en
Publication of CN111177005A publication Critical patent/CN111177005A/en
Application granted granted Critical
Publication of CN111177005B publication Critical patent/CN111177005B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis

Abstract

Embodiments of the present application provide a method, an apparatus, a server, and a storage medium for testing a service application. The method includes: receiving and responding to a test request for a target service application, and invoking a test plug-in to generate test message data according to a preset test case file, wherein the test plug-in comprises an Excel-based plug-in that is logically controlled through the VBA language; determining, through the test plug-in and according to the preset test case file, a target server on which the target service application is deployed, and sending the test message data to the target server; receiving, through the test plug-in, test response message data fed back by the target server for the test message data, and comparing the test response message data against the preset test case file to obtain a comparison processing result; and determining a test result for the target service application according to the comparison processing result. In this way, user operations are simplified, and service applications can be tested automatically and efficiently.

Description

Service application testing method, device, server and storage medium
Technical Field
The present application relates to the field of service data processing technologies, and in particular, to a method and an apparatus for testing a service application, a server, and a storage medium.
Background
In many business processing scenarios, for example online banking transactions, a bank's server may publish associated business applications to the outside in order to make online transactions convenient for users and to provide more comprehensive business services. Examples include a mobile banking APP, or a small application integrated in another APP that implements a specific business service function, such as an applet embedded in a shopping APP for querying account balances online.
Before a bank's server releases a specific service application online, it first tests the service application to be released; the service application is formally released online only after the test passes, so that users can download and use it. If the test fails, the service application is temporarily withheld from release.
There is a need for a method for automatically and efficiently testing business applications.
Disclosure of Invention
The embodiments of the present application provide a method, an apparatus, a server, and a storage medium for testing a service application, so as to solve the technical problems of existing methods, namely that the testing process is tedious and complex, the testing efficiency is low, and automated testing cannot be achieved, and to achieve the technical effects of simplifying user operations and testing service applications automatically and efficiently.
The embodiment of the application provides a method for testing service application, which comprises the following steps:
receiving and responding to a test request for a target service application, and invoking a test plug-in to generate test message data according to a preset test case file, wherein the test plug-in comprises an Excel-based plug-in that is logically controlled through the VBA language;
determining a target server with a target service application deployed according to the preset test case file through a test plug-in, and sending the test message data to the target server;
receiving test response message data fed back by a target server aiming at the test message data through a test plug-in, and comparing the test response message data according to the preset test case file to obtain a corresponding comparison processing result;
and determining a test result aiming at the target service application according to the comparison processing result.
In one embodiment, the preset test case file includes at least one of: a request message, a standard response message for the request message, an IP address of the target server on which the target service application is deployed, and service port data.
In one embodiment, invoking and generating test message data according to a preset test case file includes:
analyzing and extracting field information of a plurality of characteristic fields contained in the request message in the preset test case file;
and generating the test message data in a first Sheet page according to the field information of the plurality of characteristic fields contained in the request message.
In one embodiment, the preset test case file further includes: preset identification information indicating the characteristic fields to be compared.
In one embodiment, comparing the test response message data according to the preset test case file to obtain a corresponding comparison processing result includes:
according to the identification information, extracting field information of the same characteristic field from the test response message data and the standard response message respectively to carry out similarity comparison to obtain corresponding comparison results;
and generating a corresponding comparison processing result according to the comparison result.
In one embodiment, generating a corresponding comparison processing result according to the comparison result includes:
and generating a record list file as a corresponding comparison processing result when the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is determined to be greater than or equal to a preset similarity threshold according to the comparison result.
In one embodiment, generating a corresponding comparison processing result according to the comparison result further includes:
and generating a problem list file as a corresponding comparison processing result when the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is determined to be smaller than a preset similarity threshold according to the comparison result.
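The record-list/problem-list logic of these embodiments can be sketched in ordinary code. The patent implements it inside Excel through VBA; the Python below is only an illustration, and the field names, the similarity measure (difflib's ratio), and the 0.9 threshold value are assumptions rather than details given in the text.

```python
from difflib import SequenceMatcher

def compare_response(test_response: dict, standard_response: dict,
                     fields_to_compare: list, threshold: float = 0.9):
    """Compare the field information of the same characteristic fields in a
    test response message against the standard response message.

    Returns ("record", similarities) when every compared field meets the
    similarity threshold, otherwise ("problem", similarities) -- mirroring
    the record-list / problem-list split described in the text.
    """
    similarities = {}
    all_pass = True
    for field in fields_to_compare:   # the preset identification information
        got = str(test_response.get(field, ""))
        want = str(standard_response.get(field, ""))
        sim = SequenceMatcher(None, got, want).ratio()
        similarities[field] = sim
        if sim < threshold:
            all_pass = False
    return ("record" if all_pass else "problem"), similarities

kind, detail = compare_response(
    {"return_code": "0000", "message": "login ok"},
    {"return_code": "0000", "message": "login ok"},
    ["return_code", "message"])
```

A "record" result would then be written out as the record list file, a "problem" result as the problem list file.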
In one embodiment, before receiving and responding to a test request for a target business application, the method comprises:
receiving and responding to a generation request for a preset test case file, and reading, through a test plug-in, a request message input by a user together with the IP address and service port data of the target server on which the target service application is deployed;
calling the test plug-in, and sending the request message to a target server according to the IP address and the service port data of the target server;
receiving a response message fed back by the target server aiming at the request message through the test plug-in;
receiving a judgment instruction aiming at the response message, and determining whether the response message is a standard response message meeting the preset requirement according to the judgment instruction;
and under the condition that the response message is determined to be a standard response message meeting the preset requirement, establishing and storing a preset test case file in a second Sheet page according to the request message, the standard response message, the IP address of the target server and the service port data.
In one embodiment, after determining a test result for the target service application according to the comparison processing result, the method further includes:
and adjusting the target service application and/or the target server according to the test result.
The embodiment of the present application further provides a device for testing a service application, including:
the generating module is used for receiving and responding to a test request aiming at the target service application, calling and generating test message data according to a preset test case file through a test plug-in, wherein the test plug-in comprises a plug-in which is based on Excel and is logically controlled through a VBA language;
the first determining module is used for determining, through a test plug-in and according to the preset test case file, a target server on which the target service application is deployed, and sending the test message data to the target server;
the comparison processing module is used for receiving test response message data fed back by the target server aiming at the test message data through the test plug-in, and comparing the test response message data according to the preset test case file to obtain a corresponding comparison processing result;
and the second determining module is used for determining a test result aiming at the target service application according to the comparison processing result.
The embodiment of the present application further provides a server, comprising a processor and a memory for storing instructions executable by the processor, wherein the processor, when executing the instructions: receives and responds to a test request for a target service application, and invokes a test plug-in to generate test message data according to a preset test case file, the test plug-in comprising an Excel-based plug-in logically controlled through the VBA language; determines, through the test plug-in and according to the preset test case file, a target server on which the target service application is deployed, and sends the test message data to the target server; receives, through the test plug-in, test response message data fed back by the target server for the test message data, and compares the test response message data according to the preset test case file to obtain a corresponding comparison processing result; and determines a test result for the target service application according to the comparison processing result.
The embodiment of the present application further provides a computer-readable storage medium storing computer instructions which, when executed, implement: receiving and responding to a test request for a target service application, and invoking a test plug-in to generate test message data according to a preset test case file, the test plug-in comprising an Excel-based plug-in logically controlled through the VBA language; determining, through the test plug-in and according to the preset test case file, a target server on which the target service application is deployed, and sending the test message data to the target server; receiving, through the test plug-in, test response message data fed back by the target server for the test message data, and comparing the test response message data according to the preset test case file to obtain a corresponding comparison processing result; and determining a test result for the target service application according to the comparison processing result.
In the embodiment of the application, a lightweight Excel-based plug-in logically controlled through the VBA language is introduced as the test plug-in to participate in testing the target service application. Specifically, when a test request for the target service application is received, the test plug-in can be invoked to automatically generate test message data according to a preset test case file; a target server on which the target service application is deployed is determined through the test plug-in according to the preset test case file, and the test message data is sent to that target server; test response message data fed back by the target server for the test message data is received through the test plug-in and compared against the preset test case file to obtain a corresponding comparison processing result; and a final test result is determined according to the comparison processing result. In this way, the technical problems of existing methods, namely a tedious and complex testing process, low testing efficiency, and the inability to test automatically, are solved, and the technical effects of simplifying user operations and testing the target service application automatically and efficiently are achieved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments described in the present application, and that those skilled in the art can obtain other drawings from these drawings without any creative effort.
FIG. 1 is a process flow diagram of a method for testing a business application provided in an embodiment of the present application;
fig. 2 is a flowchart illustrating a process of accumulating preset test case files according to a test method for business applications provided in an embodiment of the present application;
fig. 3 is a block diagram of a test apparatus for a service application provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a server component of a test method for a service application provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of applying the test method for a service application provided in an embodiment of the present application in a specific scenario example;
FIG. 6 is a schematic diagram of applying the test method for a service application provided in an embodiment of the present application in a specific scenario example;
FIG. 7 is a schematic diagram of applying the test method for a service application provided in an embodiment of the present application in a specific scenario example.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
When a service application to be released online (such as an online transaction service application) is tested based on existing methods, third-party test software often has to be additionally introduced and installed, and only then can a specific online test of the service application be performed by relying on that software. Moreover, such third-party test software is often single-purpose and does not support extension, so automated testing of the service application tailored to specific test requirements cannot be performed. In addition, third-party test software tends to be relatively large, and easily burdens the servers and other devices participating in the test. Therefore, existing methods often suffer from the technical problems that the testing process is tedious and complex, the testing efficiency is low, and automated testing cannot be achieved.
Addressing the root cause of the above technical problems, the present application fully utilizes the existing functional characteristics of the conventional office software Excel, and introduces a lightweight Excel-based plug-in logically controlled through the VBA language as the test plug-in, replacing the third-party test software relied upon by existing methods, to participate in testing the target service application. Specifically, when a test request for the target service application is received, the test plug-in may be invoked to generate test message data according to a preset test case file; a target server on which the target service application is deployed is determined through the test plug-in according to the preset test case file, and the test message data is sent to that target server; test response message data fed back by the target server for the test message data is received through the test plug-in and compared against the preset test case file to obtain a comparison processing result, and a corresponding test result is finally determined according to the comparison processing result. In this way, the technical problems of existing methods, namely a tedious and complex testing process, low testing efficiency, and the inability to test automatically, are solved, and the technical effects of simplifying user operations and testing the target service application automatically and efficiently are achieved. Unlike third-party test software, the test plug-in supports extension and secondary development, and can be modified and adjusted as test requirements change.
Based on this idea, the embodiment of the application provides a method for testing a service application. Specifically, please refer to the processing flow chart of the method shown in FIG. 1. The method for testing a service application provided by the embodiment of the application may specifically be applied to a test server (or other relevant test equipment) in charge of service application testing. In particular implementations, the method may include the following.
S101: receiving and responding a test request aiming at the target service application, calling and generating test message data according to a preset test case file through a test plug-in, wherein the test plug-in comprises a plug-in which is based on Excel and is logically controlled through a VBA language.
In this embodiment, the target service application may be specifically understood as a service application to be tested. Specifically, the target business application may be a mobile phone APP or a computer application program to be tested, or may be a widget or an applet integrated in the mobile phone APP or the computer application program and providing some business services. Of course, the type of targeted business application listed above is merely illustrative. In specific implementation, the target service application may further include other types of service applications according to a specific application scenario. The present specification is not limited to these.
In this embodiment, the target service application may specifically be a mobile banking APP, a functional module integrated in a mobile banking APP for providing a financial recommendation service to users, or a small plug-in integrated in software such as WeChat or QQ for providing services such as online account balance inquiry and bank account login. Of course, the specific target business applications listed above are only illustrative. In specific implementation, target service applications with other content and functions may also be included according to the specific application scenario. Specifically, the target service application may handle services that need to be processed online, such as online transaction services or online query services.
In this embodiment, the test plug-in may be specifically understood as a plug-in tool for performing specific service application tests, which utilizes the existing functional characteristics of Excel (common office software supporting spreadsheet processing), performs logic control through Excel's own VBA (Visual Basic for Applications) language, and can support functions such as online communication. The plug-in may specifically be denoted sendData.exe. The test plug-in may be written in the C language (a procedural general-purpose programming language) and supports C-language development.
In this embodiment, the test request for the target service application may be specifically understood as request data used to request an online test of the target service application. The test request may specifically carry identification information of the target service application to be tested.
In this embodiment, the preset test case file may be specifically understood as a related file prepared in advance and used for performing the business application test. The preset test case file corresponds to the service application to be tested, and may carry identification information of the corresponding service application.
Specifically, the preset test case file may specifically include at least one of the following: the system comprises a request message, a standard response message aiming at the request message, an IP address and port data of a target server for deploying target service application to be tested and the like. Of course, it should be noted that the various data contained in the preset test case file listed above is only an exemplary illustration. In specific implementation, the preset test case file may further include other types of data according to a specific application scenario and a test requirement. For example, the preset test case file may further include preset identification information indicating a feature field to be subjected to comparison processing, and the like.
In this embodiment, the test server may obtain and record the preset test case file in Excel data, for example, a second Sheet page in the Excel data, through the test plug-in.
In this embodiment, the request packet may be specifically understood as data that is sent to the target server for processing by the target server, for the target service application. After receiving the request message, the target server generates and feeds back a corresponding response message according to the request message based on the specific content in the request message and the built-in rule of the target service application.
The request message may specifically include a plurality of fields. The preset test case file may record and store the request message by recording field information, such as the field name, field length, and field value, of each of the plurality of fields in the request message.
Specifically, for example, for a client login service application (a target service application to be tested), the corresponding request message may include two fields. Wherein, the field name of one field is 'user name', the field length is '20 bits', and the field value is 'Roc'. The other field has a field name of "user password", a field length of "20 bits", and a field value of "19990909".
Of course, it should be noted that the above listed request message is only an exemplary illustration. In specific implementation, the request message may also be data containing other contents according to a specific application scenario and a specific function of the target service application to be tested. The present specification is not limited to these.
In this embodiment, after receiving a test request for a target service application initiated by a user (e.g., a tester of the service application), the test server may determine, according to the identification information of the target service application, the preset test case file corresponding to that identification information as the preset test case file for testing the target service application. The preset test case file can then be called and obtained through the test plug-in, and test message data for testing can be generated according to it.
The test message data may be specifically understood as a request message initiated to a target server for testing a target service application.
In this embodiment, in specific implementation, invoking and generating the test message data according to the preset test case file may include the following: the test server extracts the request message from the preset test case file; performs data analysis on the request message, and extracts from it the field information of the plurality of characteristic fields contained in the request message; further, corresponding test message data can be generated in a first Sheet page in the Excel data according to the field information of the characteristic fields.
In this embodiment, the characteristic field may be specifically understood as a field that needs to be filled with a corresponding attribute characteristic value.
In this embodiment, in specific implementation, the test server may automatically fill the corresponding feature fields in the first Sheet page in the Excel data through the test plug-in according to the field information in the extracted feature fields, so that the test message data may be quickly obtained.
Specifically, for example, in the case that the target service application is a client login service application, the generated test message data may include the following contents: "user name: roc ", and" user password: 19990909".
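As an illustration of this generation step (which the patent performs in the first Sheet page of Excel through VBA), the Python sketch below assembles test message data from the per-field records described above. The record layout and the fixed-width padding are assumptions made for the example; they are not specified in the text.

```python
def build_test_message(case_fields):
    """Assemble test message data from the characteristic-field records
    (field name, field length, field value) stored in a preset test case
    file, as in the "user name" / "user password" example above."""
    message = {}
    for field in case_fields:
        value = field["value"]
        if len(value) > field["length"]:
            raise ValueError(f"value of {field['name']!r} exceeds the declared length")
        # pad each value to its declared fixed width (assumed convention)
        message[field["name"]] = value.ljust(field["length"])
    return message

msg = build_test_message([
    {"name": "user name", "length": 20, "value": "Roc"},
    {"name": "user password", "length": 20, "value": "19990909"},
])
```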
In this embodiment, there may specifically be a plurality of different preset test case files, where each preset test case file corresponds to one test situation, i.e., one test scenario. For example, preset test case file 1 includes a request message with the correct password entered and a standard response message for that request message representing a successful login; preset test case file 2 includes a request message with a wrong password entered and a standard response message for that request message representing a failed login. In a specific test, at least one round of testing can be performed for one test situation (or one test scenario) using one preset test case file.
In this embodiment, the preset test case file may specifically be obtained in advance through the test plug-in and recorded in the Excel data. Specifically, the test server may receive and respond to a generation request initiated by a tester for a preset test case file, and read, through the test plug-in, a request message input by the user (e.g., the tester) together with the IP address and service port data of the target server on which the target service application is deployed. The test plug-in can then be invoked to send the request message to the target server according to the target server's IP address and service port data, and to receive the response message fed back by the target server for the request message. In addition, a judgment instruction for the response message can be received through the test plug-in, and whether the response message is a standard response message meeting the preset requirement is determined according to that judgment instruction. When the response message is determined to be a standard response message meeting the preset requirement, a preset test case file is created and stored in a second Sheet page in the Excel data according to the request message, the standard response message, and the IP address and service port data of the target server. In this way, that preset test case file is created and recorded.
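The case-recording flow just described can be sketched as follows. The patent stores cases in a second Sheet page of Excel via VBA; here a plain dictionary keyed by a hypothetical case id stands in for that Sheet, and the field names are illustrative.

```python
def record_test_case(request_message, response_message, ip, port,
                     is_standard: bool, store: dict, case_id: str) -> bool:
    """Create and store a preset test case, but only when the operator's
    judgment instruction marks the response as a standard response meeting
    the preset requirement."""
    if not is_standard:
        return False   # response rejected: no test case is recorded
    store[case_id] = {
        "request": request_message,
        "standard_response": response_message,
        "ip": ip,
        "port": port,
    }
    return True

cases = {}
ok = record_test_case({"user name": "Roc"}, {"result": "login ok"},
                      "10.0.0.1", 9000, True, cases, "case-login-success")
```

Repeating this for each test situation accumulates the plurality of preset test case files described below.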
In this embodiment, preset test case files corresponding to a plurality of test situations or test scenarios may be obtained and recorded in the above manner, so that a plurality of preset test case files for testing may be accumulated in Excel data before a target service is tested.
S102: and determining a target server with a target service application deployed according to the preset test case file through a test plug-in, and sending the test message data to the target server.
In this embodiment, in specific implementation, the test server may determine, by the test plug-in, a server deployed with the target service application according to the IP address, the service port data, and the like of the target server recorded in the preset test case file, and use the server as the target server participating in the target service application test.
In this embodiment, in specific implementation, the test server may first establish a communication connection with the target server through the test plug-in according to the IP address and the service port data of the target server.
In this embodiment, after the communication connection is established, the test server may further detect whether the established communication connection with the target server is normally connected through the test plug-in.
Specifically, the test server may send a test signal to the target server through the test plug-in via the communication connection, and determine whether the communication connection is normal by detecting whether a reply signal fed back by the target server for the test signal is received through the connection within a preset waiting time.
And under the condition that the communication connection is determined to be normally communicated, the test server can send the test message data to the target server through the communication connection.
And under the condition that the communication connection is determined not to meet the normal communication, prompt information can be generated so as to adjust the communication connection between the test plug-in and the target server according to the prompt information, so that the communication connection between the test plug-in and the target server returns to the normal communication condition.
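The connectivity check described in this step can be sketched as follows. This is an illustrative sketch only: the patent's actual test plug-in is implemented in Excel/VBA, and the function name, the plain-TCP echo exchange, and the default values here are assumptions made for illustration.

```python
import socket

def check_connection(ip, port, test_signal=b"PING", timeout=3.0):
    """Send a test signal to the target server over the communication
    connection and wait up to `timeout` seconds (the preset waiting time)
    for a reply signal; report whether the connection is normal."""
    try:
        with socket.create_connection((ip, port), timeout=timeout) as conn:
            conn.sendall(test_signal)
            reply = conn.recv(1024)  # reply signal fed back by the target server
            return len(reply) > 0
    except OSError:
        # No connection, or no reply within the preset waiting time:
        # prompt information would be generated upstream so the
        # connection can be adjusted.
        return False
```

A `False` result corresponds to the case where prompt information is generated and the connection must be adjusted before test message data is sent.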
S103: and receiving test response message data fed back by the target server aiming at the test message data through the test plug-in, and comparing the test response message data according to the preset test case file to obtain a corresponding comparison processing result.
In this embodiment, after receiving the test packet data, the target server may obtain specific content included in the test packet data through data analysis. Further, the test server may perform corresponding data processing according to specific content included in the test packet data based on a processing logic or rule of the target application, and further generate test response packet data for the test packet data.
The test response message data may be specifically understood as a response message for testing, directed at the test message data, which is obtained by performing corresponding data processing on the specific content contained in the test message data based on the processing logic or rules of the target service application.
Specifically, for example, in the case that the target service application is a client login service application, the target server may first, according to the processing logic of the service application, take the "user name: Roc" contained in the test message data, query a user information database, and find the user password matching "Roc"; the "user password" contained in the test message data may then be matched against the queried user password.
If the match passes, test response message data characterizing a successful login may be generated, for example containing a "return code: 0000000" field indicating that the password match succeeded, and a "client name: Wangceng" field.
If the match does not pass, test response message data characterizing a login failure may be generated, for example containing a "return code: ERROR" field indicating that the password match failed, and an "error information: password error" field.
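The login-processing logic of this example can be sketched as follows. Python is used purely for illustration (the patent does not specify the target service application's implementation); the `USER_DB` table and the `handle_login` name are hypothetical stand-ins, while the field names and return codes come from the example above.

```python
# Hypothetical stand-in for the user information database queried by the
# target service application.
USER_DB = {"Roc": "19990909"}

def handle_login(request):
    """Apply the client login service's processing logic to a request
    message and build the corresponding test response message data."""
    expected = USER_DB.get(request["user name"])
    if expected is not None and expected == request["user password"]:
        # Password match passed: success return code plus the client name.
        return {"return code": "0000000", "client name": "Wangceng"}
    # Password match failed: ERROR return code plus an error information field.
    return {"return code": "ERROR", "error information": "password error"}
```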
In this embodiment, after the target server generates corresponding test response packet data according to the specific content included in the test packet data in the manner described above, the test response packet data may be sent to the test plug-in through the communication connection between the test plug-in and the target server.
In this embodiment, the test server may receive and obtain test response message data fed back by the target server for the test message data through the test plug-in.
In this embodiment, after obtaining the test response message data, the test server may compare the test response message data, through the test plug-in, according to the preset test case file.
In this embodiment, in specific implementation, the test server may extract the corresponding standard response message from the preset test case file through the test plug-in. Further, according to the preset identification information indicating the characteristic fields to be compared, the field information of those characteristic fields may be extracted from the standard response message and the test response message data respectively. The field information of the same characteristic field extracted from the standard response message and from the test response message data may then be compared for similarity to obtain a corresponding comparison result, and a corresponding comparison processing result may be generated according to the comparison result.
In this embodiment, in specific implementation, if it is determined according to the comparison result that the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is greater than or equal to a preset similarity threshold, it may be determined that the target service application of the target server fed back the standard response message corresponding to the test message data as expected, indicating that the target service application of the target server passed this round of testing; at this time, a record list file may be generated as the corresponding comparison processing result. The record list file may be specifically understood as data which records the preset test case file that passed the test and identifies that the target service application of the target server tested successfully against that preset test case file.
In this embodiment, in specific implementation, if it is determined according to the comparison result that the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is smaller than the preset similarity threshold, it may be determined that the target service application of the target server did not feed back the standard response message corresponding to the test message data as expected, indicating that the target service application of the target server failed this round of testing; at this time, a problem list file may be generated as the corresponding comparison processing result. The problem list file may be specifically understood as data which records the preset test case file that failed the test and identifies that the target service application of the target server tested unsuccessfully against that preset test case file.
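The field-by-field similarity comparison and the pass/fail classification described above can be sketched as follows. This is an illustrative Python sketch (the plug-in itself is Excel/VBA-based); `difflib.SequenceMatcher` is assumed as one possible similarity measure, and the function name and default threshold are assumptions.

```python
from difflib import SequenceMatcher

def compare_response(test_msg, standard_msg, fields, threshold=0.9):
    """Compare, for each characteristic field named by the preset
    identification information, the field information extracted from the
    test response message data and from the standard response message;
    classify the round as a record list file (pass) or a problem list
    file (fail) against the preset similarity threshold."""
    comparison = {}
    for field in fields:
        comparison[field] = SequenceMatcher(
            None, test_msg.get(field, ""), standard_msg.get(field, "")
        ).ratio()
    passed = all(score >= threshold for score in comparison.values())
    return ("record list file" if passed else "problem list file"), comparison
```

Returning the per-field scores alongside the classification mirrors the field-by-field comparison recorded in the first Sheet page, so a tester can see which field caused a failure.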
In this embodiment, in specific implementation, the comparison processing result may be recorded in the Excel data. Specifically, the problem list file or record list file obtained through the test may be recorded and stored in the first Sheet page, so that the test response message data and the standard response message in the preset test case file can be conveniently compared field by field directly in the first Sheet page, allowing a user (e.g., a tester) to know, in a timely and clear manner, the differences between the test response message data that failed the test and the standard response message.
S104: and determining a test result aiming at the target service application according to the comparison processing result.
In this embodiment, in specific implementation, the test server may determine the overall test condition of the target service application according to the comparison result, so as to obtain a test result corresponding to the target service application.
In this embodiment, specifically, for example, the test server may synthesize a plurality of comparison processing results obtained based on a plurality of preset test case files, and determine that the target service application on the target server tested successfully for some test situations (or test scenarios), for example the case where the password is input correctly, while testing unsuccessfully for other test situations (or test scenarios), for example the case where the password is input incorrectly. The overall test condition of the target service application on the target server may then be evaluated by combining the above situations to obtain a final test result.
In this embodiment, after determining the test result for the target service application according to the comparison processing result, when the method is implemented specifically, the following may be further included: and adjusting the target service application and/or the target server according to the test result.
In this embodiment, the test result may be used as a reference guide to specifically perform corresponding adjustment or debugging on the target service application and the server deploying the target service application, so that the test can be passed when the subsequent test is performed based on the adjusted target service application and the adjusted target server.
In this embodiment, in the case that the target service application of the target server is determined, according to the test result, to have tested successfully in every test situation or test scenario, the target service application of the target server may be determined to be reliable and to meet the requirements; preparations may then be made to formally release the target service application online for users to download and use.
In the embodiments of the present application, compared with existing methods, a lightweight plug-in based on Excel and logically controlled through the VBA language is introduced and utilized as the test plug-in to participate in the test of the target service application, making full use of the existing functions and characteristics of Excel. Specifically, when a test request for the target service application is received, test message data is called up and generated through the test plug-in according to the preset test case file; the target server on which the target service application is deployed is determined through the test plug-in according to the preset test case file, and the test message data is sent to that target server; test response message data fed back by the target server for the test message data is received through the test plug-in and compared according to the preset test case file to obtain a corresponding comparison processing result; finally, the corresponding test result is determined according to the comparison processing result. This solves the technical problems of the existing methods, in which the test process for a target service application is cumbersome and complex, test efficiency is low, and automatic testing cannot be realized, and achieves the technical effects of simplifying user operation and testing the target service application automatically and efficiently.
In an embodiment, the preset test case file may specifically include at least one of: the system comprises a request message, a standard response message aiming at the request message, an IP address and service port data of a target server deployed with target service application and the like. Of course, the data contained in the preset test case file listed above is only an illustrative example. In specific implementation, the preset test case file may further include data of other types and contents according to specific situations and processing requirements.
In an embodiment, the invoking and generating test message data according to a preset test case file may include the following steps: analyzing and extracting field information of a plurality of characteristic fields contained in the request message in the preset test case file; and generating the test message data in a first Sheet page according to the field information of the plurality of characteristic fields contained in the request message.
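The extraction-and-assembly step above can be sketched as follows: an illustrative Python stand-in for the VBA plug-in logic, in which the row layout and the `|`-joined message format are assumptions made for illustration.

```python
def build_test_message(case_file_row):
    """Extract the field information of the characteristic fields contained
    in the request message recorded in one preset test case file row, and
    assemble the test message data from it."""
    request_fields = case_file_row["request message"]
    # In the plug-in the result would be generated in the first Sheet page;
    # here the fields are simply joined into one message string.
    return "|".join(f"{name}={value}" for name, value in request_fields.items())
```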
In an embodiment, the preset test case file may specifically include: and preset identification information indicating the characteristic fields to be compared and the like.
In an embodiment, the comparing of the test response message data according to the preset test case file to obtain a corresponding comparison processing result may, in specific implementation, include the following: according to the identification information, extracting field information of the same characteristic field from the test response message data and the standard response message respectively and performing similarity comparison to obtain a corresponding comparison result; and generating a corresponding comparison processing result according to the comparison result.
In an embodiment, the generating a corresponding comparison processing result according to the comparison result may include the following steps: and generating a record list file as a corresponding comparison processing result when the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is determined to be greater than or equal to a preset similarity threshold according to the comparison result.
In an embodiment, the generating a corresponding comparison processing result according to the comparison result may further include the following steps in specific implementation: and generating a problem list file as a corresponding comparison processing result when the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is determined to be smaller than a preset similarity threshold according to the comparison result.
In one embodiment, before receiving and responding to the test request for the target business application, the method, when specifically implemented, may further include the following steps to accumulate preset test case files for the target business application test, as shown in fig. 2.
S1: receiving and responding to a generation request for a preset test case file, and reading, through the test plug-in, a request message input by a user, together with the IP address and service port data of the target server on which the target service application is deployed.
S2: and calling the test plug-in, and sending the request message to a target server according to the IP address and the service port data of the target server.
S3: and receiving a response message fed back by the target server aiming at the request message through the test plug-in.
S4: receiving a judgment instruction aiming at the response message, and determining whether the response message is a standard response message meeting the preset requirement according to the judgment instruction.
S5: and under the condition that the response message is determined to be a standard response message meeting the preset requirement, establishing and storing a preset test case file in a second Sheet page according to the request message, the standard response message, the IP address of the target server and the service port data.
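Steps S1 through S5 can be sketched together as follows: an illustrative Python sketch (the plug-in itself is Excel/VBA-based) in which the send and judgment operations are injected as functions, and an in-memory `case_store` list stands in for the second Sheet page.

```python
def accumulate_case(request_msg, ip, port, send_fn, judge_fn, case_store):
    """Steps S1-S5: send the request message to the target server (S2),
    receive the response message (S3), apply the judgment instruction (S4),
    and, if the response is a standard response message meeting the preset
    requirement, create and store a preset test case file entry (S5)."""
    response_msg = send_fn(request_msg, ip, port)
    if judge_fn(response_msg):
        case_store.append({
            "request message": request_msg,
            "standard response message": response_msg,
            "ip": ip,
            "service port": port,
        })
        return True
    return False
```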
In this embodiment, the judgment instruction may come from a judgment server or from a technician. The judgment server or technician may evaluate the content, format, and other dimensions of the response message according to preset judgment rules generated from the processing logic or rules of the target service application, to determine whether the response message is the correct, expected response to the previously sent request message, i.e., a standard response message meeting the preset requirement.
In this embodiment, when it is determined that the response packet is a standard response packet meeting the preset requirement, the standard response packet, a request packet corresponding to the standard response packet, an IP address of a target server deployed with a target service application, and service port data may be combined to obtain a preset test case file corresponding to a test situation or a test scenario based on the request packet.
In this embodiment, the preset test case file may be further recorded and stored in the second Sheet page of the Excel data, which may facilitate the calling and use of the test plug-in.
In an embodiment, after determining the test result for the target service application according to the comparison processing result, when the method is implemented, the following may be further included: and adjusting the target service application and/or the target server according to the test result.
In this embodiment, in specific implementation, in order to enable the target service application that is finally released online to operate normally as expected, parameters in the target service application may be modified and adjusted in a targeted manner according to the test result obtained through the service application test; the target server on which the target service application is deployed may likewise be debugged in a targeted manner according to the test result, so that the adjusted target service application deployed on the target server can accurately process, as expected, the request messages sent by users and feed back responses, giving users a better experience.
From the above description, it can be seen that the service application testing method provided in the embodiments of the present application introduces and utilizes a lightweight plug-in based on Excel and logically controlled through the VBA language as the test plug-in to participate in the test of the target service application, making full use of the existing functions of Excel. Specifically, when a test request for the target service application is received, test message data is called up and generated through the test plug-in according to the preset test case file; the target server on which the target service application is deployed is determined through the test plug-in according to the preset test case file, and the test message data is sent to that target server; test response message data fed back by the target server for the test message data is received through the test plug-in and compared according to the preset test case file to obtain a corresponding comparison processing result; finally, the corresponding test result is determined according to the comparison processing result. This solves the technical problems of the existing methods, in which the test process is cumbersome and complex, test efficiency is low, and automatic testing cannot be realized, and achieves the technical effects of simplifying user operation and testing the target service application automatically and efficiently. Because the lightweight Excel-based, VBA-controlled plug-in is introduced as the test plug-in to realize the test of the specific service application, third-party test software need not be introduced into the test process, avoiding any impact of third-party test software on the security of the service data; at the same time, using such a lightweight test plug-in can reduce the processing burden of the system and further improve test efficiency.
Moreover, the test plug-in is used to acquire, according to the request message input by the user and the IP address and service port data of the target server on which the target service application is deployed, the response message fed back by the target server for the request message; in the case that the response message is determined to be a standard response message meeting the preset requirement, a preset test case file is created and stored according to the request message, the standard response message, and the IP address and service port data of the target server. In this way, the accumulation of test cases can be realized automatically and efficiently, further simplifying user operation and improving test efficiency.
Based on the same inventive concept, the embodiment of the present application further provides a testing apparatus for service applications, as described in the following embodiments. Because the principle of the test device for the service application for solving the problems is similar to the test method for the service application, the implementation of the test device for the service application can refer to the implementation of the test method for the service application, and repeated details are not repeated. As used hereinafter, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated. Please refer to fig. 3, which is a structural diagram of a testing apparatus for business applications provided in an embodiment of the present application, where the apparatus may specifically include: the generating module 301, the first determining module 302, the comparing processing module 303, and the second determining module 304 are described in detail below.
The generating module 301 may be specifically configured to receive and respond to a test request for a target service application, and call and generate test packet data according to a preset test case file through a test plug-in, where the test plug-in includes a plug-in based on Excel and performing logic control through VBA language;
the first determining module 302 is specifically configured to determine, by a test plug-in according to the preset test case file, a target server deployed with a target service application, and send the test packet data to the target server;
the comparison processing module 303 may be specifically configured to receive test response packet data, which is fed back by the target server for the test packet data, through the test plug-in, and compare the test response packet data according to the preset test case file to obtain a corresponding comparison processing result;
the second determining module 304 may be specifically configured to determine a test result for the target service application according to the comparison processing result.
In an embodiment, the preset test case file may specifically include at least one of: the system comprises a request message, a standard response message aiming at the request message, an IP address and service port data of a target server deployed with target service application and the like.
In one embodiment, in order to invoke and generate test message data according to a preset test case file, in a specific implementation, the generation module 301 may be specifically configured to analyze and extract field information of a plurality of feature fields included in a request message in the preset test case file; and generating the test message data in a first Sheet page according to the field information of the plurality of characteristic fields contained in the request message.
In an embodiment, the preset test case file may specifically include: and preset identification information indicating the characteristic fields to be compared and the like.
In one embodiment, in order to perform comparison processing on the response message data according to the preset test case file to obtain a corresponding comparison processing result, in a specific implementation, the comparison processing module 303 may be specifically configured to extract field information of the same characteristic field from the test response message data and the standard response message respectively according to the identification information to perform similarity comparison, so as to obtain a corresponding comparison result; and generating a corresponding comparison processing result according to the comparison result.
In an embodiment, when specifically implemented, the comparison processing module 303 may further generate a record list file as the corresponding comparison processing result in the case that it is determined, according to the comparison result, that the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is greater than or equal to a preset similarity threshold.
In an embodiment, when specifically implemented, the comparison processing module 303 may further generate a problem list file as the corresponding comparison processing result in the case that it is determined, according to the comparison result, that the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is smaller than a preset similarity threshold.
In an embodiment, the apparatus may further include an accumulation module, which may be specifically configured to receive and respond to a preset request for generating a test case file, read, through a test plug-in, a request packet input by a user, and an IP address and service port data of a target server to which a target service application is deployed; calling the test plug-in, and sending the request message to a target server according to the IP address and the service port data of the target server; receiving a response message fed back by the target server aiming at the request message through the test plug-in; receiving a judgment instruction aiming at the response message, and determining whether the response message is a standard response message meeting the preset requirement according to the judgment instruction; and under the condition that the response message is determined to be a standard response message meeting the preset requirement, establishing and storing a preset test case file in a second Sheet page according to the request message, the standard response message, the IP address of the target server and the service port data.
In an embodiment, the apparatus may further include an adjusting module, which may be specifically configured to adjust the target service application and/or the target server according to the test result.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
It should be noted that, the systems, devices, modules or units described in the above embodiments may be implemented by a computer chip or an entity, or implemented by a product with certain functions. For convenience of description, in the present specification, the above devices are described as being divided into various units by functions, and are described separately. Of course, the functionality of the units may be implemented in one or more software and/or hardware when implementing the present application.
Moreover, in the subject specification, adjectives such as first and second may only be used to distinguish one element or action from another element or action without necessarily requiring or implying any actual such relationship or order. References to an element or component or step (etc.) should not be construed as limited to only one of the element, component, or step, but rather to one or more of the element, component, or step, etc., where the context permits.
From the above description, it can be seen that the service application testing apparatus provided in the embodiments of the present application introduces and utilizes a lightweight plug-in based on Excel and logically controlled through the VBA language as the test plug-in to participate in the test of the target service application, making full use of the existing functions of Excel. Specifically, when a test request for the target service application is received, the generation module calls up and generates test message data through the test plug-in according to a preset test case file; the first determining module determines, through the test plug-in according to the preset test case file, the target server on which the target service application is deployed, and sends the test message data to that target server; the comparison processing module receives, through the test plug-in, test response message data fed back by the target server for the test message data and compares it according to the preset test case file to obtain a corresponding comparison processing result; finally, the second determining module determines the corresponding test result according to the comparison processing result. This solves the technical problems of the existing methods, in which the test process is cumbersome and complex, test efficiency is low, and automatic testing cannot be realized, and achieves the technical effects of simplifying user operation and testing the target service application automatically and efficiently.
The embodiment of the present application further provides a server, for which reference may be made to the schematic structural diagram, shown in fig. 4, of a server based on the service application testing method provided in the embodiment of the present application; the server may specifically include an input device 41, a processor 42, and a memory 43. The input device 41 may be specifically used for receiving a test request for a target service application. The processor 42 may be specifically configured to receive and respond to the test request for the target service application, and call up and generate test message data through a test plug-in according to a preset test case file, where the test plug-in includes a plug-in based on Excel and logically controlled through the VBA language; determine, through the test plug-in according to the preset test case file, a target server on which the target service application is deployed, and send the test message data to the target server; receive, through the test plug-in, test response message data fed back by the target server for the test message data, and compare the test response message data according to the preset test case file to obtain a corresponding comparison processing result; and determine a test result for the target service application according to the comparison processing result. The memory 43 may be specifically used for storing the corresponding instruction program.
In this embodiment, the input device may be one of the main apparatuses for information exchange between a user and a computer system. The input device may include a keyboard, a mouse, a camera, a scanner, a light pen, a handwriting input board, a voice input device, etc.; the input device is used to input raw data and a program for processing the data into the computer. The input device can also acquire and receive data transmitted by other modules, units and devices. The processor may be implemented in any suitable way. For example, the processor may take the form of, for example, a microprocessor or processor and a computer-readable medium that stores computer-readable program code (e.g., software or firmware) executable by the (micro) processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, an embedded microcontroller, and so forth. The memory may in particular be a memory device used in modern information technology for storing information. The memory may include multiple levels, and in a digital system, the memory may be any memory as long as it can store binary data; in an integrated circuit, a circuit without a physical form and with a storage function is also called a memory, such as a RAM, a FIFO and the like; in the system, the storage device in physical form is also called a memory, such as a memory bank, a TF card and the like.
In this embodiment, the functions and effects specifically realized by the server can be explained in comparison with other embodiments, and are not described herein again.
An embodiment of the present application further provides a computer storage medium of a test method based on a service application, where the computer storage medium stores computer program instructions, and when the computer program instructions are executed, the computer program instructions implement: receiving and responding a test request aiming at a target service application, calling and generating test message data according to a preset test case file through a test plug-in, wherein the test plug-in comprises a plug-in which is based on Excel and is logically controlled through a VBA language; determining a target server with a target service application deployed according to the preset test case file through a test plug-in, and sending the test message data to the target server; receiving test response message data fed back by a target server aiming at the test message data through a test plug-in, and comparing the test response message data according to the preset test case file to obtain a corresponding comparison processing result; and determining a test result aiming at the target service application according to the comparison processing result.
In the present embodiment, the storage medium includes, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Cache (Cache), a Hard disk (HDD), or a Memory Card (Memory Card). The memory may be used to store computer program instructions. The network communication unit may be an interface for performing network connection communication, which is set in accordance with a standard prescribed by a communication protocol.
In this embodiment, the functions and effects specifically realized by the program instructions stored in the computer storage medium can be explained by comparing with other embodiments, and are not described herein again.
In a specific scenario example, the service application testing method provided in the embodiments of the present application may be applied as follows: a B server (i.e., a testing server) is used to test a client login system (i.e., a target service application) deployed on an A server (i.e., a target server). The specific test procedure is described below. The B server is provided with a plug-in based on the Excel tool (i.e., a test plug-in).
In this scenario, the A server has its own IP address and service port. The B server can send a socket request message to the service port of the A server; after receiving the request message, the A server performs its logic processing and returns a corresponding response message to the B server.
For the client login system used for testing, the format of the request message can be expressed as: user name (20 characters) + user password (20 characters) (corresponding to a plurality of fields), and the content of a corresponding request message can be expressed as: everwit + 19990909 (the field values corresponding to those fields).
Usually, after receiving the request message, the A server first performs verification: if the user name and password are correct, it returns the client information for a successful login; otherwise, it returns a login failure.
Specifically, the format of the response message indicating a successful login returned by the A server may be expressed as: return code (7 characters; 0000000 represents success) + client name (20 characters) + client login count (8 characters) + total client online duration (8 characters); correspondingly, the content of a response message indicating a successful login can be expressed as: 0000000 + Roc + 15 + 513. The format of the response message returned on failure, indicating a login failure, may be expressed as: error code (7 characters) + error information (100 characters), and the content of a corresponding response message indicating a login failure can be expressed as: ERROR01 + "the user name does not exist or the password is incorrect".
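The fixed-width layouts above can be sketched briefly. The sketch below is illustrative Python (the actual tool plug-in is implemented in Excel/VBA); the field tables and helper names are hypothetical, with widths in characters taken from the formats given above:

```python
# Hypothetical sketch of the fixed-width message layouts described above.
# Widths are in characters, per the formats given in the text.

REQUEST_FIELDS = [("username", 20), ("password", 20)]
SUCCESS_FIELDS = [("return_code", 7), ("client_name", 20),
                  ("login_count", 8), ("online_seconds", 8)]

def build_message(fields, values):
    """Left-align each value and pad it to its fixed width."""
    return "".join(str(values[name]).ljust(width) for name, width in fields)

def parse_message(fields, raw):
    """Slice a raw message back into a {field: value} dict."""
    out, pos = {}, 0
    for name, width in fields:
        out[name] = raw[pos:pos + width].strip()
        pos += width
    return out

request = build_message(REQUEST_FIELDS,
                        {"username": "everwit", "password": "19990909"})
assert len(request) == 40  # 20 + 20 characters

response = build_message(SUCCESS_FIELDS,
                         {"return_code": "0000000", "client_name": "Roc",
                          "login_count": 15, "online_seconds": 513})
parsed = parse_message(SUCCESS_FIELDS, response)
```

Parsing a received response back into named fields in this way is what allows the later per-field comparison step to skip dynamic fields such as the login count.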
As a tester, at least two situations (corresponding to two test cases, or test scenarios) need to be tested during a specific test: one is whether a response message indicating a successful login is returned when the user name and password are correct; the other is whether a response message indicating a login failure is returned when the user name or password is incorrect.
When testing the first situation, the tester inputs a correct user name and password. If the server returns a response message indicating a successful login, the case meets expectations; at this point the tester chooses to save the successful case in Excel, and the tool plug-in installed on the B server automatically saves the IP address and port of the A server, together with the format and content of the request message and of the response message (as the standard response message for this test scenario), into a file (obtaining a preset test case file corresponding to this test situation or test scenario). Note that because the login count and total online duration in the response message returned after a successful login change dynamically, these two fields need to be set as not-to-be-compared (i.e., no to-be-compared identification information is set for them), while the return code and client name have fixed content and therefore need to be set as to-be-compared (i.e., to-be-compared identification information is set for them).
Similarly, when testing the second situation, the tester inputs an incorrect user name or password. If the server returns the error code ERROR01, the case meets expectations; the tester then chooses to save the successful case in Excel, and the tool plug-in installed on the B server automatically saves the IP address and port of the A server, together with the format and content of the request message and of the response message (as the standard response message for this test scenario), into a file (obtaining another preset test case file). Note that both fields of the returned message have fixed content and therefore need to be set as to-be-compared.
Case accumulation can be completed in the above manner (two different preset test case files corresponding to the two test scenarios are obtained).
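The case accumulation step can be illustrated with a minimal sketch. This is hedged Python for illustration only (the actual plug-in stores cases as text files via Excel/VBA); the JSON layout, file name, and helper name are assumptions, but the per-field to-be-compared flags mirror the description above:

```python
import json

# Hypothetical sketch of case accumulation: one saved case bundles the target
# server address, the request message, the standard response message, and a
# per-field "compare" flag. Dynamic fields (login count, online duration) are
# flagged False so they are skipped during comparison; fixed fields are True.
def save_case(path, ip, port, request, response_fields):
    case = {
        "ip": ip,
        "port": port,
        "request": request,
        # response_fields: list of (name, value, compare_flag)
        "response": [{"name": n, "value": v, "compare": c}
                     for n, v, c in response_fields],
    }
    with open(path, "w") as f:
        json.dump(case, f)

save_case("login_success.json", "10.0.0.1", 9000,
          {"username": "everwit", "password": "19990909"},
          [("return_code", "0000000", True),
           ("client_name", "Roc", True),
           ("login_count", "15", False),       # dynamic: not compared
           ("online_seconds", "513", False)])  # dynamic: not compared
```

Accumulating one such file per scenario gives the preset test case files that the automated execution phase later reads back.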
When a second round of testing is performed (that is, when the target service application is specifically tested), or when it needs to be verified after the next iterative development whether the original functions still work normally, the accumulated cases can be run through the second part of the tool plug-in's functionality: automated case execution (i.e., automated testing of the target service application).
First, the cases to be tested are selected and imported in the Excel tool: the two saved case names are imported into Excel and "Execute" is clicked. The tool plug-in then automatically reads the contents of the saved cases and sends the generated request messages to the A server. After a response message fed back by the A server is received, it is parsed into the individual fields of the response message, and the field contents that need to be compared are compared (i.e., comparison processing): the expected results saved in the case are compared with the field contents of the received response message. If they are completely consistent, the automatically executed case is marked as successfully executed and a record list is formed (i.e., a comparison processing result); otherwise, it is marked as failed and a question list is formed. The content of the record list is the sent message and the actual response message. The content of the question list is the sent message, the expected response message, and the actual response message, so that testers or developers can conveniently investigate problems according to the question list.
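The field comparison described above can be sketched as follows, in illustrative Python rather than the tool's VBA; the function name and data shapes are hypothetical, but the logic follows the text: only fields flagged as to-be-compared are checked, a fully consistent case yields a record-list entry, and any mismatch yields a question-list entry:

```python
# Hypothetical sketch of the comparison step: only fields flagged for
# comparison are checked against the expected (standard) response; a fully
# consistent case goes on the record list, any mismatch on the question list.
def compare_case(expected_fields, actual):
    """expected_fields: list of (name, expected_value, compare_flag);
    actual: {name: value} parsed from the received response."""
    mismatches = [(n, v, actual.get(n))
                  for n, v, flag in expected_fields
                  if flag and actual.get(n) != v]
    return ("record", []) if not mismatches else ("question", mismatches)

expected = [("return_code", "0000000", True),
            ("client_name", "Roc", True),
            ("login_count", "15", False)]  # dynamic field: skipped

# Dynamic field differs, but it is not compared, so the case still passes.
status, diffs = compare_case(expected,
                             {"return_code": "0000000",
                              "client_name": "Roc",
                              "login_count": "99"})
assert status == "record"

# A compared field differs: the case fails and the mismatch is reported.
status, diffs = compare_case(expected,
                             {"return_code": "ERROR01",
                              "client_name": "Roc",
                              "login_count": "15"})
```

The mismatch tuples (field name, expected value, actual value) correspond to the expected-versus-actual columns that the question list displays to testers.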
In this scenario example, when case accumulation is specifically performed, the tester fills in each field value of the request message in Excel and clicks the "Send" button; after the return message is received, the tool automatically parses the message into the individual fields of the response message. If the result meets expectations, the tester clicks the "Save case" button, and the tool plug-in saves the complete scenario of the case, including: the field names, field lengths, and field values of the request message; and the field names, field lengths, to-be-compared flags, and field values of the response message. One case scenario corresponds to one text file, and the case file name can be manually specified by the user. The specific implementation process can be seen in fig. 5.
The format of the generated automated test case file may include: total byte count (5 characters) + IP, port, byte count of the request message length field, delay in seconds, 8583 message header type, 8583 message type $$$$$$$ request message field 1 name || field 1 type || 8583 field number || 8583 field type || field 1 length || alignment type || padding character %%% related information of request message field 2 %%% ... ###### response message field 1 name || to-be-compared flag || field 1 type || field 1 length %%% response message field 2 name || to-be-compared flag || field 2 type || field 2 length %%% ...
The field separators are explained as follows. The separator between the IP/port section and the interface description: AddressAndFrameSep = "$$$$$$$"; the separator between the interface description and the sent message content: FrameAndSendSep = "&&&&&&&&&&&"; the separator between the send and receive parts within the interface description: FrameSeparator = "######"; the separator between the rows within the interface description: FrameLineSeparator = "%%%"; the separator between the individual fields within the interface description: FrameItemSeparator = "||".
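A minimal sketch of splitting a case file with these separators follows, in illustrative Python. The exact repetition counts of the separator characters are not fully legible in the source, so the constants below are assumptions, as are the function name and sample data:

```python
# Hypothetical sketch of splitting a saved case file with the named separators.
# The repetition counts of the separator characters are assumptions here;
# FrameAndSendSep (between interface description and sent content) is omitted
# for brevity.
ADDRESS_AND_FRAME_SEP = "$$$$$$$"  # ip/port section vs. interface description
FRAME_SEPARATOR       = "######"   # send part vs. receive part
FRAME_LINE_SEPARATOR  = "%%%"      # between rows (fields) of a message
FRAME_ITEM_SEPARATOR  = "||"       # between items within one row

def parse_case_text(text):
    header, frame = text.split(ADDRESS_AND_FRAME_SEP, 1)
    send_part, recv_part = frame.split(FRAME_SEPARATOR, 1)
    def rows(part):
        return [row.split(FRAME_ITEM_SEPARATOR)
                for row in part.split(FRAME_LINE_SEPARATOR) if row]
    return header, rows(send_part), rows(recv_part)

sample = ("10.0.0.1,9000" + ADDRESS_AND_FRAME_SEP +
          "username||CHAR||20" + FRAME_LINE_SEPARATOR +
          "password||CHAR||20" + FRAME_SEPARATOR +
          "return_code||Y||CHAR||7")
header, send_rows, recv_rows = parse_case_text(sample)
```

Splitting first on the coarse separators and then on the row and item separators reproduces the nested structure of the case file: address header, request field rows, and response field rows (with their to-be-compared flags).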
In this scenario example, the execution of a specific automated case may include the following. The "Import cases" button is clicked, and the tool plug-in imports the saved case names in the specified folder (the cases accumulated in step 1) into the Sheet to be executed (i.e., the second Sheet page). The "Start execution" button is then clicked, and the tool executes the automated cases one by one and judges the results: if a result meets expectations, the case execution is displayed as successful; otherwise it is displayed as failed, and a record sheet or a question sheet is generated accordingly. Whether a received message meets expectations is judged by whether it is identical to the message saved in the case in the fields set as to-be-compared. The specific implementation process can be seen in fig. 6.
In this scenario example, the generated record sheet and question sheet are saved as Excel files. Specifically, the upper half of the sheet (i.e., corresponding to the first Sheet page) contains the field descriptions and field values of the sent message, and the lower half contains the field descriptions of the response message together with the actual values and expected result values, which are displayed side by side for comparison. If a field needs to be compared and its values differ, the field is marked in red, visually showing the testers which fields failed the comparison and therefore why the automated case failed. The specific implementation process can be seen in fig. 7.
In this scenario example, the tool plug-in can complete the accumulation of automated test cases in Excel, including the organization, sending, receiving, and display of messages and the storage of their fields (case accumulation). It can also complete the execution of automated test cases in Excel, including case import, case execution, display of execution results (success/failure), comparison of actual and expected results, case modification, and generation of record sheets and question sheets.
The tool plug-in used in the method is developed based on Excel and the VBA language, and has the advantages of requiring no installation, being lightweight, and supporting secondary development. It is simple to operate: the execution of automated cases, the judgment of test results, and the automatic generation of record sheets and question sheets can all be completed with one click.
This scenario example verifies that, by introducing a lightweight Excel-based plug-in logically controlled through the VBA language as the test plug-in for testing the target service application, the existing capabilities of Excel are fully utilized. Specifically, when a test request for the target service application is received, the test plug-in is invoked to generate test message data according to a preset test case file; the test plug-in determines, according to the preset test case file, the target server on which the target service application is deployed, and sends the test message data to that server; the test plug-in then receives the test response message data fed back by the target server for the test message data and compares it using the preset test case file to obtain a comparison processing result, from which the corresponding test result is finally determined. This approach can solve the technical problems of existing methods, in which the testing process for a target service application is cumbersome and complicated, test efficiency is low, and automated testing cannot be achieved, and can truly achieve the technical effects of simplifying user operations and testing the target service application automatically and efficiently.
Although various specific embodiments are mentioned in the disclosure of the present application, the application is not limited to the situations described by industry standards or in the examples; implementations slightly modified from industry standards, or from the implementations described in a custom manner or in the examples, can also achieve the same, equivalent, or similar implementation effects, or effects that are predictable after such modifications. Embodiments employing such modified or transformed data acquisition, processing, output, or determination methods may still fall within the scope of alternative implementations of the present application.
Although the present application provides method steps as described in the embodiments or flowcharts, more or fewer steps may be included based on conventional or non-inventive means. The order of steps recited in the embodiments is merely one of many ways of executing the steps and does not represent the only order of execution. When an actual apparatus or client product executes, it may execute sequentially or in parallel according to the methods shown in the embodiments or figures (e.g., in a parallel-processor or multi-threaded environment, or even in a distributed data processing environment). The terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the presence of additional identical or equivalent elements in a process, method, article, or apparatus that comprises the recited elements is not excluded.
The devices or modules and the like explained in the above embodiments may be specifically implemented by a computer chip or an entity, or implemented by a product with certain functions. For convenience of description, the above devices are described as being divided into various modules by functions, and are described separately. Of course, in implementing the present application, the functions of each module may be implemented in one or more pieces of software and/or hardware, or a module that implements the same function may be implemented by a combination of a plurality of sub-modules, and the like. The above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and other divisions may be realized in practice, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed.
Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer readable program code, the same functionality can be implemented by logically programming method steps such that the controller is in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Such a controller may therefore be considered as a hardware component, and the means included therein for performing the various functions may also be considered as a structure within the hardware component. Or even means for performing the functions may be regarded as being both a software module for performing the method and a structure within a hardware component.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, classes, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, or the like, and includes several instructions for enabling a computer device (which may be a personal computer, a mobile terminal, a server, or a network device) to execute the method according to the embodiments or some parts of the embodiments of the present application.
The embodiments in the present specification are described in a progressive manner, and the same or similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. The application is operational with numerous general purpose or special purpose computing system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable electronic devices, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
While the present application has been described by way of embodiments, those of ordinary skill in the art will appreciate that numerous variations and modifications of the present application are possible without departing from its spirit, and it is intended that the appended claims cover such variations and modifications.

Claims (12)

1. A method for testing a business application, comprising:
receiving and responding to a test request for a target service application, and invoking a test plug-in to generate test message data according to a preset test case file, wherein the test plug-in comprises an Excel-based plug-in logically controlled through the VBA language;
determining, through the test plug-in, a target server on which the target service application is deployed according to the preset test case file, and sending the test message data to the target server;
receiving, through the test plug-in, test response message data fed back by the target server for the test message data, and comparing the test response message data according to the preset test case file to obtain a corresponding comparison processing result; and
determining a test result for the target service application according to the comparison processing result.
2. The method of claim 1, wherein the preset test case file comprises at least one of: a request message, a standard response message for the request message, an IP address of the target server on which the target service application is deployed, and service port data.
3. The method of claim 2, wherein invoking and generating test message data according to a preset test case file comprises:
analyzing and extracting field information of a plurality of characteristic fields contained in the request message in the preset test case file;
and generating the test message data in a first Sheet page according to the field information of the plurality of characteristic fields contained in the request message.
4. The method of claim 2, wherein the preset test case file further comprises: and preset identification information indicating the characteristic fields to be compared.
5. The method of claim 4, wherein comparing the response message data according to the preset test case file to obtain a corresponding comparison result comprises:
according to the identification information, extracting field information of the same characteristic field from the test response message data and the standard response message respectively to carry out similarity comparison to obtain corresponding comparison results;
and generating a corresponding comparison processing result according to the comparison result.
6. The method of claim 5, wherein generating a corresponding comparison processing result according to the comparison result comprises:
and generating a record list file as a corresponding comparison processing result when the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is determined to be greater than or equal to a preset similarity threshold according to the comparison result.
7. The method of claim 6, wherein generating a corresponding comparison processing result according to the comparison result further comprises:
and generating a problem list file as a corresponding comparison processing result when the similarity of the field information of the same characteristic field extracted from the test response message data and the standard response message is determined to be smaller than a preset similarity threshold according to the comparison result.
8. The method of claim 1, wherein before receiving and responding to a test request for a target business application, the method comprises:
receiving and responding to a generation request for a preset test case file, and reading, through a test plug-in, a request message input by a user together with the IP address and service port data of a target server on which a target service application is deployed;
calling the test plug-in, and sending the request message to a target server according to the IP address and the service port data of the target server;
receiving a response message fed back by the target server aiming at the request message through the test plug-in;
receiving a judgment instruction aiming at the response message, and determining whether the response message is a standard response message meeting the preset requirement according to the judgment instruction;
and under the condition that the response message is determined to be a standard response message meeting the preset requirement, establishing and storing a preset test case file in a second Sheet page according to the request message, the standard response message, the IP address of the target server and the service port data.
9. The method according to claim 1, wherein after determining a test result for the target service application according to the comparison processing result, the method further comprises:
and adjusting the target service application and/or the target server according to the test result.
10. A device for testing a business application, comprising:
the generating module is used for receiving and responding to a test request aiming at the target service application, calling and generating test message data according to a preset test case file through a test plug-in, wherein the test plug-in comprises a plug-in which is based on Excel and is logically controlled through a VBA language;
the first determining module is used for determining, through a test plug-in, a target server on which a target service application is deployed according to the preset test case file, and sending the test message data to the target server;
the comparison processing module is used for receiving test response message data fed back by the target server aiming at the test message data through the test plug-in, and comparing the test response message data according to the preset test case file to obtain a corresponding comparison processing result;
and the second determining module is used for determining a test result aiming at the target service application according to the comparison processing result.
11. A server comprising a processor and a memory for storing processor-executable instructions, wherein the processor, when executing the instructions, performs the steps of the method of any one of claims 1 to 9.
12. A computer-readable storage medium having computer instructions stored thereon which, when executed, implement the steps of the method of any one of claims 1 to 9.
CN201911409550.8A 2019-12-31 Service application testing method, device, server and storage medium Active CN111177005B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911409550.8A CN111177005B (en) 2019-12-31 Service application testing method, device, server and storage medium


Publications (2)

Publication Number Publication Date
CN111177005A true CN111177005A (en) 2020-05-19
CN111177005B CN111177005B (en) 2024-04-16


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111586078A (en) * 2020-06-12 2020-08-25 上海通联金融服务有限公司 8583 message inline transaction simulation tool
CN111831561A (en) * 2020-06-28 2020-10-27 许昌开普检测研究院股份有限公司 Plug-in test case library system
CN111858379A (en) * 2020-07-30 2020-10-30 中国工商银行股份有限公司 Application testing method and device
CN112035530A (en) * 2020-09-17 2020-12-04 中国银行股份有限公司 Transaction message matching method and system in distributed real-time payment system
CN112165406A (en) * 2020-08-31 2021-01-01 苏宁云计算有限公司 Interface message testing method and device, computer equipment and storage medium
CN112416197A (en) * 2020-11-23 2021-02-26 山东师范大学 Excel and VBA-based program single-step demonstration device and method
CN112559316A (en) * 2020-09-03 2021-03-26 中国银联股份有限公司 Software testing method and device, computer storage medium and server
CN113110997A (en) * 2021-04-23 2021-07-13 中国工商银行股份有限公司 Test method, device and equipment
CN113225232A (en) * 2021-05-12 2021-08-06 中国第一汽车股份有限公司 Hardware testing method and device, computer equipment and storage medium
CN113312264A (en) * 2021-06-08 2021-08-27 中国农业银行股份有限公司 Business system test method, device, equipment, medium and product
CN114218556A (en) * 2021-12-23 2022-03-22 中国建设银行股份有限公司 Access authentication method, device, equipment, computer readable storage medium and product
CN114827310A (en) * 2022-04-26 2022-07-29 中国工商银行股份有限公司 Transaction clearing message transmission method and device
CN116991693A (en) * 2023-09-27 2023-11-03 宁波银行股份有限公司 Test method, device, equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120297367A1 (en) * 2011-05-19 2012-11-22 Verizon Patent And Licensing, Inc. Testing an application
CN104636254A (en) * 2015-01-16 2015-05-20 北京创毅视讯科技有限公司 Method and device for generating test case, testing method and testing equipment
CN106326108A (en) * 2016-08-09 2017-01-11 北京金山安全软件有限公司 New application testing method and device
CN106919503A (en) * 2016-11-15 2017-07-04 阿里巴巴集团控股有限公司 The method of testing and device of application program
CN107229481A (en) * 2017-07-19 2017-10-03 中国银行股份有限公司 A kind of testing method and tool based on Excel
CN107341109A (en) * 2017-07-07 2017-11-10 中国银行股份有限公司 The generation method and system of a kind of test data
CN107678935A (en) * 2017-05-10 2018-02-09 平安科技(深圳)有限公司 Data creation method, terminal and computer-readable recording medium
CN108282377A (en) * 2017-01-05 2018-07-13 菜鸟智能物流控股有限公司 Processing method and device for testing logistics service data and server
CN110414242A (en) * 2019-08-02 2019-11-05 中国工商银行股份有限公司 For detecting the method, apparatus, equipment and medium of service logic loophole


Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111586078A (en) * 2020-06-12 2020-08-25 上海通联金融服务有限公司 8583 message inline transaction simulation tool
CN111831561A (en) * 2020-06-28 2020-10-27 许昌开普检测研究院股份有限公司 Plug-in test case library system
CN111831561B (en) * 2020-06-28 2024-01-12 许昌开普检测研究院股份有限公司 Plug-in type test case library system
CN111858379A (en) * 2020-07-30 2020-10-30 中国工商银行股份有限公司 Application testing method and device
CN111858379B (en) * 2020-07-30 2024-03-29 中国工商银行股份有限公司 Application testing method and device
CN112165406A (en) * 2020-08-31 2021-01-01 苏宁云计算有限公司 Interface message testing method and device, computer equipment and storage medium
CN112165406B (en) * 2020-08-31 2022-09-20 苏宁云计算有限公司 Interface message testing method and device, computer equipment and storage medium
CN112559316A (en) * 2020-09-03 2021-03-26 中国银联股份有限公司 Software testing method and device, computer storage medium and server
CN112035530A (en) * 2020-09-17 2020-12-04 中国银行股份有限公司 Transaction message matching method and system in distributed real-time payment system
CN112035530B (en) * 2020-09-17 2023-11-21 中国银行股份有限公司 Transaction message matching method and system in distributed real-time payment system
CN112416197B (en) * 2020-11-23 2021-12-03 山东师范大学 Excel and VBA-based program single-step demonstration device and method
CN112416197A (en) * 2020-11-23 2021-02-26 山东师范大学 Excel and VBA-based program single-step demonstration device and method
CN113110997A (en) * 2021-04-23 2021-07-13 中国工商银行股份有限公司 Test method, device and equipment
CN113225232B (en) * 2021-05-12 2022-06-10 中国第一汽车股份有限公司 Hardware testing method and device, computer equipment and storage medium
CN113225232A (en) * 2021-05-12 2021-08-06 中国第一汽车股份有限公司 Hardware testing method and device, computer equipment and storage medium
CN113312264A (en) * 2021-06-08 2021-08-27 中国农业银行股份有限公司 Business system test method, device, equipment, medium and product
CN114218556A (en) * 2021-12-23 2022-03-22 中国建设银行股份有限公司 Access authentication method, device, equipment, computer readable storage medium and product
CN114827310A (en) * 2022-04-26 2022-07-29 中国工商银行股份有限公司 Transaction clearing message transmission method and device
CN114827310B (en) * 2022-04-26 2024-01-05 中国工商银行股份有限公司 Transaction clearing message transmission method and device
CN116991693A (en) * 2023-09-27 2023-11-03 宁波银行股份有限公司 Test method, device, equipment and storage medium
CN116991693B (en) * 2023-09-27 2023-12-26 宁波银行股份有限公司 Test method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN107656874B (en) Interface testing method and device, simulation baffle and system
CN110221982B (en) Performance test method, device and equipment of business system and readable storage medium
CN110427331B (en) Method for automatically generating performance test script based on interface test tool
CN107436844B (en) Method and device for generating interface use case aggregate
CN108628748B (en) Automatic test management method and automatic test management system
CN108255725B (en) Test method and device
CN109815112B (en) Data debugging method and device based on functional test and terminal equipment
CN111797026A (en) Test case generation method and device, computer equipment and storage medium
CN110955409A (en) Method and device for creating resources on cloud platform
CN114546738A (en) Server general test method, system, terminal and storage medium
CN108809896A (en) A kind of information calibration method, device and electronic equipment
CN112395182A (en) Automatic testing method, device, equipment and computer readable storage medium
CN108280024B (en) Flow distribution strategy testing method and device and electronic equipment
CN111061637B (en) Interface testing method, interface testing device and storage medium
CN117499287A (en) Web testing method, device, storage medium and proxy server
CN112612706A (en) Automated testing method, computer device and storage medium
CN112181822A (en) Test method and test method for starting time consumption of application program
CN111177005B (en) Service application testing method, device, server and storage medium
CN107797917B (en) Performance test script generation method and device
CN111177005A (en) Service application testing method, device, server and storage medium
CN113407444B (en) Interface test case generation method, device, equipment and storage medium
CN110532186B (en) Method, device, electronic equipment and storage medium for testing by using verification code
CN113986747A (en) Data generation method and device, electronic equipment and storage medium
CN111597101A (en) SDK access state detection method, computer device and computer readable storage medium
CN113704123B (en) Interface testing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant