CN109165170B - A method and system for automated request testing - Google Patents

A method and system for automated request testing

Info

Publication number
CN109165170B
CN109165170B (application number CN201811205854.8A)
Authority
CN
China
Prior art keywords
test
test case
case
result data
business process
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811205854.8A
Other languages
Chinese (zh)
Other versions
CN109165170A (en)
Inventor
陈晓丽
范渊
龙文洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DBAPPSecurity Co Ltd
Original Assignee
DBAPPSecurity Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DBAPPSecurity Co Ltd filed Critical DBAPPSecurity Co Ltd
Priority to CN201811205854.8A priority Critical patent/CN109165170B/en
Publication of CN109165170A publication Critical patent/CN109165170A/en
Application granted granted Critical
Publication of CN109165170B publication Critical patent/CN109165170B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Prevention of errors by analysis, debugging or testing of software
    • G06F 11/3668 - Testing of software
    • G06F 11/3672 - Test management
    • G06F 11/3684 - Test management for test design, e.g. generating new test cases
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Prevention of errors by analysis, debugging or testing of software
    • G06F 11/3668 - Testing of software
    • G06F 11/3672 - Test management
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The present invention provides a method and system for automated request testing, relating to the technical field of software testing and applied to a test management server. The method includes: obtaining test cases to be tested; obtaining a test business process model and performing a test operation on each test case based on process parameters carried in the test business process model, where the process parameters include an execution sequence and/or execution conditions of the test cases; in the process of performing the test operations on the test cases, comparing the test result data of each test case with its corresponding expected result data; and if the comparison succeeds, performing the test operation on the next test case according to the test business process model. The invention ensures smooth execution of the tests of the business processes involved before the software goes online, achieving the technical effect of saving testing time on the business-process interfaces before the software goes online.

Description

Method and system for automated request testing
Technical Field
The invention relates to the technical field of software testing, and in particular to a method and a system for automated request testing.
Background
Software testing is an important part of the software development process. It is used to discover, as early as possible, the problems in a software product, that is, the parts inconsistent with user requirements or with the predefined specification, and to check the product for bugs; it runs through the entire software development life cycle and the process of verifying and validating the software product (including staged products).
The emergence of automated testing has saved cost and freed up manpower; it lets machines carry out repetitive work according to a plan, which spares testers the fatigue produced by repetitive labor and thereby improves testing efficiency.
In most existing automated testing technology, test cases are merely executed automatically to achieve fast, automated testing; the test results cannot be automatically compared and processed, and problematic nodes in a business-process chain of a business scenario cannot be interrupted in time or trigger an alarm, so errors or bugs are not reported to the relevant technicians promptly and the software development schedule is delayed.
Disclosure of Invention
In view of the above, an object of the present invention is to provide a method and a system for automated request testing that make test cases convenient to manage and use, ensure smooth execution of the tests of the business processes involved before the software goes online, and achieve the technical effect of saving testing time on the business-process interfaces before the software goes online.
In a first aspect, an embodiment of the present invention provides an automated request testing method, applied to a test management server, including: obtaining test cases to be tested; acquiring a test service flow model, and performing a test operation on each test case based on flow parameters carried in the test service flow model, wherein the flow parameters comprise an execution sequence and/or execution conditions of the test cases; in the process of performing the test operations on the test cases, comparing the test result data of each test case with its corresponding expected result data; and if the comparison succeeds, performing the test operation on the next test case according to the test service flow model.
Further, executing the test operation on each test case includes: acquiring a uniform resource locator in the test case; sending a test request and test parameters in the test case to a target server through the uniform resource locator; and obtaining test result data fed back by the target server based on the test parameters.
Further, performing the test operation on the next test case according to the test service flow model includes: judging, according to the jump information in the test case, whether the next test operation of the test service flow model is a jump operation test; if it is a jump operation test, executing the jump operation and, if the jump operation is executed successfully, performing the test operation of the next test case according to the execution sequence; if the jump operation fails, interrupting the test of the test service flow model and sending alarm information, wherein the alarm information comprises at least one of the following: a test failure record and the test case information of the failed test; and if it is not a jump operation test, performing the test operation of the next test case according to the execution sequence.
Further, the method further comprises: if the comparison between the test result data of each test case and the corresponding expected result data fails, interrupting the test of the test service flow model and sending alarm information, wherein the alarm information comprises at least one of the following: test failure records and test case information when the test fails.
Further, the method further comprises: storing, in a database of the test management server, the test record of each test case in the test service flow model, the test result data of each test case, and the comparison result between the test result data of each test case and its corresponding expected result data.
In a second aspect, an embodiment of the present invention further provides an automated request testing system, applied to a test management platform, including: an obtaining module, an execution module, a comparison module and a result processing module, wherein the obtaining module is configured to obtain test cases to be tested; the execution module is configured to acquire a test service flow model and perform a test operation on each test case based on flow parameters carried in the test service flow model, wherein the flow parameters comprise an execution sequence and/or execution conditions of the test cases; the comparison module is configured to compare the test result data of each test case with its corresponding expected result data in the process of performing the test operations on the test cases; and the result processing module is configured to, if the comparison succeeds, perform the test operation on the next test case according to the test service flow model.
Further, the execution module includes: the device comprises an acquisition unit, a sending unit and a receiving unit, wherein the acquisition unit is used for acquiring the uniform resource locator in the test case; the sending unit is used for sending the test request and the test parameters in the test case to a target server through the uniform resource locator; and the receiving unit is used for acquiring the test result data fed back by the target server based on the test parameters.
Further, if the comparison succeeds, the result processing module further includes a first result processing unit configured to: judge, according to the jump information in the test case, whether the next test operation of the test service flow model is a jump operation test; if it is a jump operation test, execute the jump operation and, if the jump operation is executed successfully, perform the test operation of the next test case according to the execution sequence; if the jump operation fails, interrupt the test of the test service flow model and send alarm information, wherein the alarm information comprises at least one of the following: a test failure record and the test case information of the failed test; and if it is not a jump operation test, perform the test operation of the next test case according to the execution sequence.
Further, the result processing module further includes a second result processing unit, configured to interrupt the test of the test service flow model and send alarm information if the comparison between the test result data of a test case and its corresponding expected result data fails, wherein the alarm information comprises at least one of the following: a test failure record and the test case information at the time of failure.
Further, the system further includes a storage module, configured to store the test record of each test case in the test service flow model, the test result data of each test case, and the comparison result between the test result data of each test case and the expected result data corresponding to the test case in the database of the test management server.
In the embodiments of the present invention, by adopting the above method, the test management server makes test case management convenient and easy to use, realizes fully automatic execution of the tests of the service flows involved before the software goes online, and achieves the technical effect of saving the testing time of the service flow interfaces of the software before it goes online.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flow chart of a method for automated request testing according to an embodiment of the present invention;
FIG. 2 is a flowchart of executing a test case according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating an automated request test system according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of another automated request testing system according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The first embodiment is as follows:
in accordance with an embodiment of the present invention, there is provided an embodiment of a method for automated request testing, it being noted that the steps illustrated in the flowchart of the figure may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than presented herein.
Fig. 1 is a flowchart of a method for automatically requesting a test according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
step S102, a test case to be tested is obtained.
The test case to be tested includes at least one of the following items of information: a number, a uniform resource locator (URL), test parameters, a request result matching rule, a request result data structure, jump information and a jump link, wherein the URL is the test request link, the request result data structure is a field-value list of the expected result data, and the request result matching rule refers to the rule for matching the expected result data against the test result data over the fields in that list.
It should be noted that the test case to be tested is entered and stored in the database through the test management server.
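For illustration only, the sketch below models such a test case record as a Python data class; the field names (url, params, match_rule, expected, jump_info, jump_link) are assumptions chosen to mirror the items listed above, not names defined by the invention.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TestCase:
    """One entry of the test case table stored by the test management server (assumed layout)."""
    number: int                                    # test case number
    url: str                                       # uniform resource locator: the test request link
    params: dict = field(default_factory=dict)     # test parameters sent with the request
    match_rule: str = "equal"                      # request result matching rule, e.g. "equal" or "regex"
    expected: dict = field(default_factory=dict)   # request result data structure: field -> expected value or pattern
    jump_info: bool = False                        # whether a jump operation test follows
    jump_link: Optional[str] = None                # target page of the jump operation, if any
```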
Step S104, obtaining a test service flow model, and executing test operation on each test case based on flow parameters carried in the test service flow model, wherein the flow parameters comprise an execution sequence and/or an execution condition of the test case.
The test case execution conditions, the test case execution sequence, the time schedule and the like in the test service flow model can be set on a service flow model configuration panel in the test management server, and a plurality of test service flow models can be created.
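Purely as an illustration, the flow parameters carried by one such test service flow model could be represented as follows; the key names (execution_order, conditions, schedule) are hypothetical and not prescribed by the invention.

```python
# A hypothetical flow-parameter record for one test service flow model
# configured on the service flow model configuration panel.
flow_model = {
    "name": "login-then-query",
    "execution_order": [1, 2],          # test case numbers, executed in this order
    "conditions": {                      # execution conditions per test case number
        2: "case 1 compared successfully and its jump operation succeeded",
    },
    "schedule": "2018-10-16T02:00:00",  # optional time schedule for the run
}
```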
And step S106, comparing the test result data of each test case with the corresponding expected result data in the process of executing the test operation on the test cases.
And step S108, if the comparison is successful, executing the test operation on the next test case according to the test service process model.
With the above method and system for automated request testing, test cases are managed conveniently and easily through the test management server, the tests of the service flows involved before the software goes online are executed fully automatically, and the technical effect of saving the testing time of the service flow interfaces before the software goes online is achieved.
Specifically, as shown in fig. 2, the process of executing the test operation on each test case in step S104 includes the following steps:
step S201, a Uniform Resource Locator (URL) in a test case is obtained, wherein the test case is one of the test cases executed according to a test sequence in the test service flow model.
Step S202, sending the test request and the test parameters in the test case to the target server through the URL.
Step S203, obtaining test result data fed back by the target server based on the test parameters. The test result data is obtained after the target server tests the test parameters under the condition of responding to the test request.
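A minimal sketch of steps S201 to S203, assuming the test parameters are sent as an HTTP POST form and that the target server feeds back JSON; both assumptions go beyond what is specified here, and the third-party requests library is used only for illustration.

```python
import requests

def execute_test_case(case: dict) -> dict:
    """Send the test request of one test case to the target server and
    return the test result data it feeds back (steps S201-S203)."""
    # Step S201: take the uniform resource locator out of the test case.
    url = case["url"]
    # Step S202: send the test request and the test parameters through the URL.
    response = requests.post(url, data=case.get("params", {}), timeout=30)
    response.raise_for_status()
    # Step S203: the test result data fed back by the target server.
    return response.json()
```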
Specifically, in step S108, executing a test operation on the next test case according to the test service flow model includes the following steps:
Step S1081, judging, according to the jump information in the test case, whether the next test operation of the test service flow model is a jump operation test;
Step S1082, if it is a jump operation test, executing the jump operation and, if the jump operation is executed successfully, executing the test operation of the next test case according to the execution sequence; if the jump operation fails, interrupting the test of the test service flow model and sending alarm information, wherein the alarm information comprises at least one of the following: a test failure record and the test case information of the failed test;
Step S1083, if it is not a jump operation test, executing the test operation of the next test case according to the execution sequence.
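The branching of steps S1081 to S1083 could look roughly like the following sketch, under the assumption that executing the jump operation means requesting the jump link and that an HTTP error counts as a failed jump; the helper names are illustrative only.

```python
import requests

class TestInterrupted(Exception):
    """Raised to interrupt the test of the test service flow model."""

def handle_next_operation(case: dict, send_alarm) -> None:
    """Steps S1081-S1083: decide from the jump information whether the next
    test operation is a jump operation test, and interrupt with an alarm if it fails."""
    if case.get("jump_info"):  # S1081: the next operation is a jump operation test
        try:
            # S1082: execute the jump operation by requesting the jump link.
            response = requests.get(case["jump_link"], timeout=30)
            response.raise_for_status()
        except requests.RequestException as exc:
            # Jump failed: interrupt the test of the model and send alarm information.
            send_alarm({"record": "test failure", "case": case.get("number")})
            raise TestInterrupted(f"jump operation of case {case.get('number')} failed") from exc
    # On success (S1082) or when no jump is configured (S1083), the caller
    # continues with the next test case according to the execution sequence.
```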
In an optional implementation manner of the embodiment of the present invention, the method for automatically requesting a test further includes the following steps:
if the comparison between the test result data of each test case and the corresponding expected result data fails, interrupting the test of the test service flow model and sending out alarm information, wherein the alarm information comprises at least one of the following: test failure records and test case information when the test fails.
In the embodiment of the present invention, the alarm information sent when a test fails provides timely feedback on the problematic service flow, and the test case information contained in the alarm information makes it possible to accurately locate the problem at the time of failure.
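For illustration, a sketch of the comparison and alarm steps, assuming only the two matching rules that appear in the worked example below (an 'equal' rule and a per-field regular-expression rule); a real implementation may support further rules and alarm channels.

```python
import re

def compare_result(result: dict, expected: dict, match_rule: str) -> bool:
    """Compare the test result data with the expected result data field by field."""
    if match_rule == "equal":
        return all(result.get(name) == value for name, value in expected.items())
    if match_rule == "regex":
        return all(re.fullmatch(pattern, str(result.get(name, "")))
                   for name, pattern in expected.items())
    raise ValueError(f"unknown request result matching rule: {match_rule}")

def send_alarm(info: dict) -> None:
    """Stand-in for the alarm channel; a real system would notify the relevant personnel."""
    print("ALARM:", info)
```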
It should be noted that, after the relevant personnel have repaired the found problem, the testing task of the testing business process model can be reinitiated as required.
In an embodiment of the present invention, the method further includes: storing, in a database of the test management server, the test record of each test case in the test service flow model, the test result data of each test case, and the comparison result between the test result data of each test case and its corresponding expected result data.
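A minimal sketch of this storage step, using Python's built-in sqlite3 module as a stand-in for the test management server's database; the table and column names are assumptions.

```python
import sqlite3

def save_test_record(db_path: str, model_name: str, case_number: int,
                     result_data: str, comparison_passed: bool) -> None:
    """Persist one test record, its result data and the comparison outcome."""
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS test_records (
                   model TEXT, case_number INTEGER,
                   result_data TEXT, comparison_passed INTEGER,
                   recorded_at TEXT DEFAULT CURRENT_TIMESTAMP)"""
        )
        conn.execute(
            "INSERT INTO test_records (model, case_number, result_data, comparison_passed) "
            "VALUES (?, ?, ?, ?)",
            (model_name, case_number, result_data, int(comparison_passed)),
        )
        conn.commit()
    finally:
        conn.close()
```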
The working process of the embodiment of the present invention is described below with reference to specific examples, for example:
two obtained test cases are respectively:
Test case 1:
Number: 1
Test request link: a .jsp page (i.e., the login interface)
Test parameters: root, password, ytest
Request result matching rule: equal
Request result data structure: type: String, value: success
Jump information: yes
Jump link: info.jsp
Test case 2:
Number: 2
Test request link: a .jsp page (i.e., the query page)
Test parameters: mobile phone number: 139, ID card number: 33010*
Request result matching rule: mobile phone number matches 139\d{8} and ID card number matches 33010\d{13}
Request result data structure: type: String, value: mobile phone number; type: String, value: ID card number; type: int, value: number of rows
Jump information: yes
Jump link: showdetail.jsp
In the obtained test service flow model, the execution sequence of the test cases is:
Number 1 -> Number 2;
The test cases are then executed in that order, starting with test case number 1:
Test case number 1 is executed; if the returned test result is not 'success', the test of the test service flow model is interrupted directly, alarm information is sent, and the relevant personnel are notified.
If the returned test result is 'success', a jump operation is executed according to the jump information, jumping to info.jsp.
If the jump operation succeeds, the test operation of the next test case is executed according to the execution sequence, that is, test case 2 is executed.
Test case number 2 is executed; the returned test result contains mobile phone numbers starting with 139 and ID card numbers starting with 33010, with 28 rows of data, and a jump operation is executed according to the jump information to showdetail.jsp; if the jump operation fails, the test of the test service flow model is interrupted directly and alarm information is sent to notify the relevant personnel.
After the test of the test service flow model has been fully executed, the records of the two test cases, the test result data, and other related information are stored in the database of the test management server.
As can be seen from the above description, through the above method steps the embodiment of the present invention makes test case management convenient and easy; when the service flow model to be tested is executed, the model can be customized as needed; and when a problem occurs during testing, it can be fed back and the relevant technical staff notified. The embodiment of the present invention brings the following beneficial effects:
(1) test case management is convenient, easy to use, and intuitive;
(2) creating a test service flow model links test cases together, and the creation is simple, easy to use, intuitive, and efficient;
(3) the test service flow model can be executed as needed; when a problem occurs, an alarm is fed back in real time and the relevant personnel are notified, so that they can respond immediately, quickly locate the problem, and handle it;
(4) the test management server can send execution requests to the target server multiple times as required according to the test service flow model, which greatly reduces the testing time of the service flow interfaces before they go online and improves efficiency.
Example two:
An embodiment of the present invention further provides an automated request testing system, which is mainly used to execute the automated request testing method provided in the foregoing content of the embodiments of the present invention; the system is described in detail below.
Fig. 3 is a schematic diagram of an automated request test system according to an embodiment of the present invention, and as shown in fig. 3, the automated request test system mainly includes: an acquisition module 10, an execution module 20, a comparison module 30 and a result processing module 40, wherein,
an obtaining module 10, configured to obtain a test case to be tested; wherein, the test case is the test case in the embodiment 1;
the execution module 20 is configured to obtain a test service flow model, and execute a test operation on each test case based on flow parameters carried in the test service flow model, where the flow parameters include an execution sequence and/or an execution condition of the test case;
the comparison module 30 is configured to compare the test result data of each test case with the expected result data corresponding to the test result data in the process of executing the test operation on the test case;
and the result processing module 40 is configured to, if the comparison is successful, execute a test operation on a next test case according to the test service flow model.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Fig. 4 is a schematic diagram of another automated request testing system according to an embodiment of the present invention, in which the execution module 20 includes: an obtaining unit 21, a sending unit 22 and a receiving unit 23, wherein,
an obtaining unit 21, configured to obtain a uniform resource locator URL in a test case;
a sending unit 22, configured to send a test request and test parameters in the test case to the target server through the URL;
and the receiving unit 23 is configured to obtain test result data fed back by the target server based on the test parameters.
Optionally, the result processing module 40 further comprises a first result processing unit 41 configured to:
if the test result data of each test case is successfully compared with the corresponding expected result data, judging whether the next test operation of the test service flow model is a skip operation test or not according to the skip information in the test case;
if the test is the jump operation test, executing the jump operation, and executing the test operation of the next test case according to the execution sequence under the condition that the execution of the jump operation is successful; if the execution of the skip operation fails, interrupting the test of the test service flow model and sending alarm information; wherein, the alarm information comprises at least one of the following: recording test failure and test case information of the test failure;
and if the test is not the jump operation test, executing the test operation of the next test case according to the execution sequence.
Optionally, the result processing module 40 further includes a second result processing unit 42, configured to interrupt the test of the test service flow model and send alarm information if the comparison between the test result data of a test case and its corresponding expected result data fails, where the alarm information includes at least one of the following: a test failure record and the test case information at the time of failure.
Optionally, the system further includes a storage module 50, configured to store the test record of each test case in the test service flow model, the test result data of each test case, and the comparison result between the test result data of each test case and the expected result data corresponding to the test case in a database of the test management server.
As can be seen from the above description, the embodiment of the present invention makes test case management convenient and easy through the above modules; when the service flow model to be tested is executed, the model can be customized as needed; and when a problem occurs during testing, it can be fed back and the relevant technicians notified, thereby achieving the technical effect of saving the testing time of the service flow interfaces before the software goes online.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted", "connected to" and "connected" are to be construed broadly; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection or an electrical connection; and it may be a direct connection, an indirect connection through an intermediate medium, or internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific circumstances.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that the above embodiments are only specific embodiments of the present invention, used to illustrate the technical solutions of the present invention rather than to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may, within the technical scope of the present disclosure, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions for some of their technical features; such modifications, changes or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention, and shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A method for automated request testing, applied to a test management server, comprising:
obtaining test cases to be tested;
obtaining a test business process model, and performing a test operation on each of the test cases based on process parameters carried in the test business process model, wherein the process parameters comprise an execution sequence and/or execution conditions of the test cases;
in the process of performing the test operations on the test cases, comparing the test result data of each test case with its corresponding expected result data; and
if the comparison succeeds, performing the test operation on the next test case according to the test business process model;
wherein performing the test operation on the next test case according to the test business process model comprises:
judging, according to jump information in the test case, whether the next test operation of the test business process model is a jump operation test;
if it is a jump operation test, executing the jump operation and, if the jump operation is executed successfully, performing the test operation of the next test case according to the execution sequence; if the jump operation fails, interrupting the test of the test business process model and sending alarm information, wherein the alarm information comprises at least one of the following: a test failure record and the test case information of the failed test; and
if it is not a jump operation test, performing the test operation of the next test case according to the execution sequence.
2. The method according to claim 1, wherein performing the test operation on each of the test cases comprises:
obtaining the uniform resource locator in the test case;
sending the test request and test parameters in the test case to a target server through the uniform resource locator; and
obtaining test result data fed back by the target server based on the test parameters.
3. The method according to claim 1, further comprising:
if the comparison between the test result data of each test case and its corresponding expected result data fails, interrupting the test of the test business process model and sending alarm information, wherein the alarm information comprises at least one of the following: a test failure record and the test case information at the time of failure.
4. The method according to claim 1, further comprising:
storing, in a database of the test management server, the test record of each test case in the test business process model, the test result data of each test case, and the comparison result between the test result data of each test case and its corresponding expected result data.
5. A system for automated request testing, applied to a test management platform, comprising: an obtaining module, an execution module, a comparison module and a result processing module, wherein
the obtaining module is configured to obtain test cases to be tested;
the execution module is configured to obtain a test business process model and perform a test operation on each of the test cases based on process parameters carried in the test business process model, wherein the process parameters comprise an execution sequence and/or execution conditions of the test cases;
the comparison module is configured to compare the test result data of each test case with its corresponding expected result data in the process of performing the test operations on the test cases; and
the result processing module is configured to, if the comparison succeeds, perform the test operation on the next test case according to the test business process model;
wherein, if the comparison succeeds, the result processing module further comprises a first result processing unit configured to:
judge, according to jump information in the test case, whether the next test operation of the test business process model is a jump operation test;
if it is a jump operation test, execute the jump operation and, if the jump operation is executed successfully, perform the test operation of the next test case according to the execution sequence; if the jump operation fails, interrupt the test of the test business process model and send alarm information, wherein the alarm information comprises at least one of the following: a test failure record and the test case information of the failed test; and
if it is not a jump operation test, perform the test operation of the next test case according to the execution sequence.
6. The system according to claim 5, wherein the execution module comprises: an obtaining unit, a sending unit and a receiving unit, wherein
the obtaining unit is configured to obtain the uniform resource locator in the test case;
the sending unit is configured to send the test request and test parameters in the test case to a target server through the uniform resource locator; and
the receiving unit is configured to obtain test result data fed back by the target server based on the test parameters.
7. The system according to claim 5, wherein the result processing module further comprises:
a second result processing unit, configured to interrupt the test of the test business process model and send alarm information if the comparison between the test result data of each test case and its corresponding expected result data fails, wherein the alarm information comprises at least one of the following: a test failure record and the test case information at the time of failure.
8. The system according to claim 5, further comprising:
a storage module, configured to store, in a database of the test management server, the test record of each test case in the test business process model, the test result data of each test case, and the comparison result between the test result data of each test case and its corresponding expected result data.
CN201811205854.8A 2018-10-16 2018-10-16 A method and system for automated request testing Active CN109165170B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811205854.8A CN109165170B (en) 2018-10-16 2018-10-16 A method and system for automated request testing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811205854.8A CN109165170B (en) 2018-10-16 2018-10-16 A method and system for automated request testing

Publications (2)

Publication Number Publication Date
CN109165170A CN109165170A (en) 2019-01-08
CN109165170B true CN109165170B (en) 2022-03-11

Family

ID=64878340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811205854.8A Active CN109165170B (en) 2018-10-16 2018-10-16 A method and system for automated request testing

Country Status (1)

Country Link
CN (1) CN109165170B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110287105A (en) * 2019-05-22 2019-09-27 平安普惠企业管理有限公司 Document version test method, device, equipment and readable storage medium storing program for executing
CN110413530B (en) * 2019-08-02 2024-01-05 中国工商银行股份有限公司 Behavior execution method and device
CN110888801A (en) * 2019-10-23 2020-03-17 贝壳技术有限公司 Software program testing method and device, storage medium and electronic equipment
CN111552634A (en) * 2020-03-30 2020-08-18 深圳壹账通智能科技有限公司 Method and device for testing front-end system and storage medium
CN112035363A (en) * 2020-09-01 2020-12-04 中国银行股份有限公司 Interface automatic testing method and device
CN112732564A (en) * 2020-12-30 2021-04-30 武汉海昌信息技术有限公司 Method and device for realizing process engine of business system
CN113448844B (en) * 2021-06-21 2022-10-25 青岛海尔科技有限公司 Method and device for regression testing and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103377127A (en) * 2012-04-28 2013-10-30 阿里巴巴集团控股有限公司 Development testing system, testing method and device for webpage product
CN104965790A (en) * 2015-07-17 2015-10-07 小米科技有限责任公司 Keyword-driven software testing method and system
CN106201891A (en) * 2016-07-19 2016-12-07 意昂神州(北京)科技有限公司 A kind of model automatization method of testing and device
CN106886494A (en) * 2017-03-07 2017-06-23 深圳国泰安教育技术股份有限公司 A kind of automatic interface testing method and its system
CN107741905A (en) * 2017-09-11 2018-02-27 珠海格力电器股份有限公司 Test case, analytic model thereof, execution method, storage medium and processor
CN108268373A (en) * 2017-09-21 2018-07-10 平安科技(深圳)有限公司 Automatic test cases management method, device, equipment and storage medium
CN108563564A (en) * 2018-04-02 2018-09-21 上海畅联智融通讯科技有限公司 terminal man-machine interface test method and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103246770B (en) * 2013-05-08 2015-10-14 南京大学 A kind of system action emulation mode of based upon activities graph model
US9727450B2 (en) * 2015-03-27 2017-08-08 Syntel, Inc. Model-based software application testing

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103377127A (en) * 2012-04-28 2013-10-30 阿里巴巴集团控股有限公司 Development testing system, testing method and device for webpage product
CN104965790A (en) * 2015-07-17 2015-10-07 小米科技有限责任公司 Keyword-driven software testing method and system
CN106201891A (en) * 2016-07-19 2016-12-07 意昂神州(北京)科技有限公司 A kind of model automatization method of testing and device
CN106886494A (en) * 2017-03-07 2017-06-23 深圳国泰安教育技术股份有限公司 A kind of automatic interface testing method and its system
CN107741905A (en) * 2017-09-11 2018-02-27 珠海格力电器股份有限公司 Test case, analytic model thereof, execution method, storage medium and processor
CN108268373A (en) * 2017-09-21 2018-07-10 平安科技(深圳)有限公司 Automatic test cases management method, device, equipment and storage medium
CN108563564A (en) * 2018-04-02 2018-09-21 上海畅联智融通讯科技有限公司 terminal man-machine interface test method and system

Also Published As

Publication number Publication date
CN109165170A (en) 2019-01-08

Similar Documents

Publication Publication Date Title
CN109165170B (en) A method and system for automated request testing
CN111399873B (en) Model updating method and device
US10184882B2 (en) System and method for providing user guidance for electronic device processing
CN110908909B (en) Automatic test method, device, storage medium and equipment
CN107896244B (en) Version file distribution method, client and server
CN112069073B (en) Test case management method, terminal and storage medium
US10552242B2 (en) Runtime failure detection and correction
CN111522738A (en) Test method and device of micro-service system, storage medium and electronic equipment
CN107451040A (en) Localization method, device and the computer-readable recording medium of failure cause
CN110088744A (en) A database maintenance method and system thereof
US11263072B2 (en) Recovery of application from error
CN112905437A (en) Method and device for testing case and storage medium
CN113538725A (en) Hardware product testing method and related equipment
CN109144801A (en) It is a kind of to be directed to the test method of MOC card, device and equipment in server
CN106815137A (en) Ui testing method and apparatus
CN111338869A (en) Configuration parameter management method, device, device and storage medium
CN114564381A (en) A method for production testing of IoT devices
CN107102938B (en) Test script updating method and device
CN109471646A (en) A method, device and storage medium for upgrading BMC version of a server
CN109508203B (en) Method, device and system for determining version consistency
CN113452533A (en) Charging self-inspection and self-healing method and device, computer equipment and storage medium
CN116662197A (en) Automatic interface testing method, system, computer and readable storage medium
CN117193798A (en) Application deployment method, apparatus, device, readable storage medium and program product
CN113610535B (en) Risk monitoring method and device suitable for consumption stage business process
CN114650211A (en) Fault repairing method, device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20190108

Assignee: Hangzhou Anheng Information Security Technology Co.,Ltd.

Assignor: Dbappsecurity Co.,Ltd.

Contract record no.: X2024980043367

Denomination of invention: A method and system for automated request testing

Granted publication date: 20220311

License type: Common License

Record date: 20241231

EE01 Entry into force of recordation of patent licensing contract