CN114064469A - Interface automation test method and storage medium - Google Patents

Interface automation test method and storage medium

Info

Publication number
CN114064469A
Authority
CN
China
Prior art keywords
test
data
interface
swagger
analyzing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111319150.5A
Other languages
Chinese (zh)
Inventor
罗建新
张荣荣
郑敏
张怀刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Zefu Software Co ltd
Original Assignee
Fujian Zefu Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian Zefu Software Co ltd filed Critical Fujian Zefu Software Co ltd
Priority to CN202111319150.5A priority Critical patent/CN114064469A/en
Publication of CN114064469A publication Critical patent/CN114064469A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3676Test management for coverage analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis

Abstract

An interface automation test method and a storage medium comprise the following steps: building an interface test framework, wherein the test framework is used for parsing a json description file of swagger and parsing postman scripts; storing collected test cases to form a test case library, wherein the test case library comprises test data, assertion rules and a mapping library; the tested project integrates swagger and provides http service to the outside, receives positive examples of test cases and carries out data completion, wherein the data completion generates negative examples from the test data according to the assertion rules, the generated negative examples are stored, and the positive examples or the negative examples are used for carrying out the interface automation test. According to the scheme, interface description parsing and script parsing can be performed in the swagger-based framework, a large number of negative examples are automatically generated while the service interface is provided externally, full coverage of the test data can be achieved, and efficiency is improved.

Description

Interface automation test method and storage medium
Technical Field
The invention relates to the field of software testing, in particular to an interface automation test method and a storage medium.
Background
With the continuous development of technology, requirements on software quality are increasingly high. Over the whole software life cycle, a piece of software needs to be tested extensively before going online, so as to improve its quality.
At present, interface testing is generally performed by manually writing interface test scripts. However, the number of interfaces is usually large and keeps growing; to achieve full coverage of the interface tests, the amount of interface test code a test engineer must write is often enormous, so the labor cost of the whole process is high and the testing efficiency is low.
Disclosure of Invention
Therefore, it is necessary to provide an interface automation test method that can automatically generate interface test cases, so as to address the above problems in the prior art.
In order to achieve the above object, the inventor provides an interface automation test method, which includes the following steps: an interface test framework is built, wherein the test framework is used for parsing the json description file of swagger and parsing postman scripts;
storing the collected test cases to form a test case library, wherein the test case library comprises test data, assertion rules and a mapping library (an illustrative sketch of such a case record follows these steps);
the tested project integrates swagger and provides http service to the outside, receives positive examples of test cases and carries out data completion, wherein the data completion generates negative examples from the test data according to the assertion rules, the generated negative examples are stored, and the positive examples or the negative examples are used for carrying out the interface automation test.
Specifically, the test data is data for generating a test case, and includes letters, special characters, boundary value data, and equivalence class data.
Specifically, the method further comprises, after the tested project integrates swagger and provides http service to the outside, the step of converting the test framework into a postman script.
Specifically, the method further comprises the steps of carrying out an interface test based on the test cases after data completion, sending a simulated request to an interface, parsing the interface response, and generating a test report based on allure.
An interface automation test storage medium stores a computer program which, when executed, performs steps comprising: an interface test framework is built, wherein the test framework is used for parsing the json description file of swagger and parsing postman scripts;
storing the collected test cases to form a test case library, wherein the test case library comprises test data, assertion rules and a mapping library;
the tested project integrates swagger and provides http service to the outside, receives positive examples of test cases and carries out data completion, wherein the data completion generates negative examples from the test data according to the assertion rules, the generated negative examples are stored, and the positive examples or the negative examples are used for carrying out the interface automation test.
Specifically, the test data is data for generating a test case, and includes letters, special characters, boundary value data, and equivalence class data.
Specifically, when being run, the computer program further executes, after the tested project integrates swagger and provides http service to the outside, the step of converting the test framework into a postman script.
Further, when being run, the computer program also executes the steps of carrying out an interface test based on the test cases after data completion, parsing the interface response after sending a simulated request to the interface, and generating a test report based on allure.
Through the scheme, interface description parsing and script parsing can be carried out in the swagger-based framework, a large number of negative examples are automatically generated while the service interface is provided externally, full coverage of the test data can be achieved, the difficulty of manually writing code is reduced, and efficiency is improved.
Drawings
FIG. 1 is a flow chart of a method for automated testing of an interface according to an embodiment;
FIG. 2 is a schematic diagram of an interface automation test storage medium according to an embodiment.
Detailed Description
To explain technical contents, structural features, and objects and effects of the technical solutions in detail, the following detailed description is given with reference to the accompanying drawings in conjunction with the embodiments.
In the embodiment shown in FIG. 1, an interface automation test method includes the following steps: S1, building an interface test framework, wherein the test framework is used for parsing the json description file of swagger and parsing postman scripts;
S2, storing the collected test cases to form a test case library, wherein the test case library comprises test data, assertion rules and a mapping library;
S3, the tested project integrates swagger and provides http service to the outside, receives positive examples of test cases and carries out data completion, wherein the data completion generates negative examples from the test data according to the assertion rules, the generated negative examples are stored, and the positive examples or the negative examples are used for carrying out the interface automation test.
In a specific embodiment, negative examples can be generated automatically by negating the positive examples of the test cases through the assertion rules. Negative example generation is mainly based on boundary values, equivalence class conversion and special-character test case rules. Taking boundary values as an example: when an input field is a name and has a boundary value, for example 10 characters (the assertion rule states that a name cannot exceed 10 characters), a name longer than 10 characters can be generated automatically as a negative example according to the known assertion rule. Likewise, for special characters (the assertion rule states that a name cannot contain a special symbol such as @), the cases obtained automatically by taking test data that falls outside the assertion rule for the positive example are collectively called negative examples, derived from the positive example and the assertion rule. The positive examples can be written manually by an engineer or taken from an existing database. Therefore, based on this scheme, interface description parsing and script parsing can be performed in the swagger-based framework, a large number of negative examples can be generated automatically while the service interface is provided externally, full coverage of the test data can be achieved, the difficulty of manually writing code is reduced, and efficiency is improved.
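A minimal sketch of such rule-based negative example generation is shown below. The rule keys (max_length, forbid_special) and the helper name generate_negatives are assumptions made for this sketch; the application does not fix a concrete rule schema.

import random
import string

SPECIAL_CHARS = "@#$%&*"  # symbols treated as forbidden special characters in this sketch

def generate_negatives(field_name: str, positive_value: str, rule: dict) -> list:
    negatives = []
    max_len = rule.get("max_length")
    if max_len is not None:
        # Boundary-value violation: one character beyond the allowed length.
        too_long = "".join(random.choices(string.ascii_letters, k=max_len + 1))
        negatives.append({field_name: too_long})
    if rule.get("forbid_special"):
        # Special-character violation: append a forbidden symbol such as "@".
        negatives.append({field_name: positive_value + random.choice(SPECIAL_CHARS)})
    return negatives

# Example: the rule "a name cannot exceed 10 characters or contain special symbols"
# yields an 11-character name and a name containing a special symbol as negative examples.
print(generate_negatives("name", "Zhang San", {"max_length": 10, "forbid_special": True}))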
In some specific embodiments, the test data is data for generating test cases and includes letters, special characters, boundary value data, and equivalence class data. Through this design of data types, the types of test cases are made more complete, and the effect of automatically generating both positive-example and negative-example test cases can be achieved.
In other specific embodiments, after the tested project integrates swagger and provides http service to the outside, the step of converting the test framework into a postman script is further included. Swagger is a standard and complete framework used for generating, describing, calling and visualizing RESTful-style Web services; the interface test framework may also be built on a pytest + allure + aiohttp combination. Using swagger avoids the workload of traversing and analyzing interfaces by hand and the risk of introducing new bugs. Through this scheme, the stability of the interface test can be effectively improved.
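As a minimal sketch of reading interface definitions from the swagger json description file (assuming the common swagger/OpenAPI 2.0 layout with a top-level "paths" object; the file name swagger.json is a placeholder):

import json

HTTP_METHODS = {"get", "post", "put", "delete", "patch", "head", "options"}

def load_interfaces(path: str = "swagger.json") -> list:
    # Enumerate the interfaces described in the swagger json file so that
    # test case skeletons can be generated for each of them.
    with open(path, encoding="utf-8") as f:
        spec = json.load(f)
    interfaces = []
    for url, methods in spec.get("paths", {}).items():
        for method, detail in methods.items():
            if method.lower() not in HTTP_METHODS:
                continue  # skip path-level keys such as "parameters"
            interfaces.append({
                "url": url,
                "method": method.upper(),
                "parameters": detail.get("parameters", []),  # seeds for test data
            })
    return interfaces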
In some other specific embodiments, the method further includes the steps of performing an interface test based on the test cases after data completion, sending a simulated request to the interface, parsing the interface response, and generating a test report based on allure. With this scheme, the interface test can be performed with test cases of wider coverage, and the test results are visualized by generating an allure test report, which provides a basis for manual comparison. This scheme improves the practicability of the invention.
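A minimal sketch of this step, assuming the pytest + allure + aiohttp combination mentioned above (the service address, endpoint, payload and expected status are hypothetical values for illustration):

import asyncio
import aiohttp
import allure

BASE_URL = "http://localhost:8080"  # placeholder address of the tested project

async def call_interface(method: str, url: str, payload: dict) -> tuple:
    # Send the simulated request and return the status code and parsed body.
    async with aiohttp.ClientSession() as session:
        async with session.request(method, BASE_URL + url, json=payload) as resp:
            return resp.status, await resp.json(content_type=None)

@allure.title("negative example: a name longer than 10 characters should be rejected")
def test_name_too_long():
    status, body = asyncio.run(call_interface("POST", "/user/create",
                                              {"name": "ABCDEFGHIJK"}))
    allure.attach(str(body), name="interface response")
    # A negative example is expected to be rejected by the interface under test.
    assert status != 200

Running such cases with pytest --alluredir=./report and then allure serve ./report would render the visual test report mentioned above.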
In the embodiment shown in FIG. 2, an interface automation test storage medium 2 stores a computer program which, when executed, performs steps comprising: an interface test framework is built, wherein the test framework is used for parsing the json description file of swagger and parsing postman scripts;
storing the collected test cases to form a test case library, wherein the test case library comprises test data, assertion rules and a mapping library;
the tested project integrates swagger and provides http service to the outside, receives positive examples of test cases and carries out data completion, wherein the data completion generates negative examples from the test data according to the assertion rules, the generated negative examples are stored, and the positive examples or the negative examples are used for carrying out the interface automation test.
Specifically, the test data is data for generating a test case, and includes letters, special characters, boundary value data, and equivalence class data.
Specifically, when being run, the computer program further executes, after the tested project integrates swagger and provides http service to the outside, the step of converting the test framework into a postman script.
Further, when being run, the computer program also executes the steps of carrying out an interface test based on the test cases after data completion, parsing the interface response after sending a simulated request to the interface, and generating a test report based on allure.
It should be noted that, although the above embodiments have been described herein, the invention is not limited thereto. Therefore, based on the innovative concept of the present invention, changes and modifications to the embodiments described herein, or equivalent structures or equivalent process variations made using the content of the present specification and the accompanying drawings, whether applied directly or indirectly to other related technical fields, all fall within the scope of protection of the present invention.

Claims (8)

1. An interface automation test method, characterized by comprising the following steps: an interface test framework is built, wherein the test framework is used for parsing the json description file of swagger and parsing postman scripts;
storing the collected test cases to form a test case library, wherein the test case library comprises test data, assertion rules and a mapping library;
the tested project integrates swagger and provides http service to the outside, receives positive examples of test cases and carries out data completion, wherein the data completion generates negative examples from the test data according to the assertion rules, the generated negative examples are stored, and the positive examples or the negative examples are used for carrying out the interface automation test.
2. The interface automation test method according to claim 1, wherein the test data is data for generating test cases, and includes letters, special characters, boundary value data, and equivalence class data.
3. The interface automation test method according to claim 1, further comprising, after the tested project integrates swagger and provides http service to the outside, the step of converting the test framework into a postman script.
4. The interface automation test method according to claim 1, further comprising the steps of performing an interface test based on the test cases after data completion, sending a simulated request to the interface, parsing the interface response, and generating a test report based on allure.
5. An interface automation test storage medium storing a computer program which, when executed, performs steps comprising: an interface test framework is built, wherein the test framework is used for parsing the json description file of swagger and parsing postman scripts;
storing the collected test cases to form a test case library, wherein the test case library comprises test data, assertion rules and a mapping library;
the tested project integrates swagger and provides http service to the outside, receives positive examples of test cases and carries out data completion, wherein the data completion generates negative examples from the test data according to the assertion rules, the generated negative examples are stored, and the positive examples or the negative examples are used for carrying out the interface automation test.
6. The interface automation test storage medium of claim 5 wherein the test data is data for generating test cases, including letters, special characters, boundary value data, and equivalence class data.
7. The interface automation test storage medium of claim 5, wherein, when executed, the computer program further performs, after the tested project integrates swagger and provides http service to the outside, the step of converting the test framework into a postman script.
8. The interface automation test storage medium of claim 5, wherein, when executed, the computer program further performs the steps of performing an interface test based on the test cases after data completion, parsing the interface response after sending a simulated request to the interface, and generating a test report based on allure.
CN202111319150.5A 2021-11-09 2021-11-09 Interface automation test method and storage medium Pending CN114064469A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111319150.5A CN114064469A (en) 2021-11-09 2021-11-09 Interface automation test method and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111319150.5A CN114064469A (en) 2021-11-09 2021-11-09 Interface automation test method and storage medium

Publications (1)

Publication Number Publication Date
CN114064469A true CN114064469A (en) 2022-02-18

Family

ID=80273728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111319150.5A Pending CN114064469A (en) 2021-11-09 2021-11-09 Interface automation test method and storage medium

Country Status (1)

Country Link
CN (1) CN114064469A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115686463A (en) * 2022-12-30 2023-02-03 科讯嘉联信息技术有限公司 Python and Excel-based interface automation framework control system

Similar Documents

Publication Publication Date Title
CN107844424B (en) Model-based testing system and method
CN110716870B (en) Automatic service testing method and device
CN110764753A (en) Business logic code generation method, device, equipment and storage medium
CN108345532A (en) A kind of automatic test cases generation method and device
US20120150820A1 (en) System and method for testing data at a data warehouse
CN101996131A (en) Automatic test method and automatic test platform for graphic user interface (GUI) based on x extensive makeup language (XML) packaging key word
CN104239219A (en) Software defect positioning technology on-line evaluating and experimenting platform and method based on coverage
CN104657274A (en) Method and device for testing software interface
CN112181854A (en) Method, device, equipment and storage medium for generating flow automation script
CN106484613A (en) A kind of interface automated test frame based on fitnese
CN114064469A (en) Interface automation test method and storage medium
CN112732237B (en) Method and system for constructing code-free development technology model
CN114003451A (en) Interface test method, device, system and medium
CN111176995B (en) Test method and test system based on big data test case
Villalobos-Arias et al. Evaluation of a model‐based testing platform for Java applications
CN116599864A (en) Communication test system, method, equipment and storage medium for automatic interface generation and use case modular development
WO2023160402A1 (en) Data modeling method and apparatus, and device and storage medium
CN112667517A (en) Method, device, equipment and storage medium for acquiring automatic test script
CN110286882B (en) Foreground system design and verification method based on model detection
CN115469849B (en) Service processing system, method, electronic equipment and storage medium
CN116383061A (en) Method and related device for testing basic platform interface of substation control system
CN115495362A (en) Method, device, storage medium and computer equipment for generating test code
CN113672509A (en) Automatic testing method, device, testing platform and storage medium
CN113961451A (en) Automatic test system for software development tasks of working platform
CN111078543B (en) System dynamic test method and test device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination