CN117370217A - Automatic interface test result generation method based on python - Google Patents

Automatic interface test result generation method based on python

Info

Publication number
CN117370217A
CN117370217A
Authority
CN
China
Prior art keywords
test
function
response
interface
class
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311676046.0A
Other languages
Chinese (zh)
Other versions
CN117370217B (en)
Inventor
王振洁
庞志斌
刘斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Hualai Technology Co Ltd
Original Assignee
Tianjin Hualai Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Hualai Technology Co Ltd filed Critical Tianjin Hualai Technology Co Ltd
Priority to CN202311676046.0A priority Critical patent/CN117370217B/en
Publication of CN117370217A publication Critical patent/CN117370217A/en
Application granted
Publication of CN117370217B publication Critical patent/CN117370217B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3692 Test management for test results analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/362 Software debugging
    • G06F 11/3644 Software debugging by instrumenting at runtime
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3696 Methods or tools to render software testable
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention provides a python-based method for automatically generating interface test results, which comprises the following steps: constructing an interface automation test framework, and performing large-scale interface testing; reading test case data in the interface automation test framework, and performing parameter variable replacement; defining a request sending class and a response processing class, and performing the request sending and response checking operations respectively; writing the test results into a test report, and sending the test report by mail. The beneficial effects of the invention are as follows: the automated test framework eliminates human error and executes every test to the same standard, which improves test accuracy; it also reduces the demand for manual testing resources and lowers cost.

Description

Automatic interface test result generation method based on python
Technical Field
The invention belongs to the technical field of computers, and particularly relates to an automatic interface test result generation method based on python.
Background
Interface testing plays an important role in software testing. It focuses on testing the interaction among different components, modules or systems, and ensures that these components and modules work together correctly. Interface testing can discover and resolve integration problems early, preventing errors from spreading into the whole software system; it helps verify the functions and data transmission of each component, improves the maintainability of the system, and reduces later repair costs. Because it does not depend on a front-end page, interface testing can discover many bugs that cannot be found by operating on a page and can cover many scenarios that cannot be tested at the front end. By discovering and resolving bugs early, interface testing shortens the time needed to bring the whole project online.
Existing tool software is mainly used for debugging the connectivity of interfaces during the development stage and is mainly suited to manual testers in companies with a small number of interfaces; it cannot run interface tests over a whole project or save the test results. Although complex scripts can be written in such tools to process data and handle assertions in complex scenarios, this is difficult for people without programming experience. Moreover, when simulating large-scale concurrent users, these tools occupy a large amount of resources, including memory and CPU, which burdens the development machines and affects the operation of other tasks. The interface test tools commonly used in the industry therefore cannot meet the requirement of testing a whole project and saving the results.
Disclosure of Invention
In view of the above, the present invention aims to provide a python-based method for automatically generating interface test results, so as to at least partially solve the above technical problems.
In order to achieve the above purpose, the technical scheme of the invention is realized as follows:
an automatic generation method of interface test results based on python comprises the following steps:
constructing an interface automation test framework, and performing large-scale interface testing;
reading test case data in the interface automation test framework, and performing parameter variable replacement;
defining a request sending class and a response processing class, and respectively performing request sending and response checking operations;
writing the test result into a test report, and sending the test report through mail.
Further, the interface automation test framework is provided with a multi-level directory, and the functions of the levels of the directory comprise:
storing the start-up file and the multithreading file, and driving the framework to run; storing the test cases; storing configuration files related to string processing and data reading; parsing data and sending requests; storing log files; and storing the test reports.
Further, the process of reading test case data in the interface automation test framework and performing parameter variable replacement operation includes:
reading the test case data from the file path of a specified file;
opening the designated file, and traversing each row of test data starting from the second row of the file;
extracting the data in the 2nd to 7th columns of each row as the test case information;
transmitting the request parameters and the request header to the parameter replacement function, and executing the parameter variable replacement operation;
the parameter replacement function traverses the predefined functions to replace the function name portion in the character string.
Further, the process in which the parameter replacement function traverses the predefined functions and replaces the function name part in the character string comprises the following steps:
the parameter replacement function traverses a predefined function mapping, in which function names and the corresponding function objects are stored;
whether the input character string contains a function name from the function mapping is detected; if so, the corresponding function object is taken out and executed;
and the function name part in the original character string is replaced with the function execution result.
Further, defining a request sending class and a response processing class, and respectively performing request sending and response checking operations, wherein the process of defining the request sending class and performing the request sending operation includes:
according to the parameter types in the test cases, dynamically selecting corresponding parameters to send requests;
after the request is sent successfully, the response result is converted into Json format and character string format, and the two are stored separately.
Further, defining a request sending class and a response processing class, and respectively performing request sending and response checking operations, wherein defining the response processing class and performing the response checking operations includes:
defining a response parsing class, and defining a constructor in the response parsing class, the constructor receiving two parameters: a check rule and a response result;
wherein the check rule is in character string format, and the response result is in Json format;
in the constructor, the check rule and the response result are saved into the attributes of the object, and a core function is defined to execute the check logic.
Further, the process of defining the core function to perform the check logic includes:
splitting a plurality of verification conditions according to a verification rule, wherein each verification condition comprises a key name, an operator and an expected value;
acquiring an actual value from a response result according to the key name, and forming a comparison expression by the actual value, the expected value and the operator;
executing the comparison expression to judge whether the condition is met, checking each check condition in a loop and recording the check results;
and after the check is finished, judging whether the whole check passes according to the results of the check conditions.
Further, the process of writing the test result into the test report and sending the test report by mail includes:
reading files under a designated directory, and obtaining a path list of all test case files meeting the conditions;
reading test case data in the test case file, sending a request to acquire interface response, and checking the response;
updating the total case number and the passing case number, writing the test results into the test report, running each test case concurrently through an independent thread, and waiting for all threads to finish;
and sending a report mail after the execution is finished.
Compared with the prior art, the automatic generation method of the interface test result based on python has the following beneficial effects:
new test cases can be added easily, adapting to the continuously changing requirements of the system;
a large number of test cases can be performed in a short time;
the test result can be clearly and accurately displayed and sent to related personnel;
the automatic test framework can eliminate human errors, each test is executed according to the same standard, and the accuracy of the test is improved;
the automatic test framework reduces the requirement of manual test resources and reduces the cost.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention. In the drawings:
fig. 1 is a flow chart of a python-based method for automatically generating an interface test result according to an embodiment of the present invention.
Detailed Description
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other.
The invention will be described in detail below with reference to the drawings in connection with embodiments.
An automatic generation method of interface test results based on python comprises the following steps:
s1, constructing an automatic interface testing frame, and performing large-scale interface testing;
s2, reading test case data in an interface automation test frame, and performing parameter variable replacement operation;
s3, defining a request sending class and a response processing class, and respectively performing request sending and response checking operations;
and S4, writing the test result into a test report, and sending the test report through a mail.
The interface automation test framework in step S1 is provided with a multi-level directory, whose structure is: bin\cases\config\lib\logs\report;
the functions of the levels in the multi-level directory structure are as follows:
the bin directory stores the start-up file and the multithreading-related code, and is used for driving the framework to run;
the cases directory stores the test cases, which is convenient for management and maintenance;
the config directory stores the configuration files, including the function configuration related to string processing, data reading and the like;
the lib directory holds the core function modules of the framework, which parse data and send requests;
the logs directory is used for storing log files, which is convenient for locating and debugging problems;
the report directory is used for storing the test reports.
The multi-level directory structure divides the directories clearly by function, which makes the code and resources easy to manage and maintain; logs and reports are stored independently, which is convenient for backtracking and tracing; and the core functions and the configuration are relatively isolated, which protects each function and file.
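For illustration only, the multi-level directory can be pictured as the following tree; the root name is hypothetical, while the six subdirectories follow the bin\cases\config\lib\logs\report structure described above:

interface_test/                (hypothetical project root)
    bin/        start-up file and multithreading code that drive the framework
    cases/      Excel test case files
    config/     configuration for string processing and data reading
    lib/        core modules that parse data and send requests
    logs/       log files for locating and debugging problems
    report/     generated test reports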
In step S2, the process of reading the test case data in the interface automation test framework and performing the parameter variable replacement operation includes:
s21, defining a read_excel function, and reading test case data from a specified Excel file path;
s22, opening an Excel file, and traversing each row of test data from the second row;
s23, extracting data from the 2 nd column to the 7 th column of each row as test case information;
s24, transmitting the request parameters and the request header (the 3 rd column and the 4 th column) to a replace_param function for parameterization replacement;
s25, traversing a predefined function by the replace_param function, replacing the function name in the character string as an actual value, and realizing parameterized replacement;
s26, adding the replaced whole line of test data into a case_list list;
the Excel reading is packaged in a class, parameterization is realized through replace_param, the data are processed uniformly, and a data list in a standard format is returned;
the case_list serves as the delivery and transfer carrier of the test case data and stores all the read and processed test case data; specifically, each row of test case data read from the Excel file is added to the case_list uniformly after parameterization;
the data in the case_list are in a standardized dictionary form, so the various items can be taken out directly later, and in the subsequent CaseRun class the test cases are executed according to the data in the case_list;
and after execution is completed, the state of each test case in the case_list is updated, and the final report generation is also based on the test result data in the case_list.
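As a minimal illustrative sketch of steps S21 to S26 (not the patented code itself), the reading step might look as follows. The openpyxl library and the exact field order within columns 2 to 7 are assumptions, since the disclosure only fixes that columns 2 to 7 hold the case information and that columns 3 and 4 hold the request parameters and the request header; the replace_param function is sketched after step S253 below.

import openpyxl

class ParamDeal:
    def __init__(self, file_path):
        self.file_path = file_path

    def read_excel(self):
        # S21/S22: open the specified Excel file and traverse the rows of
        # test data starting from the second row
        case_list = []
        sheet = openpyxl.load_workbook(self.file_path).active
        for row in sheet.iter_rows(min_row=2, values_only=True):
            name = row[0]  # column 1: project name, kept for reporting
            # S23: columns 2 to 7 are taken as the test case information
            # (the field order here is an assumption)
            url, params, headers, method, is_json, expected = row[1:7]
            # S24/S25: parameterized replacement on columns 3 and 4
            params = replace_param(str(params))
            headers = replace_param(str(headers))
            # S26: append the whole processed row in dictionary form
            case_list.append({"name": name, "url": url, "method": method,
                              "params": params, "headers": headers,
                              "is_json": is_json, "expected": expected})
        return case_list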
In step S25, the process in which the replace_param function traverses the predefined functions, replaces each function name in the character string with its actual value, and realizes the parameterized replacement comprises:
S251, the replace_param function traverses a predefined func_map function mapping, in which function names and the corresponding function objects are stored;
S252, whether the input character string contains a function name from func_map is detected; if so, the corresponding function object is taken out and executed; if not, the current input character string needs no processing;
S253, the function name part in the original character string is replaced with the function execution result, thereby realizing dynamic acquisition and replacement of the parameter value.
In step S3, a request sending class and a response processing class are defined, and request sending and response checking operations are performed respectively, where the process of defining the request sending class and performing the request sending operation includes:
s31, judging whether test data in the test case is Json format data, dynamically selecting to use params or Json parameters to send requests so as to adapt to request formats of different interfaces, and supporting both form requests and Json requests;
and S32, after the request is successfully sent, converting a Response result Response into a Json object and storing the Json object in self.
The effect of storing Response as two versions of Json object and string is:
the Response in the Json object format is convenient for directly taking the field for verification, and the Response in the character string format is convenient for outputting when the failure reason is recorded;
the design decouples the format of the request parameters, stores the response results of the two formats at the same time, is convenient for the output of the follow-up check sum failure information, and is a complete function for transmitting the request type.
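Steps S31 and S32 could be sketched roughly as below; the requests library is an assumption, since the disclosure does not name the HTTP client, and the method string is taken from the test case data.

import requests

class MyRequest:
    def send(self, method, url, data, headers, is_json):
        # S31: dynamically pick the json or params keyword according to
        # the case type, supporting both form and Json requests
        if is_json:
            response = requests.request(method, url, json=data, headers=headers)
        else:
            response = requests.request(method, url, params=data, headers=headers)
        # S32: keep the response in two versions, a Json object in
        # self.result for field checks and a plain string in self.text
        # for recording failure reasons
        self.result = response.json()
        self.text = response.text
        return self.result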
In step S3, a request sending class and a response processing class are defined, and request sending and response checking operations are performed respectively, where the process of defining the response processing class and performing the response checking operation includes:
s33, a custom ParseResponse class is used for checking the interface response result;
s34, defining a constructor __ init __ in the ParseResponse class, __ init __ receives two parameters:
check_str: a verification rule in the form of a character string;
response: the interface calls the returned response result (json format);
s35, in a __ init __ function, storing two parameters of check_str and response into the attribute of the object, and preparing for subsequent verification; and saves the check rule and the response result in the attribute of the object in the __ init __ function, and defines the core function check_response () to execute the check logic.
The process of defining the core function check_response () to execute the check logic in step S35 includes:
s351, splitting each check condition according to a rule character string check_str, wherein each condition comprises a key name, an operator and an expected value;
s352, acquiring an actual value from the response according to the key name, and forming a comparison expression by the actual value, the expected value and the operator;
s353, executing a comparison expression by using the eval function, and judging whether a condition is satisfied;
s354, circularly checking each condition by a check_response function, recording a checking result, and judging whether the whole passes or not according to the individual condition result after the checking is finished;
the check_response () function can be called to perform verification, thus realizing automatic verification of a response result and supporting dynamic configuration of verification logic through a rule string.
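The check logic of S33 to S354 might be sketched as follows; the rule syntax (';' between conditions, spaces inside a condition, expected values written as python literals) is an assumption, since the disclosure only fixes that each condition holds a key name, an operator and an expected value and that eval executes the comparison.

class ParseResponse:
    def __init__(self, check_str, response):
        # S34/S35: save the rule string and the Json response result
        # into the attributes of the object
        self.check_str = check_str
        self.response = response

    def check_response(self):
        results = []
        # S351: split the rule string into individual check conditions
        for condition in self.check_str.split(";"):
            key, operator, expected = condition.strip().split(" ", 2)
            # S352: take the actual value out of the response by key name
            actual = self.response.get(key)
            # S353: build the comparison expression and execute it with eval
            results.append(eval("%r %s %s" % (actual, operator, expected)))
        # S354: the whole check passes only if every condition held
        return all(results)

For instance, ParseResponse("code == 200; msg == 'success'", {"code": 200, "msg": "success"}).check_response() returns True.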
In step S4, the process of writing the test result into the test report and sending the test report through mail includes:
s41, defining a CaseRun class for acquiring, executing, counting and reporting mail sending work of test cases;
s42, defining a get_case function, and obtaining a test case file path list meeting all conditions (Excel) by reading files under a specified directory;
s43, creating ParamDeal examples, which are used for reading test case data in an Excel file, sending a request by using a MyRequest example, acquiring interface response, and checking the response by using a ParseResponse example;
s44, updating the total case number and the passing case number, storing the result into a response_list, writing the test result into an Excel report, and adding the report name into the report_list;
s45, each use case runs through independent threads concurrently, and waits for all threads to finish; after the execution is finished, the report mail is sent through the send_mail function.
The working process comprises the following steps:
writing the interface automation test cases, each of which comprises: the project name, request URL, request mode, request parameters (in variable form), header, whether the input parameters are Json, and the expected result;
defining a ParamDeal class, which is mainly used for processing the parameters and the reading of the interface test cases and for judging whether the test cases are read successfully; a read_excel function is defined to read the test case data from the specified Excel file, convert them into a list and return them; the read_excel function calls the replace_param and str_to_dict functions, where replace_param replaces the parameterized marks in a character string, obtaining the actual parameter values by calling the predefined functions and substituting them for the marks, and str_to_dict converts the character string form of the request parameters into a dictionary (see the helper sketches after this working process);
defining a MyRequest class for sending HTTP requests: the params or json parameter is first selected according to the is_json parameter to send the request, then the Response is converted into a Json object and stored in self.result, and converted into a character string and stored in self.text, which is convenient for recording failure causes;
checking the correctness of the result through ParseResponse: whether the specified conditions are met is checked; if a condition is not met, the failure reason and state are recorded; the successful test cases, the failed test cases and the failure reasons are counted, and the results are written back into the test case file;
after all the tests are finished, a test report is generated and sent to the testers and the developers by mail.
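Two small helper sketches round out the working process above: str_to_dict is assumed to use json.loads, since the disclosure only states that the string form of the request parameters is converted into a dictionary, and the SMTP host and mail addresses are hypothetical placeholders.

import json
import smtplib
from email.mime.text import MIMEText

def str_to_dict(text):
    # convert the string form of the request parameters into a dictionary;
    # json.loads is an assumption about the string format
    return json.loads(text) if text else {}

def send_mail(body):
    # send the report mail to the testers and developers; the host and
    # the addresses are hypothetical
    message = MIMEText(body)
    message["Subject"] = "Interface test report"
    message["From"] = "tester@example.com"
    message["To"] = "devteam@example.com"
    with smtplib.SMTP("smtp.example.com") as server:
        server.send_message(message)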
Those of ordinary skill in the art will appreciate that the elements and method steps of each example described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the elements and steps of each example have been described generally in terms of functionality in the foregoing description to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in this application, it should be understood that the disclosed methods and systems may be implemented in other ways. For example, the above-described division of units is merely a logical function division, and there may be another division manner when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted or not performed. The units may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment of the present invention.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention, and are intended to be included within the scope of the appended claims and description.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (8)

1. A python-based automatic interface test result generation method, characterized by comprising the following steps:
constructing an interface automation test framework, and performing large-scale interface testing;
reading test case data in the interface automation test framework, and replacing parameter variables;
defining a request sending class and a response processing class, and respectively performing request sending and response checking operations;
writing the test result into a test report, and sending the test report through mail.
2. The python-based automatic interface test result generation method according to claim 1, wherein:
the interface automation test framework is provided with a multi-level directory, and the functions of the levels in the multi-level directory are respectively:
storing the start-up file and the multithreading file, and driving the framework to run;
storing the test cases;
storing configuration files related to string processing and data reading;
parsing data and sending requests;
storing log files;
and storing the test reports.
3. The python-based automatic interface test result generation method according to claim 1, wherein:
the process of reading the test case data in the interface automation test framework and performing the parameter variable replacement operation comprises the following steps:
reading the test case data from the file path of a specified file;
opening the designated file, and traversing each row of test data starting from the second row of the file;
extracting the data in the 2nd to 7th columns of each row as the test case information;
transmitting the request parameters and the request header to the parameter replacement function, and executing the parameter variable replacement operation;
the parameter replacement function traverses the predefined functions to replace the function name portion in the character string.
4. A python-based automatic interface test result generation method according to claim 3, wherein:
the parameter replacement function traverses the predefined function, and the process of replacing the function name part in the character string comprises the following steps:
traversing a predefined function mapping function by a parameter replacement function, and storing function names and corresponding function objects in the function mapping function;
detecting whether the input character string contains the function name in the function mapping function, if so, taking out the corresponding function object and executing the function;
and replacing the function name part in the original character string with the function execution result.
5. The python-based automatic interface test result generation method according to claim 1, wherein:
defining a request sending class and a response processing class, and respectively performing request sending and response checking operations, wherein the process of defining the request sending class and performing the request sending operation comprises the following steps:
according to the parameter types in the test cases, dynamically selecting corresponding parameters to send requests;
after the request is sent successfully, the response result is converted into Json format and character string format, and the two are stored separately.
6. The python-based automatic interface test result generation method according to claim 5, wherein:
defining a request sending class and a response processing class, and respectively performing request sending and response checking operations, wherein the process of defining the response processing class and performing the response checking operation comprises the following steps:
a response parsing class is defined, a constructor is defined in the response parsing class, and the constructor receives two parameters: a check rule and a response result;
wherein the check rule is in character string format, and the response result is in Json format;
in the constructor, the check rule and the response result are saved into the attributes of the object, and a core function is defined to execute the check logic.
7. The python-based automatic interface test result generation method according to claim 6, wherein:
the process of defining the core function to perform the check logic includes:
splitting a plurality of verification conditions according to a verification rule, wherein each verification condition comprises a key name, an operator and an expected value;
acquiring an actual value from a response result according to the key name, and forming a comparison expression by the actual value, the expected value and the operator;
executing the comparison expression to judge whether the condition is met, checking each check condition in a loop and recording the check results;
and after the check is finished, judging whether the whole check passes according to the results of the check conditions.
8. The python-based automatic interface test result generation method according to claim 1, wherein:
the process of writing the test result into the test report and sending the test report by mail comprises the following steps:
reading files under a designated directory, and obtaining a path list of all test case files meeting the conditions;
reading test case data in the test case file, sending a request to acquire interface response, and checking the response;
updating the total case number and the passing case number, writing the test results into the test report, running each test case concurrently through an independent thread, and waiting for all threads to finish;
and sending a report mail after the execution is finished.
CN202311676046.0A 2023-12-08 2023-12-08 Automatic interface test result generation method based on python Active CN117370217B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311676046.0A CN117370217B (en) 2023-12-08 2023-12-08 Automatic interface test result generation method based on python

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311676046.0A CN117370217B (en) 2023-12-08 2023-12-08 Automatic interface test result generation method based on python

Publications (2)

Publication Number Publication Date
CN117370217A (en) 2024-01-09
CN117370217B (en) 2024-06-14

Family

ID=89398854

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311676046.0A Active CN117370217B (en) 2023-12-08 2023-12-08 Automatic interface test result generation method based on python

Country Status (1)

Country Link
CN (1) CN117370217B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117785723A (en) * 2024-02-27 2024-03-29 四川互慧软件有限公司 Dynamic interface parameter association method and device and electronic equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110245083A (en) * 2019-06-11 2019-09-17 四川长虹电器股份有限公司 A kind of automatic interface testing method based on python
CN110287119A (en) * 2019-06-28 2019-09-27 深圳市万睿智能科技有限公司 A kind of automatic interface testing method and device based on python
CN110297774A (en) * 2019-07-02 2019-10-01 四川长虹电器股份有限公司 A kind of automatic interface testing method based on python
CN112306861A (en) * 2020-09-27 2021-02-02 泰山信息科技有限公司 Unittest and Jenkins tool-based interface automatic testing system and method
CN114625633A (en) * 2022-01-26 2022-06-14 科大讯飞股份有限公司 Method, system and storage medium for interface testing
CN114741283A (en) * 2022-03-30 2022-07-12 徐工汉云技术股份有限公司 Automatic interface testing method and device based on python design
CN114880239A (en) * 2022-05-31 2022-08-09 成都秦川物联网科技股份有限公司 Interface automation testing framework and method based on data driving
CN115422063A (en) * 2022-09-06 2022-12-02 宁波数益工联科技有限公司 Low-code interface automation system, electronic equipment and storage medium
CN115437906A (en) * 2021-06-02 2022-12-06 腾讯科技(深圳)有限公司 Test method and device

Also Published As

Publication number Publication date
CN117370217B (en) 2024-06-14

Similar Documents

Publication Publication Date Title
CN109189684B (en) Python-based automatic interface testing method
CN110716870B (en) Automatic service testing method and device
CN111832236B (en) Chip regression testing method and system, electronic equipment and storage medium
CN108628748B (en) Automatic test management method and automatic test management system
Sneed et al. Wsdltest-a tool for testing web services
US7895575B2 (en) Apparatus and method for generating test driver
CN117370217B (en) Automatic interface test result generation method based on python
CN111324526B (en) Interface test system, method and server
CN112181854B (en) Method, device, equipment and storage medium for generating process automation script
US20020116153A1 (en) Test automation framework
CN111737152B (en) Method and device for realizing WEB automatic test by inputting data through webpage
CN114924737A (en) Battery management system source code integration test method and device and electronic equipment
CN113742215A (en) Method and system for automatically configuring and calling test tool to perform test analysis
CN117632710A (en) Method, device, equipment and storage medium for generating test code
CN117931620A (en) Automatic test method for reducing test technical threshold of intelligent terminal system
Li et al. Towards a practical and effective method for web services test case generation
CN111552648A (en) Automatic verification method and system for application
CN115934559A (en) Testing method of intelligent form testing system
CN113238968A (en) System test method, apparatus, device, medium, and program product
CN111367940A (en) Unified development method of bank transaction report, client and server system
CN114647568A (en) Automatic testing method and device, electronic equipment and readable storage medium
CN111813665A (en) Big data platform interface data testing method and system based on python
US20240160559A1 (en) Automated decoupling of unit tests
CN113806222B (en) Interface test script generation method, device, equipment and storage medium
CN118672925A (en) Unit test code generation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant