CN117033234A - Interface testing method, device, equipment and medium - Google Patents

Interface testing method, device, equipment and medium

Info

Publication number
CN117033234A
CN117033234A (application CN202311076484.3A)
Authority
CN
China
Prior art keywords
test case
test
interface
case
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311076484.3A
Other languages
Chinese (zh)
Inventor
李康华
明德
詹楚伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Baolun Electronics Co ltd
Original Assignee
Guangdong Baolun Electronics Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Baolun Electronics Co ltd filed Critical Guangdong Baolun Electronics Co ltd
Priority to CN202311076484.3A priority Critical patent/CN117033234A/en
Publication of CN117033234A publication Critical patent/CN117033234A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses an interface testing method, device, equipment and medium. The method comprises the following steps: in response to an interface case execution task, reading at least one target test case from a pre-constructed test case set; executing each target test case and determining its test case result; and determining an interface test result from the test case results of all target test cases. By automatically executing the target test cases in the test case set to test the interface, the method improves the efficiency and flexibility of interface testing.

Description

Interface testing method, device, equipment and medium
Technical Field
The present invention relates to the field of interface testing technologies, and in particular, to an interface testing method, apparatus, device, and medium.
Background
Existing interface test methods fall into two categories: manual testing and automated testing. Manual testing typically uses tools to test and verify a single interface, while automated testing writes code scripts from test cases, so that interface cases can be executed automatically after the cases are analyzed and encoded. However, existing test methods have poor test effectiveness and low flexibility.
Disclosure of Invention
The invention provides an interface testing method, device, equipment and medium, which are used for improving the testing efficiency and flexibility of interface testing.
According to an aspect of the present invention, there is provided an interface testing method, including:
responding to the interface case execution task, and reading at least one target test case from a pre-constructed test case set;
executing the target test case, and determining a test case test result of the target test case;
and determining an interface test result according to the test case test result of each target test case.
Optionally, on the basis of the above scheme, the construction of the test case set includes:
acquiring interface document data, and analyzing the interface document data to obtain request association parameters and verification rules of the interface document data;
and generating a plurality of test cases based on the request association parameters and the verification rules, and filling each test case into a test case template to obtain a constructed test case set.
Optionally, on the basis of the above solution, the parsing the interface document data to obtain the request association parameters and verification rules of the interface document data includes:
analyzing the interface document data to obtain request association parameters of the interface document data;
the check rule is determined based on a field type of the request association parameter.
Optionally, on the basis of the above scheme, the request association parameter includes at least one of a request method, url, a request header, a request parameter, a parameter type, a response, and an error code.
Optionally, on the basis of the above solution, the executing the target test case, determining a test case test result of the target test case includes:
executing the target test case to obtain an execution result of the target test case;
and determining an expected value of the target test case, and comparing the execution result with the expected value to obtain the test case test result.
Optionally, on the basis of the above solution, the executing the target test case to obtain an execution result of the target test case includes:
and acquiring a global configuration file, and executing the target test case based on the global configuration file to obtain the execution result, wherein the global configuration file comprises at least one of a database connection configuration file, a self-defined result configuration file and a failure retry configuration file.
Optionally, on the basis of the above scheme, the method further includes:
and generating an interface test file based on the interface test result, and sending the interface test file to at least one display platform.
According to another aspect of the present invention, there is provided an interface test apparatus comprising:
the test case reading module is used for responding to the interface case execution task and reading at least one target test case from a pre-constructed test case set;
the case test result acquisition module is used for executing the target test case and determining a test case test result of the target test case;
and the interface test result determining module is used for determining an interface test result according to the test case test result of each target test case.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the interface testing method of any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute an interface testing method according to any one of the embodiments of the present invention.
According to the technical scheme, at least one target test case is read from a pre-constructed test case set in response to an interface case execution task; each target test case is executed and its test case result is determined; and an interface test result is determined from the test case results of all target test cases. By automatically executing the target test cases in the test case set to test the interface, the scheme improves the efficiency and flexibility of interface testing.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of an interface testing method according to a first embodiment of the present invention;
fig. 2 is a schematic structural diagram of a fully automatic execution interface test technology architecture according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of an interface testing device according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flow chart of an interface testing method according to an embodiment of the present invention. The method is applicable to scenarios in which an interface is tested, and may be performed by an interface testing device, which may be implemented in hardware and/or software and configured in an electronic device. As shown in fig. 1, the method includes:
s110, responding to the interface case execution task, and reading at least one target test case from a pre-constructed test case set.
In this embodiment, to automatically execute the interface test, a test case set is constructed in advance, and the test cases in the set are cyclically read as target test cases and executed until all test cases have been executed and a test result of the interface test is obtained. On this basis, when an interface case execution task is detected, any test case is obtained from the pre-constructed test case set as the target test case.
Optionally, the interface case execution task may be triggered by a time condition and/or a code condition. For example, a trigger time may be set so that, when the set trigger time is reached, the interface case execution task is triggered to read the target test case; and/or a program code condition may be set so that, when the running program satisfies the set condition, the interface case execution task is triggered to read the target test case.
In one embodiment, automatic execution of the interface case execution task can be realized with the Jenkins platform: by creating tasks and configuring the code repository in Jenkins, test runs can be triggered in the various ways it provides, such as timing or code submission, thereby triggering automatic running of the program code, i.e., triggering the interface case execution task.
On the basis of the above scheme, the test case set may be constructed manually or automatically. In manual construction, a plurality of test cases are written after manually analyzing the interface document; however, this is time-consuming and inefficient, and subsequent interface changes require manual maintenance and optimization of the test cases at high cost. To solve these technical problems, this embodiment provides a method for automatically constructing a test case set.
In one embodiment of the present invention, the construction of the test case set includes:
acquiring interface document data, and analyzing the interface document data to obtain request association parameters and verification rules of the interface document data;
and generating a plurality of test cases based on the request association parameters and the verification rules, and filling each test case into a test case template to obtain a constructed test case set.
The above embodiment can also construct the test case set by means of the Jenkins platform. An automatic case generation task is created on the Jenkins platform and is actively triggered after development code is submitted to GitLab. The execution code parses the interface document data, converts it into a set data structure, acquires the request association parameters of the interface from that data structure, and derives verification rules such as type checks, required-field checks, and value range checks. The acquired request association parameters are then encoded to automatically combine parameter values, using methods such as fully parameterized traversal combination, key parameter traversal combination, numerical parameter boundary value generation, and specific business scenario rules, to form test cases, which are filled into a test case template to obtain the test case set.
Fully parameterized traversal combination sets all parameters in a parameterized manner and achieves full traversal of the parameter space by permuting and combining the candidate values of every parameter. Key parameter traversal combination identifies only the key parameters of the interface, traverses only their candidate values, and uses default constant values for the other parameters. Numerical boundary value generation produces boundary values of a numeric parameter's defined range, such as the minimum, the maximum, zero, and values just inside the extremes, to cover boundary scenarios. Specific business rule combination identifies related parameter combinations; for example, when parameter A takes the value a, the value range of parameter B changes, so parameter combination cases are generated to cover the business rule.
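The combination strategies above can be sketched in Python; the helper names and parameter layout here are illustrative assumptions, not part of the disclosed scheme:

```python
from itertools import product

def full_traversal(param_values):
    """Fully parameterized traversal: Cartesian product of every candidate
    value of every parameter (hypothetical helper)."""
    names = list(param_values)
    return [dict(zip(names, combo)) for combo in product(*param_values.values())]

def key_param_traversal(param_values, key_params, defaults):
    """Traverse only key parameters; other parameters keep default constants."""
    key_values = {k: param_values[k] for k in key_params}
    cases = []
    for combo in full_traversal(key_values):
        case = dict(defaults)
        case.update(combo)
        cases.append(case)
    return cases

def boundary_values(minimum, maximum):
    """Boundary values of a numeric parameter's defined range:
    the extremes, values just inside them, and zero."""
    return sorted({minimum, minimum + 1, 0, maximum - 1, maximum})

cases = full_traversal({"page": [1, 2], "size": [10, 50]})  # 2 x 2 combinations
```

A business-rule strategy would add hand-written constraints on top of these generators, pruning or extending the combinations per interface.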
Alternatively, the conversion of the data structure may be implemented by means of an existing library; for example, the json and xml libraries in Python may be used to convert the data into a Python dictionary data structure, from which the relevant information is obtained as the request association parameters. The request association parameters may include at least one of a request method, url, request header, request parameters, parameter types, responses, and error codes.
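As a minimal sketch of this conversion step, the following parses a hypothetical fragment of interface document data with the stdlib json library and extracts the request association parameters; the document's field names are assumptions, since the disclosed scheme does not fix a document format:

```python
import json

# Hypothetical fragment of interface document data (Swagger-style JSON).
doc = json.loads("""
{
  "path": "/api/login",
  "method": "POST",
  "headers": {"Content-Type": "application/json"},
  "parameters": [
    {"name": "username", "type": "string", "required": true},
    {"name": "age", "type": "integer", "minimum": 0, "maximum": 150}
  ],
  "responses": {"200": "ok"},
  "errorCodes": {"1001": "bad password"}
}
""")

def extract_request_params(doc):
    """Convert the parsed dictionary into the request association
    parameters: method, url, headers, parameters, responses, error codes."""
    return {
        "method": doc["method"],
        "url": doc["path"],
        "headers": doc.get("headers", {}),
        "params": {p["name"]: p for p in doc.get("parameters", [])},
        "responses": doc.get("responses", {}),
        "error_codes": doc.get("errorCodes", {}),
    }

info = extract_request_params(doc)
```

An XML interface document would follow the same shape, with `xml.etree.ElementTree` in place of `json`.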
Alternatively, the test case template may be constructed based on Excel, and the test case data is filled into the Excel test case template with the openpyxl library to form a complete test case set.
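The template-filling step can be sketched as follows. The patent fills an Excel template via openpyxl; to keep this sketch self-contained, the stdlib csv module stands in for the spreadsheet writer, and the column layout is an assumption:

```python
import csv
import io

# Hypothetical column layout of the test case template.
COLUMNS = ["case_id", "method", "url", "headers", "body", "expected"]

def fill_template(cases):
    """Write generated test cases into a tabular template.  With openpyxl
    the same loop would append rows to a worksheet instead of a csv buffer."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    for case in cases:
        # Missing columns are left blank so every row matches the template.
        writer.writerow({col: case.get(col, "") for col in COLUMNS})
    return buf.getvalue()

sheet = fill_template([
    {"case_id": "TC001", "method": "POST", "url": "/api/login",
     "body": '{"username": "u"}', "expected": "200"},
])
```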
In one embodiment, the parsing the interface document data to obtain the request association parameters and verification rules of the interface document data includes:
analyzing the interface document data to obtain request association parameters of the interface document data;
the check rule is determined based on a field type of the request association parameter.
For example, the obtained request association parameters and the data returned by the interface may be traversed and analyzed field by field to determine the verification rules from the field types. For instance: the type of each field (string, integer, and the like) is acquired, and the type definition determines a basic check rule; whether the field definition contains a required identifier is checked, and a required field is given a non-empty check; an enumeration type has a set of enum values, and whether a value lies in that set must be checked; numbers, character strings, and the like may have range limits such as max/min, for which corresponding range comparison checks are generated; string types may have a pattern defining a regular expression, for which a regular matching check is generated; and so on.
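A minimal sketch of this rule derivation, assuming the field-definition keys shown (they are illustrative, not a fixed format):

```python
def derive_check_rules(field):
    """Derive verification rules from one field definition, following the
    type / required / enum / range / pattern analysis described above."""
    rules = []
    if "type" in field:
        rules.append(("type", field["type"]))        # basic type check
    if field.get("required"):
        rules.append(("non_empty", True))            # required => non-empty
    if "enum" in field:
        rules.append(("in_set", set(field["enum"]))) # value must be in set
    if "minimum" in field or "maximum" in field:
        rules.append(("range", (field.get("minimum"), field.get("maximum"))))
    if "pattern" in field:
        rules.append(("regex", field["pattern"]))    # regular matching check
    return rules

rules = derive_check_rules(
    {"type": "integer", "required": True, "minimum": 0, "maximum": 150})
```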
Interface test cases can thus be automatically generated and maintained without manual intervention, which reduces the workload of testers and the test cost.
S120, executing the target test case, and determining a test case test result of the target test case.
After at least one target test case is obtained, the obtained target test cases may be executed sequentially or concurrently, which is not limited herein.
Execution of the target test case may rely on an application library, the requests library, and the like. For example, the Excel test case data (i.e., the test case set) can be read cyclically with the openpyxl library, the request association parameters assembled by traversing the data, and the response information then obtained, so as to determine the execution result of the target test case.
In one embodiment of the present invention, the executing the target test case, determining a test case test result of the target test case, includes:
executing the target test case to obtain an execution result of the target test case;
and determining an expected value of the target test case, and comparing the execution result with the expected value to obtain the test case test result.
It can be understood that the test case set stores each target test case together with its expected value. On this basis, an execution result is obtained by executing the target test case, and the execution result is then compared with the expected value to obtain the test case result.
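The comparison step can be sketched as a pure function over the execution result and expected value; the field names are illustrative assumptions:

```python
def compare_with_expected(execution_result, expected):
    """Assertion comparison: every expected key must be present in the
    execution result with a matching value.  Returns (passed, details),
    where details records per-field outcomes for the test report."""
    details = []
    for key, want in expected.items():
        got = execution_result.get(key)
        details.append(
            {"field": key, "expected": want, "actual": got, "pass": got == want})
    passed = all(d["pass"] for d in details)
    return passed, details

passed, details = compare_with_expected(
    {"status": 200, "code": 0, "msg": "ok"},   # execution result
    {"status": 200, "code": 0},                # expected value from the case
)
```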
In one specific example, the Excel test case data is read cyclically with the openpyxl library; the data is traversed to assemble the request method, url, request header, expected value, and the like; an HTTP request is then made with the requests library to obtain the response information (i.e., the execution result); the program performs an assertion comparison between the response information and the case's expected value to obtain the test result; and finally the test result and assertion details are written into an Excel table, completing a basically complete test flow.
On the basis of the above scheme, the executing the target test case to obtain an execution result of the target test case comprises the following steps:
and acquiring a global configuration file, and executing the target test case based on the global configuration file to obtain the execution result, wherein the global configuration file comprises at least one of a database connection configuration file, a self-defined result configuration file and a failure retry configuration file.
To further ensure the accuracy of the test results, information such as the number of failures, the test result criteria, and database associations can be flexibly defined. For example, the database connection configuration sets the connection string and account password in the configuration file, which is read and used to establish a connection when the test framework initializes; the custom result configuration provides user-defined criteria for judging interface success or failure, such as a success status code or a response time threshold, implementing custom success judgment logic; the failure retry configuration sets the number of failure retries in the configuration file, and when a request fails, the remaining retry count is decremented and the call is made recursively to retry the specified number of times; and test case filtering screens the executed test cases by label or name, so that cases for different scenarios can be executed; and so on.
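The failure retry mechanism described above (decrement the remaining count and call recursively) can be sketched as follows; the configuration keys and the simulated flaky case are assumptions for illustration:

```python
# Hypothetical global configuration, as it might be read from a file.
CONFIG = {"retry_times": 3, "success_status": 200}

def execute_with_retry(run_case, retries_left):
    """On failure, decrement the remaining retries and recurse, up to the
    configured number of retries; return the final result."""
    result = run_case()
    if result.get("status") == CONFIG["success_status"]:
        return result, retries_left
    if retries_left > 0:
        return execute_with_retry(run_case, retries_left - 1)
    return result, 0

attempts = {"n": 0}

def flaky_case():
    """Simulated execution: fails twice, then succeeds."""
    attempts["n"] += 1
    return {"status": 200 if attempts["n"] >= 3 else 500}

result, remaining = execute_with_retry(flaky_case, CONFIG["retry_times"])
```

A real framework would also consult the custom result configuration here, e.g. treating any status in a configured success set as a pass.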
When the target test case is executed, it can be executed in combination with the global configuration file. Taking the database connection configuration as an example, data in the database can be acquired through it; taking the custom result configuration as an example, the judgment result can be determined through the custom configuration; taking the failure retry configuration as an example, the number of failed executions of the target test case can be determined through it.
S130, determining an interface test result according to the test case test result of each target test case.
After all target test cases have been executed, the test case results of all target test cases are obtained and integrated to obtain the interface test result. For example, after testing completes, the test data and test case results are integrated, and statistics such as a test summary and the numbers of successful and failed cases are built into markup language or rich text to generate the interface test result.
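The integration step can be sketched as a small aggregation over per-case results; the markup layout is an illustrative assumption:

```python
def build_summary(case_results):
    """Integrate per-case results into an interface test result: counts of
    passed/failed cases rendered as simple markup text."""
    passed = sum(1 for r in case_results if r["pass"])
    failed = len(case_results) - passed
    lines = [
        "# Interface Test Summary",
        f"- total cases: {len(case_results)}",
        f"- passed: {passed}",
        f"- failed: {failed}",
    ]
    return "\n".join(lines)

report = build_summary([
    {"case": "TC001", "pass": True},
    {"case": "TC002", "pass": False},
])
```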
On the basis of the scheme, the method further comprises the following steps:
and generating an interface test file based on the interface test result, and sending the interface test file to at least one display platform.
To provide a feedback mechanism in which test results can be fed back on multiple platforms, this embodiment provides an interface test result display method: an interface test file is generated from the interface test result and sent to each display platform through the channel associated with that platform.
For example, the test results may be pushed to a mailbox or presented in another interactive APP. For mail, an email module is imported in the test framework, a connection to the mailbox server is made via the SMTP protocol, the mail content is constructed, and the test report is sent as an attachment; for an APP, a webhook is obtained with the APP developer tool, code for sending a message is added at the end of the test script, and the webhook API of the APP is called to send the test result message. The method provided by this embodiment can produce multi-platform, multi-mode integrated display reports, meeting the report presentation requirements of different scenarios.
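A minimal sketch of the mail path using the stdlib email package: the message is constructed with the report as an attachment, and the actual send via `smtplib.SMTP` is omitted so the sketch has no network dependency. Addresses and the attachment name are placeholders:

```python
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def build_report_mail(sender, receiver, report_text, attachment_bytes):
    """Construct the test-report mail: plain-text body plus the report
    file as an attachment.  Sending would use smtplib.SMTP(...).send_message."""
    msg = MIMEMultipart()
    msg["Subject"] = "Interface test report"
    msg["From"] = sender
    msg["To"] = receiver
    msg.attach(MIMEText(report_text, "plain", "utf-8"))
    part = MIMEApplication(attachment_bytes, Name="report.xlsx")
    part["Content-Disposition"] = 'attachment; filename="report.xlsx"'
    msg.attach(part)
    return msg

msg = build_report_mail("ci@example.com", "qa@example.com",
                        "passed: 9, failed: 1", b"binary-report")
```

The webhook path would instead POST a JSON payload of the same summary to the APP's webhook URL.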
According to the technical scheme, at least one target test case is read from a pre-constructed test case set in response to an interface case execution task; each target test case is executed and its test case result is determined; and an interface test result is determined from the test case results of all target test cases. By automatically executing the target test cases in the test case set to test the interface, the scheme improves the efficiency and flexibility of interface testing.
Example two
Fig. 2 is a schematic structural diagram of a fully automatic execution interface test technology architecture according to a second embodiment of the present invention, and this embodiment provides a preferred embodiment based on the foregoing embodiment. As shown in fig. 2, a fully automatic execution interface test technology architecture is schematically shown.
The program is integrated into the Jenkins platform; a self-developed parser automatically analyzes the interface document, acquires the relevant information, combines it to generate test data, and writes the test data into an Excel test case template to form a complete test case set. When the execution program runs, each row of case parameters is read from the Excel case table and requested through the execution code; execution proceeds sequentially until all case data have been executed and a test result is generated; finally the test result is sent to the corresponding display platform and the relevant personnel are notified.
The two main tasks performed overall are described in detail as follows:
task one: the automatic generation and maintenance of test cases mainly comprises
1. Create an automatic case generation task on the Jenkins platform; execution of the task is actively triggered after development code is submitted to GitLab.
2. The execution code parses the interface document, converts the data into a Python dictionary data structure using the json and xml libraries in Python, acquires related information such as the request method, url, request header, request parameters, parameter types, responses, and error codes of the interface, and at the same time derives various verification rules, including type checks, required-field checks, and value range checks.
3. The obtained interface parameters are encoded to automatically combine parameter values, using methods such as fully parameterized traversal combination, key parameter traversal combination, numerical parameter boundary value generation, and specific business scenario rules, to form test case data.
4. The test case data is filled into an Excel test case template with the openpyxl library to form a complete test case set.
The technical means for automatically generating interface cases are as follows:
1. Parse the document using multiple parsing libraries (combined extraction of Swagger, JSON, HTML, and XML) to obtain related information such as the request method, url, request header, request parameters, parameter types, responses, and error codes of the interface.
2. The verification rules can be determined by traversing the obtained fields and the data returned by the interface and analyzing them field by field.
3. Traversal combinations of the parameters may be generated in combination.
Task two: the automatic execution interface use case is tested and mainly comprises the following steps:
1. Create a task for automatically executing cases on the Jenkins platform and execute it periodically at scheduled times.
2. The Excel test case data is read cyclically with the openpyxl library; the data is traversed to assemble the request method, url, request header, expected value, and the like; an HTTP request is made with the requests library to obtain the response information; the program performs an assertion comparison between the response information and the case's expected value to obtain the test result; and finally the test result and assertion details are written into an Excel table, completing a basically complete test flow.
3. Flexible mechanisms, such as a custom number of failure reruns and custom test result criteria, are provided through the configuration file: during execution the global configuration file is read, including the database connection configuration, custom result configuration, failure retry configuration, and the like.
4. Where operations such as database query matching are involved, a connection query can be performed through the target SQL statement in the Excel table, and the accuracy of the data judged precisely in combination with the assertion comparison results.
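This database verification step can be sketched with the stdlib sqlite3 module; the in-memory database, table, and SQL stand in for the project database and a hypothetical Excel case row, since the disclosed scheme does not fix a schema:

```python
import sqlite3

# In-memory database stands in for the project database under test.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, status INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1)")
conn.commit()

def verify_against_db(conn, sql, expected_row):
    """Run the case's target SQL statement and compare the fetched row
    with the expected value, combining a DB query with assertion comparison."""
    row = conn.execute(sql).fetchone()
    return row == expected_row

ok = verify_against_db(
    conn,
    "SELECT name, status FROM users WHERE name = 'alice'",  # target SQL
    ("alice", 1),                                           # expected value
)
```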
5. When the test results need to be fed back on multiple platforms, all the test data are saved in a file, including the request and response data during the test, logs of failure results, failure error codes, and the like; the data are integrated and then sent to each platform.
The technical means for realizing the full-automatic execution interface test comprises the following steps:
1. Automatic execution of interface cases uses the Jenkins platform for continuous integration, triggering test runs in the various ways it provides, such as timing and code submission, thereby triggering automatic running of the program code.
2. The number of failure reruns and the test result criteria are customized through the configuration file, allowing flexible definition.
3. The test data and results are integrated after the test is completed, and the test results are fed back to each platform across the full process.
According to the technical scheme: 1. interface test cases can be automatically generated and maintained without manual intervention, reducing the workload of testers and the test cost; 2. the interface test can be executed automatically, so test cases run quickly and test reports are generated, greatly improving test efficiency; 3. the number of failures, the test result criteria, and the like can be flexibly defined, ensuring the accuracy of the test results; and 4. multi-platform, multi-mode integrated display reports meet the report presentation requirements of different scenarios.
Example III
Fig. 3 is a schematic structural diagram of an interface testing device according to a third embodiment of the present invention. As shown in fig. 3, the apparatus includes:
the test case reading module 310 is configured to read at least one target test case from a pre-built test case set in response to the interface case executing task;
the case test result obtaining module 320 is configured to execute the target test case, and determine a test case test result of the target test case;
the interface test result determining module 330 is configured to determine an interface test result according to the test case test result of each of the target test cases.
According to the technical scheme, at least one target test case is read from a pre-built test case set in response to the interface case execution task; the target test case is executed, and a test case test result of the target test case is determined; and an interface test result is determined according to the test case test result of each target test case. The target test cases in the test case set are thus executed automatically to perform the interface test, improving the efficiency and flexibility of the interface test.
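The three modules of Fig. 3 can be sketched as plain functions: reading target cases, executing them, and aggregating into an interface test result. The case set and the pass criterion here are illustrative assumptions:

```python
# Hypothetical test case set; each case carries a callable that stands in
# for the real request execution and assertion.
test_case_set = [
    {"id": "TC001", "run": lambda: True},
    {"id": "TC002", "run": lambda: True},
]

def read_target_cases(case_set):
    """Module 310: read target test cases from the pre-built set."""
    return list(case_set)

def execute_case(case):
    """Module 320: execute one case and record its test result."""
    return {"id": case["id"], "passed": case["run"]()}

def interface_result(case_results):
    """Module 330: the interface passes only if every case passed."""
    return all(r["passed"] for r in case_results)

cases = read_target_cases(test_case_set)
print(interface_result([execute_case(c) for c in cases]))  # True
```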
Optionally, on the basis of the above scheme, the device further includes a test case set construction module, configured to:
acquiring interface document data, and analyzing the interface document data to obtain request association parameters and verification rules of the interface document data;
and generating a plurality of test cases based on the request association parameters and the verification rules, and filling each test case into a test case template to obtain a constructed test case set.
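A sketch of this construction flow: parse interface-document data into request-association parameters, then generate one case per parameter and fill it into a template. The document shape and template fields are assumptions for illustration:

```python
# Hypothetical interface-document data.
interface_doc = {
    "url": "/api/user", "method": "POST",
    "params": {"name": {"type": "string"}, "age": {"type": "int"}},
}

def parse_doc(doc):
    """Extract request-association parameters from the document data."""
    return {"url": doc["url"], "method": doc["method"],
            "params": doc["params"]}

def generate_cases(assoc):
    """Generate a test case per parameter, filled into a shared template."""
    template = {"url": assoc["url"], "method": assoc["method"]}
    cases = []
    for field, meta in assoc["params"].items():
        case = dict(template)
        case.update({"field": field, "type": meta["type"]})
        cases.append(case)
    return cases

case_set = generate_cases(parse_doc(interface_doc))
print(len(case_set))  # 2
```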
Optionally, based on the above scheme, the test case set construction module is specifically configured to:
analyzing the interface document data to obtain request association parameters of the interface document data;
the check rule is determined based on a field type of the request association parameter.
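One way to realize this step is a lookup table mapping each field type to a check (verification) rule; the rule table below is an assumption for illustration:

```python
# Hypothetical mapping from field type to a verification predicate.
CHECK_RULES = {
    "string": lambda v: isinstance(v, str),
    # Exclude bool, since bool is a subclass of int in Python.
    "int": lambda v: isinstance(v, int) and not isinstance(v, bool),
}

def check_rule_for(field_type):
    """Determine the check rule from the request parameter's field type."""
    return CHECK_RULES[field_type]

print(check_rule_for("int")(42), check_rule_for("int")("42"))  # True False
```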
Optionally, on the basis of the above scheme, the request association parameter includes at least one of a request method, url, a request header, a request parameter, a parameter type, a response, and an error code.
Optionally, based on the above scheme, the use case test result obtaining module 320 is specifically configured to:
executing the target test case to obtain an execution result of the target test case;
and determining an expected value of the target test case, and comparing the execution result with the expected value to obtain the test case test result.
Optionally, based on the above scheme, the use case test result obtaining module 320 is specifically configured to:
and acquiring a global configuration file, and executing the target test case based on the global configuration file to obtain the execution result, wherein the global configuration file comprises at least one of a database connection configuration file, a self-defined result configuration file and a failure retry configuration file.
Optionally, on the basis of the above scheme, the device further includes a test result display module, configured to:
and generating an interface test file based on the interface test result, and sending the interface test file to at least one display platform.
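A sketch of this display module: render the interface test result into a report payload and "send" it to each configured display platform. The platform list and the send stub are assumptions:

```python
def build_report(interface_result):
    """Generate the interface test file payload from the test result."""
    return {"summary": "passed" if interface_result["passed"] else "failed",
            "detail": interface_result}

def send_to_platforms(report, platforms):
    """Stand-in for pushing the report to dashboards, chat bots, etc."""
    return {p: f"delivered report: {report['summary']}" for p in platforms}

report = build_report({"passed": True, "cases": 5})
print(send_to_platforms(report, ["dashboard", "chat"]))
```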
The interface testing device provided by the embodiment of the invention can execute the interface testing method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention. The electronic device 10 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or the computer program loaded from the storage unit 18 into the RAM 13. The RAM 13 may also store various programs and data required for the operation of the electronic device 10. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine-learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the interface testing method.
In some embodiments, the interface testing method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the interface test method described above may be performed. Alternatively, in other embodiments, processor 11 may be configured to perform the interface test method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuit systems, Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
The computer program used to implement the interface test method of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
Example five
The fifth embodiment of the present invention also provides a computer-readable storage medium storing computer instructions configured to cause a processor to perform an interface testing method, the method comprising:
responding to the interface case execution task, and reading at least one target test case from a pre-constructed test case set;
executing the target test case, and determining a test case test result of the target test case;
and determining an interface test result according to the test case test result of each target test case.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service scalability found in traditional physical hosts and VPS services.
It should be appreciated that the various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. An interface testing method, comprising:
responding to the interface case execution task, and reading at least one target test case from a pre-constructed test case set;
executing the target test case, and determining a test case test result of the target test case;
and determining an interface test result according to the test case test result of each target test case.
2. The method of claim 1, wherein the constructing of the set of test cases comprises:
acquiring interface document data, and analyzing the interface document data to obtain request association parameters and verification rules of the interface document data;
and generating a plurality of test cases based on the request association parameters and the verification rules, and filling each test case into a test case template to obtain a constructed test case set.
3. The method according to claim 2, wherein said parsing the interface document data to obtain request parameters and verification rules of the interface document data comprises:
analyzing the interface document data to obtain request association parameters of the interface document data;
the check rule is determined based on a field type of the request association parameter.
4. The method of claim 2, wherein the request-associated parameters include at least one of a request method, url, request header, request parameters, parameter type, response, error code.
5. The method of claim 1, wherein the executing the target test case, determining a test case test result for the target test case, comprises:
executing the target test case to obtain an execution result of the target test case;
and determining an expected value of the target test case, and comparing the execution result with the expected value to obtain the test case test result.
6. The method of claim 5, wherein the executing the target test case to obtain an execution result of the target test case comprises:
and acquiring a global configuration file, and executing the target test case based on the global configuration file to obtain the execution result, wherein the global configuration file comprises at least one of a database connection configuration file, a self-defined result configuration file and a failure retry configuration file.
7. The method as recited in claim 1, further comprising:
and generating an interface test file based on the interface test result, and sending the interface test file to at least one display platform.
8. An interface testing apparatus, comprising:
the test case reading module is used for responding to the interface case execution task and reading at least one target test case from a pre-constructed test case set;
the case test result acquisition module is used for executing the target test case and determining a test case test result of the target test case;
and the interface test result determining module is used for determining an interface test result according to the test case test result of each target test case.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the interface testing method of any one of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to perform the interface testing method of any one of claims 1-7.
CN202311076484.3A 2023-08-24 2023-08-24 Interface testing method, device, equipment and medium Pending CN117033234A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311076484.3A CN117033234A (en) 2023-08-24 2023-08-24 Interface testing method, device, equipment and medium


Publications (1)

Publication Number Publication Date
CN117033234A true CN117033234A (en) 2023-11-10

Family

ID=88622662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311076484.3A Pending CN117033234A (en) 2023-08-24 2023-08-24 Interface testing method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN117033234A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110134584A (en) * 2019-04-12 2019-08-16 深圳壹账通智能科技有限公司 A kind of generation method, device, storage medium and the server of interface testing use-case
CN111858376A (en) * 2020-07-29 2020-10-30 平安养老保险股份有限公司 Request message generation method and interface test method
CN111949537A (en) * 2020-08-14 2020-11-17 北京锐安科技有限公司 Interface test method, device, equipment and medium
CN112948233A (en) * 2020-07-30 2021-06-11 深圳市明源云链互联网科技有限公司 Interface testing method, device, terminal equipment and medium
CN113342679A (en) * 2021-06-29 2021-09-03 汇付天下有限公司 Interface test method and test device
CN114816993A (en) * 2022-03-22 2022-07-29 百安居信息技术(上海)有限公司 Full link interface test method, system, medium and electronic equipment
CN114817024A (en) * 2022-04-21 2022-07-29 深圳市商汤科技有限公司 Use case generation method and device, equipment and storage medium
CN115017047A (en) * 2022-06-06 2022-09-06 中邮信息科技(北京)有限公司 Test method, system, equipment and medium based on B/S architecture
CN116107885A (en) * 2023-01-06 2023-05-12 济南浪潮数据技术有限公司 Interface testing method, device, equipment and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Tang Wen (ed.), "Python Automated Testing: Introduction and Advanced Practice", February 28, 2021, Beijing: China Machine Press, pages 213-222 *
Ningmengban (Lemon Class), "Enterprise-Level Automated Testing from Beginner to Master, Python Edition", July 31, 2022, Tianjin: Tianjin Science and Technology Press, pages 11-13 *

Similar Documents

Publication Publication Date Title
EP3859533A2 (en) Method and apparatus for testing map service, electronic device, storage medium and computer program product
JP7289334B2 (en) Methods and apparatus, electronic devices, storage media and computer programs for testing code
CN108628748B (en) Automatic test management method and automatic test management system
CN110908890A (en) Automatic test method and device for interface
CN113657088A (en) Interface document analysis method and device, electronic equipment and storage medium
CN113127357A (en) Unit testing method, device, equipment, storage medium and program product
CN108959508B (en) SQL data generation method and device
CN117724980A (en) Method and device for testing software framework performance, electronic equipment and storage medium
CN116303013A (en) Source code analysis method, device, electronic equipment and storage medium
CN117033234A (en) Interface testing method, device, equipment and medium
CN115934550A (en) Test method, test device, electronic equipment and storage medium
CN115600038A (en) Page rendering method, device, equipment and medium
CN115017047A (en) Test method, system, equipment and medium based on B/S architecture
CN114003497A (en) Method, device and equipment for testing service system and storage medium
CN115361290B (en) Configuration comparison method, device, electronic equipment and storage medium
CN114238149A (en) Batch testing method of accounting system, electronic device and storage medium
CN118133794B (en) Table configuration method, apparatus, device and storage medium
CN113742225B (en) Test data generation method, device, equipment and storage medium
CN117493203A (en) Method, device, equipment and storage medium for testing server software
CN117931155A (en) Code generation method, device, equipment and medium
CN117056222A (en) Interface test file generation method and device, electronic equipment and storage medium
CN117670236A (en) Mobile-terminal-based to-be-handled flow approval method, device, equipment and medium
CN116562242A (en) Data report construction method and device, electronic equipment and storage medium
CN116578555A (en) Data verification method, system, electronic equipment and storage medium
CN118051439A (en) API automatic test case generation method and device, electronic equipment and readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination