CN113238940A - Interface test result comparison method, device, equipment and storage medium - Google Patents
- Publication number
- CN113238940A (application CN202110518099.4A)
- Authority
- CN
- China
- Prior art keywords
- test environment
- interface
- test
- result
- comparison
- Prior art date
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3692—Test management for test results analysis
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The embodiments of the invention disclose a method, device, equipment and storage medium for comparing interface test results, relating to the field of automated programming. The method comprises: obtaining data to be tested from the bedding data of a first test environment according to the pre-screening condition of each interface automation test case in an interface automation test case library, and constructing an interface request message from the data to be tested and the interface automation test case; sending the interface request message to the first test environment and a second test environment respectively, and obtaining each environment's response result to the message; and obtaining an interface test comparison result of the two environments from those response results. The technical scheme realizes automated interface testing and automated comparison of interface test results, improves comparison efficiency, and, by comparing both response messages and database records, achieves fine-grained comparison of interface test results.
Description
Technical Field
The embodiments of the invention relate to the field of automated programming, and in particular to a method, device, equipment and storage medium for comparing interface test results.
Background
During software development, architecture changes or system performance improvements, such as database and table sharding or distributed transformation, require the refactoring of background services. At that point the business functions exposed externally by the old and new systems remain essentially unchanged, and the impact of the logic changes is concentrated at the systems' interfaces. Interface testing of the old and new systems, and comparison of their test results, therefore becomes especially important.
In the prior art, interface testing of new and old versions of a software system generally intercepts transaction flows or messages in bulk, initiates transactions in batches through an interface test tool after simple processing, and then relies on additionally written scripts to compare and verify the execution results in a centralized manner.
Disclosure of Invention
The embodiments of the invention provide a method, device, equipment and storage medium for comparing interface test results, which reuse existing automated test assets to compare interface test results between an original system environment and an iterated system environment.
In a first aspect, an embodiment of the present invention provides a method for comparing interface test results, including:
obtaining data to be tested from the bedding data of a first test environment according to the pre-screening condition of each interface automation test case in an interface automation test case library, and constructing an interface request message from the data to be tested and the interface automation test case;
sending the interface request message to the first test environment and a second test environment respectively, and obtaining each test environment's response result to the interface request message, wherein a response result comprises a response message and a database record;
and obtaining an interface test comparison result of the first test environment and the second test environment according to the two environments' response results to the interface request message.
In a second aspect, an embodiment of the present invention provides an interface test result comparison apparatus, including:
an interface request message obtaining module, configured to obtain data to be tested from the bedding data of a first test environment according to the pre-screening condition of each interface automation test case in an interface automation test case library, and to construct an interface request message from the data to be tested and the interface automation test case;
a response result obtaining module, configured to send the interface request message to the first test environment and a second test environment respectively, and to obtain each test environment's response result to the interface request message, wherein a response result comprises a response message and a database record;
and a comparison execution module, configured to obtain an interface test comparison result of the first test environment and the second test environment according to the two environments' response results to the interface request message.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the method for comparing interface test results according to any embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a storage medium containing computer-executable instructions, where the computer-executable instructions, when executed by a computer processor, implement the method for comparing interface test results according to any embodiment of the present invention.
In the technical scheme provided by the embodiments of the invention, an interface request message is constructed from the bedding data of the first test environment and an existing interface automation test case; the message is then sent to the first test environment and the second test environment respectively, each environment's response result is obtained, and the interface test comparison result of the two environments is finally derived from those response results. This realizes automated interface testing and automated comparison of interface test results, saves the time cost of writing test scripts and the labor cost of manual inspection, and improves the efficiency of comparing interface test results; at the same time, by comparing both response messages and database records, it achieves fine-grained comparison of interface test results.
Drawings
Fig. 1A is a flowchart of a method for comparing interface test results according to an embodiment of the present invention;
FIG. 1B is a flowchart illustrating an interface test according to an embodiment of the present invention;
FIG. 1C is a schematic diagram of an association path of a database according to an embodiment of the present invention;
fig. 2 is a block diagram of a structure of an interface test result comparison apparatus according to a second embodiment of the present invention;
fig. 3 is a block diagram of an electronic device according to a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1A is a flowchart of a method for comparing interface test results according to Embodiment One of the present invention. The embodiment is applicable to comparing interface test results between an original system environment and an iterated system environment, in particular between an original financial transaction system running stably online and an iterated financial transaction system about to go online. The method may be executed by the interface test result comparison apparatus provided by the embodiments of the invention, which may be implemented in software and/or hardware and integrated in an electronic device, typically a server that runs both the original and the iterated financial transaction system. The method specifically comprises the following steps:
s110, acquiring data to be tested through the bottom-laying data of the first testing environment according to the pre-screening conditions of the automatic testing cases of each interface in the automatic testing case library of the interfaces, and constructing an interface request message according to the data to be tested and the automatic testing cases of the interfaces.
The interface automation test case library is a pre-built case set containing multiple interface automation test cases. Each case contains complete business execution logic; for example, a case that executes a 1000-yuan transfer contains the execution logic of the corresponding transfer transaction. Each case may also contain descriptive information such as case name, transaction code, case property, case type, case summary and case label, together with a pre-screening condition, a post-screening condition and/or assertion checkpoints. The pre-screening and post-screening conditions are the screening conditions applied during the pre-processing and post-processing of the case, respectively. Using its pre-screening condition, a case can extract data to be tested from the bedding data in a targeted manner, which raises the success rate of the automated interface test. A pre-screening condition may cover account property, account type, account balance and/or contract status. Taking the case above as an example, a case that performs a 1000-yuan transfer must deduct 1000 yuan from the request-initiating account, so its pre-screening condition is defined as selecting accounts whose balance is greater than or equal to 1000 yuan; the transfer transaction therefore cannot fail because the account balance is less than 1000 yuan.
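As an illustrative sketch only (the patent publishes no code, and all names here are hypothetical), the pre-screening step can be pictured as filtering the bedding data with the case's condition:

```python
# Hypothetical sketch of pre-screening: select bedding-data accounts
# that satisfy a case's pre-screening condition (e.g. balance >= 1000).

def pre_screen(bedding_accounts, condition):
    """Return the accounts from the bedding data matching the condition."""
    return [acct for acct in bedding_accounts
            if all(acct.get(k) == v
                   for k, v in condition.get("equals", {}).items())
            and acct.get("balance", 0) >= condition.get("min_balance", 0)]

bedding = [
    {"account": "A001", "type": "savings", "balance": 500},
    {"account": "A002", "type": "savings", "balance": 2500},
]
# Case performing a 1000-yuan transfer: require balance >= 1000.
eligible = pre_screen(bedding, {"min_balance": 1000})
print([a["account"] for a in eligible])  # ['A002']
```

The account with only 500 yuan is excluded up front, so the transfer case is never driven with data that would make the request fail.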
Bedding data is data obtained by importing the production data of the system under test into the test environment after screening, desensitization and similar operations, and then adding some business and technical parameters; its purpose is to guarantee diverse coverage of data scenarios in the test environment. In the embodiments of the invention, the first test environment is the original system environment running stably online, and the second test environment is the iterated new system environment about to go online; the same or similar bedding data is available in both. After the data to be tested is extracted from the bedding data of the first test environment according to the case's pre-screening condition, it is filled into the interface automation test case as the values (that is, the actual values) of the message input fields. The filled case thus contains both complete business execution logic and message input field values that are actual information in the system under test, for example an actual account name.
Optionally, in the embodiments of the invention, constructing the interface request message from the data to be tested and the automation test case comprises: obtaining the message values of the message input fields from the data to be tested and the automation test case, and constructing the interface request message by de-parameterizing those message values. Before filling, the value of a message input field in the case may be in parameterized form, that is, an expression produced by parameterizing the field value. After the data to be tested is obtained, the expression in the case must be replaced by the data to be tested, that is, a de-parameterization operation is performed, yielding the case with its values substituted.
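A minimal sketch of this filling step, with hypothetical field names and a `${...}` placeholder syntax assumed purely for illustration: the parameter expressions in the case template are replaced with actual values drawn from the screened bedding data.

```python
import re

# Hypothetical sketch of de-parameterization: substitute each ${name}
# placeholder in a case template with its actual value.

def build_request(template, data):
    """Return a request message with all placeholders replaced by values."""
    return {field: re.sub(r"\$\{(\w+)\}",
                          lambda m: str(data[m.group(1)]),
                          value)
            for field, value in template.items()}

case_template = {"payer": "${account}", "amount": "1000", "currency": "CNY"}
request = build_request(case_template, {"account": "A002"})
print(request)  # {'payer': 'A002', 'amount': '1000', 'currency': 'CNY'}
```

The resulting message carries a real account from the system under test rather than a fixed value captured from old logs.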
Compare this with the traditional technical scheme, in which interface test cases use transaction flow information intercepted from system logs as request messages. Such flow information contains no descriptive information, so the business scenario of a case cannot be accurately identified; the number of interface test cases becomes excessive, the test scenarios are highly repetitive, testing takes a long time, and the scenarios are poorly targeted. In the embodiments of the invention, test cases for different test scenarios are selected from the existing automated interface test case asset library through the pre-screening conditions to form an effective case set for execution, which solves the problems of excessive case data volume, weak targeting and high scenario repetition. Moreover, because intercepted transaction flow information is fixed data that depends on the system environment at capture time, the execution success rate of such request messages is low; for example, when a transaction request is executed a second time, the account may no longer hold a balance of 1000 yuan after the first execution deducted 1000 yuan, so subsequent repetitions fail. In the embodiments of the invention, the test data is acquired dynamically through the pre-screening condition, which effectively excludes invalid inputs inconsistent with the bedding data in the system environment and markedly improves the execution success rate.
S120, sending the interface request message to the first test environment and the second test environment respectively, and obtaining each test environment's response result to the interface request message, wherein a response result comprises a response message and a database record.
The response message is the test environment's output message for the interface request message. The database record is the data record in a specific database of interest: after the test environment performs a data-changing operation (such as an insert or update) on a data table in response to the interface request message, the record is a row of that table, composed of several field values and representing one complete group of related information. Specifically, after the interface request message is constructed, it may be sent to the first and second test environments simultaneously, which guarantees that the same message is used in both environments and improves execution efficiency.
Optionally, in the embodiments of the invention, sending the interface request message to the two test environments and obtaining their response results comprises: sending the interface request message to the first test environment, obtaining the first test environment's response result, and judging from that response result whether the assertion result of the first test environment is correct; and, if the assertion result of the first test environment is correct, sending the interface request message to the second test environment and obtaining the second test environment's response result. An assertion is a check that judges whether an expression at a certain point in the program code is true; its purpose is to guarantee the stability of the system environment and the correctness of the code. Assertions can determine whether the message values of the message input fields in the interface request message are correct.
For example, as shown in fig. 1B, after the first test environment has responded to the interface request message and the database has been updated, assertion checks can be performed using the pre-processing, the message input fields and the post-processing of the request. For a single transfer transaction, the pre-processing queries the account balance before the transfer request is sent, the message input fields include the transfer amount (the occurrence amount), and the post-processing queries the account balance recorded in the database after the request is answered; whether the message value is correct is judged from the relationship among the three, that is, whether the pre-transfer balance minus the transfer amount equals the post-transfer balance. If the assertion result in the first test environment is correct, the current interface test case has passed the first environment's check (its message input field values are correct) and can continue to be used in the second test environment. If the assertion result is wrong, the case itself is problematic and need not be sent to the second environment; a first alarm prompt is issued instead, which avoids wasting resources of the second test environment on a defective case.
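The three-way check described above can be sketched as follows (a hypothetical illustration; the patent does not publish code):

```python
# Hypothetical sketch of the transfer assertion:
# pre-processing balance - transfer amount == post-processing balance.

def assert_transfer(balance_before, transfer_amount, balance_after):
    """Return True if the response is arithmetically consistent."""
    return balance_before - transfer_amount == balance_after

# Pre-processing queried 2500, the message carried a 1000-yuan transfer,
# post-processing queried 1500: the assertion holds.
print(assert_transfer(2500, 1000, 1500))  # True
# A post-transfer balance of 1600 would indicate a defective case or system.
print(assert_transfer(2500, 1000, 1600))  # False
```

Only when this check passes in the first environment is the same message forwarded to the second environment.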
Optionally, in the embodiments of the invention, after the interface request message is sent to the second test environment and the second test environment's response result is obtained, the method further comprises: judging from that response result whether the assertion result of the second test environment is correct. Obtaining the interface test comparison result then comprises: if the assertion result of the second test environment is correct, obtaining the interface test comparison result of the two environments from their response results to the interface request message. If the assertion result in the second test environment is also correct, the case has passed the checks of both environments and a basis for comparison exists, that is, in both system test environments the message input field values check out as correct. If the assertion of the second test environment is wrong, the current case is correct but the second environment contains a code bug, or its bedding data is inconsistent with that of the first environment; a second alarm prompt is issued accordingly. Its alarm content clearly differs from that of the first alarm prompt: the first relates to an error in the current interface test case, the second to an error in the second test environment.
Optionally, in the embodiments of the invention, obtaining the response results of the two test environments comprises: determining the matching database instance based on a system environment identifier and/or a database name. In the post-processing of the interface test, the association path of the database record, that is, the access path of the database, is defined in advance. If the first and second test environments had the same database structure, they would share the same database access path; in practice, however, an upgrade of the system environment may adjust the database structure. Taking fig. 1C as an example, the database in the first test environment is a single database, while in the second test environment it has been sharded, so the storage locations of the data tables, and therefore the access paths of the database records, clearly differ. The database instance to connect to can therefore be determined through the system environment identifier (the identifier distinguishing the first test environment from the second) and/or the name of the database to be accessed. In particular, a test environment's records may span multiple data tables, that is, after responding to the interface request message the environment may need to perform several actions on data tables. For example, after a transfer request is answered, the account balance must be deducted and the account's available amount must be changed, so separate actions are performed on different data tables.
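A minimal sketch of the instance-routing idea, with connection strings and shard names that are purely hypothetical:

```python
# Hypothetical sketch: route a database-record lookup to the right
# instance using the environment identifier and/or database name.

INSTANCES = {
    # (environment id, database name) -> connection string (all invented)
    ("env1", "core"): "db://env1-single/core",
    ("env2", "core_shard_a"): "db://env2-shard-a/core",
    ("env2", "core_shard_b"): "db://env2-shard-b/core",
}

def resolve_instance(env_id, db_name):
    """Return the connection string for the matching database instance."""
    try:
        return INSTANCES[(env_id, db_name)]
    except KeyError:
        raise LookupError(f"no instance for {env_id}/{db_name}")

print(resolve_instance("env1", "core"))          # db://env1-single/core
print(resolve_instance("env2", "core_shard_b"))  # db://env2-shard-b/core
```

The same post-processing query can then run against either environment even though the second environment's tables live in different shards.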
S130, obtaining the interface test comparison result of the first test environment and the second test environment according to the two environments' response results to the interface request message.
After the response results of the current interface request message (that is, of the current interface test case) have been obtained in both test environments, the differences between the two response results can be compared and then investigated further.
In the prior art, the comparison of database records is usually performed only after a batch of cases has executed, so the execution results of several cases are very likely superimposed on the same database record, and the resulting interface test result carries a large error; comparing each case individually against its own response result avoids this superposition.
Optionally, in the embodiments of the invention, obtaining the interface test comparison result from the two environments' response results comprises: obtaining a response message comparison table from the first test environment's response message and the second test environment's response message for the interface request message; and obtaining a database record comparison table from the first test environment's database record and the second test environment's database record for the interface request message. To show the user a clear difference comparison, sub-tables matching the response message and the database record respectively are derived from the response results, and the interface test comparison result of each case is recorded and displayed.
Specifically, as shown in table 1, the response messages from the two test environments are split into fields according to the message layout, and the outputs are compared. The response result in the first test environment serves as the reference round (the reference fields) and that in the second as the comparison round (the comparison fields). The response message comparison table contains items with the same field name and the same value (such as field 1), items with the same field name but different values (such as field 2), and items with different field names (such as fields 4 and 5); these are distinguished by different colors, and filtering can display only the differing parts. In particular, to exclude noise fields that need not be compared, a field can be marked as a first predetermined meaningless field, for example a timestamp field (the execution times in the two system environments); all fields except the first predetermined meaningless fields are compared.
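An illustrative sketch of this field-by-field comparison with a noise-field exclusion list (the data structures are assumptions, not the patent's):

```python
# Hypothetical sketch: compare two response messages field by field,
# skipping predetermined meaningless (noise) fields such as timestamps.

def compare_messages(reference, comparison, noise_fields=frozenset()):
    """Classify each field as same, different, or present on one side only."""
    result = {"same": [], "different": [],
              "only_reference": [], "only_comparison": []}
    for field in sorted(set(reference) | set(comparison)):
        if field in noise_fields:
            continue  # first predetermined meaningless field: not compared
        if field not in comparison:
            result["only_reference"].append(field)
        elif field not in reference:
            result["only_comparison"].append(field)
        elif reference[field] == comparison[field]:
            result["same"].append(field)
        else:
            result["different"].append(field)
    return result

ref = {"field1": "OK", "field2": "100", "field4": "x", "timestamp": "10:00"}
cmp_ = {"field1": "OK", "field2": "101", "field5": "y", "timestamp": "10:05"}
print(compare_messages(ref, cmp_, noise_fields={"timestamp"}))
# {'same': ['field1'], 'different': ['field2'],
#  'only_reference': ['field4'], 'only_comparison': ['field5']}
```

The four buckets correspond to the colored item categories in the comparison table, and the differing timestamps produce no spurious difference.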
TABLE 1
As shown in table 2, the database records from the two test environments are split into fields according to the execution actions, and the outputs are compared. Notably, one interface request message may trigger multiple execution actions in the database; in the above technical scheme, for example, responding to the transfer request requires both deducting the account balance and changing the account's available amount, so two actions are executed against the database. The response result in the first test environment serves as the reference round (the reference fields) and that in the second as the comparison round (the comparison fields). The database record comparison table contains items with the same field name and the same value (such as fields 1 and 4), items with the same field name but different values (such as field 2), and items with different field names (such as field 5); these are distinguished by different colors, and filtering can display only the differing parts. As with the response messages, noise fields that need not be compared can be marked as second predetermined meaningless fields, for example a timestamp field (the execution times in the two system environments); all other fields are compared.
TABLE 2
Optionally, in the embodiments of the invention, obtaining the interface test comparison result further comprises: obtaining a response result comparison table from the response message comparison table and the database record comparison table, where the response result comparison table contains a response message comparison summary and/or a database record comparison summary. As shown in table 3, one case corresponds to one interface request message. If the assertion check of the first test environment passes, the reference round execution result is marked successful; if that of the second test environment passes, the comparison round execution result is marked successful. If, for one case, the response message comparison results are completely consistent across the two environments and the database record comparison results are also completely consistent, the comparison result is marked consistent; otherwise it is marked inconsistent. The response message comparison summary lists the inconsistent content items in the response message comparison table, and the database record comparison summary lists those in the database record comparison table. In particular, the summaries may display all inconsistent items, or only a preset number of them, for example only the first 3 difference comparison results.
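A sketch of how such a summary row could be aggregated (field names are hypothetical; the cap of 3 items follows the example in the text):

```python
# Hypothetical sketch: aggregate one case's results into a summary row,
# truncating each difference summary to a preset number of items (here 3).

def summarize(ref_assert_ok, cmp_assert_ok, msg_diffs, db_diffs, max_items=3):
    """Build the per-case row of the response result comparison table."""
    consistent = not msg_diffs and not db_diffs
    return {
        "reference_round": "success" if ref_assert_ok else "failure",
        "comparison_round": "success" if cmp_assert_ok else "failure",
        "result": "consistent" if consistent else "inconsistent",
        "message_summary": msg_diffs[:max_items],
        "db_summary": db_diffs[:max_items],
    }

row = summarize(True, True, ["field2"], [])
print(row["result"])  # inconsistent
print(summarize(True, True, [], [])["result"])  # consistent
```

A case is marked consistent only when both the message comparison and the database record comparison found no differences, matching the rule stated above.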
TABLE 3
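The assembly of one row of the response result comparison table described above can be sketched as follows. The function and parameter names (for example `max_items`) are illustrative assumptions and not taken from the patent.

```python
# Hypothetical sketch: build one row of the response result comparison table
# from the lists of inconsistent items found in the two comparison tables.

def summarize_case(msg_diffs, db_diffs, ref_passed, cmp_passed, max_items=3):
    """msg_diffs / db_diffs: inconsistent content items from the response
    message comparison table and the database record comparison table."""
    return {
        "reference_round": "success" if ref_passed else "failure",
        "comparison_round": "success" if cmp_passed else "failure",
        # consistent only when BOTH comparisons found no differences
        "comparison_result": "consistent" if not msg_diffs and not db_diffs else "inconsistent",
        # summaries show at most the first max_items differences
        "message_summary": msg_diffs[:max_items],
        "db_summary": db_diffs[:max_items],
    }

row = summarize_case(["field2: 100 vs 200"], [], ref_passed=True, cmp_passed=True)
print(row["comparison_result"])   # inconsistent
```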
According to the technical solution provided by this embodiment of the present invention, an interface request message is constructed from the baseline data of the first test environment and an existing interface automation test case; the interface request message is then sent to the first test environment and the second test environment respectively, and the response results of the interface request message in the two test environments are obtained; finally, the interface test comparison result of the first test environment and the second test environment is obtained according to the response results. This realizes automated interface testing and comparison of interface test results, saves the time cost of writing test scripts and the labor cost of manual inspection, and improves the efficiency of comparing interface test results; meanwhile, refined comparison of interface test results is achieved based on comparison of both the response messages and the database records.
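The overall flow summarized above can be illustrated end to end with toy in-memory environments. Everything here (the `FakeEnv` class, the transfer handling, the single hard-coded account) is an illustrative assumption, not the patented implementation; a discrepancy is injected via a different fee so the database comparison reports a difference.

```python
# End-to-end sketch of the flow: send one request to two environments,
# then compare both the response messages and the database records.

class FakeEnv:
    """Toy stand-in for a test environment: one account, one transfer action."""
    def __init__(self, fee):
        self.accounts = {"A001": {"balance": 1000, "available": 1000}}
        self.fee = fee

    def handle(self, request):
        acct = self.accounts[request["account"]]
        acct["balance"] -= request["amount"]                 # action 1: deduct balance
        acct["available"] -= request["amount"] + self.fee    # action 2: change available amount
        return {"code": "00", "account": request["account"]}

def run_case(request, env_a, env_b):
    resp_a, resp_b = env_a.handle(dict(request)), env_b.handle(dict(request))
    msg_diff = {k for k in resp_a if resp_a[k] != resp_b.get(k)}
    db_diff = {k for k in env_a.accounts["A001"]
               if env_a.accounts["A001"][k] != env_b.accounts["A001"][k]}
    return {"message_diff": msg_diff, "db_diff": db_diff}

result = run_case({"account": "A001", "amount": 100},
                  FakeEnv(fee=0), FakeEnv(fee=1))
print(result)   # {'message_diff': set(), 'db_diff': {'available'}}
```

Even though the two response messages match, the database record comparison surfaces the divergent `available` field, which is exactly the kind of refined difference the described scheme is meant to catch.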
Example two
Fig. 2 is a block diagram of a structure of an interface test result comparison apparatus according to a second embodiment of the present invention, where the apparatus specifically includes:
an interface request message obtaining module 210, configured to obtain data to be tested from the baseline data of the first test environment according to the pre-screening condition of each interface automation test case in the interface automation test case library, and to construct an interface request message according to the data to be tested and the interface automation test case;
a response result obtaining module 220, configured to send the interface request message to a first test environment and a second test environment respectively, and to obtain the response results of the interface request message in the first test environment and the second test environment, where a response result includes a response message and a database record;
a comparison executing module 230, configured to obtain an interface test comparison result of the first test environment and the second test environment according to the response results of the interface request message in the first test environment and the second test environment.
According to the technical solution provided by this embodiment of the present invention, an interface request message is constructed from the baseline data of the first test environment and an existing interface automation test case; the interface request message is then sent to the first test environment and the second test environment respectively, and the response results of the interface request message in the two test environments are obtained; finally, the interface test comparison result of the first test environment and the second test environment is obtained according to the response results. This realizes automated interface testing and comparison of interface test results, saves the time cost of writing test scripts and the labor cost of manual inspection, and improves the efficiency of comparing interface test results; meanwhile, refined comparison of interface test results is achieved based on comparison of both the response messages and the database records.
Optionally, on the basis of the above technical solution, the pre-screening condition includes an account property, an account type, an account balance, and/or a contract status.
Optionally, on the basis of the above technical solution, the interface request message obtaining module 210 is further specifically configured to perform an inverse parameterization operation on the interface automation test case according to the data to be tested, so as to construct an interface request message.
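The inverse parameterization step described above can be sketched as a simple template substitution: the placeholders in the stored test case are replaced with the screened data to be tested. The `${...}` placeholder syntax is an assumption, since the patent does not specify a concrete format.

```python
import re

# Sketch of inverse parameterization: substitute concrete test data back
# into the parameterized interface automation test case template.

def deparameterize(template, data):
    """Replace ${name} placeholders in the case template with concrete values."""
    return re.sub(r"\$\{(\w+)\}", lambda m: str(data[m.group(1)]), template)

case_template = '{"txn":"transfer","from":"${account_no}","amount":${amount}}'
message = deparameterize(case_template, {"account_no": "6222020001", "amount": 100})
print(message)   # {"txn":"transfer","from":"6222020001","amount":100}
```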
Optionally, on the basis of the foregoing technical solution, the response result obtaining module 220 specifically includes:
the first assertion result judging unit is used for sending the interface request message to a first test environment, acquiring a response result of the first test environment for the interface request message, and judging whether the assertion result of the first test environment is correct or not according to the response result of the first test environment;
and the response result acquisition unit is used for sending the interface request message to a second test environment and acquiring a response result of the second test environment for the interface request message if the assertion result of the first test environment is judged to be correct.
Optionally, on the basis of the above technical solution, the device for comparing interface test results further includes:
and the first alarm prompt sending module is used for sending a first alarm prompt if the assertion result of the first test environment is judged to be wrong.
Optionally, on the basis of the foregoing technical solution, the response result obtaining module 220 further includes:
the second assertion result judging unit is used for judging whether the assertion result of the second test environment is correct or not according to the response result of the second test environment;
Optionally, on the basis of the above technical solution, the comparison executing module is specifically configured to, if the assertion result of the second test environment is determined to be correct, obtain an interface test comparison result of the first test environment and the second test environment according to the response results of the interface request message in the first test environment and the second test environment.
Optionally, on the basis of the above technical solution, the device for comparing interface test results further includes:
the second alarm prompt sending module is used for sending a second alarm prompt if the assertion result of the second test environment is judged to be wrong; wherein the alarm content of the second alarm prompt is different from the alarm content of the first alarm prompt.
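The gated two-environment execution with distinct first and second alarm prompts, described by the modules above, can be sketched as follows. `AlarmError`, the callable environments, and the assertion predicate are illustrative assumptions.

```python
# Sketch: the second environment is exercised only after the first
# environment's assertion passes; each environment failure raises a
# distinct alarm message.

class AlarmError(Exception):
    pass

def gated_run(request, send_first, send_second, assertion):
    resp_first = send_first(request)
    if not assertion(resp_first):
        raise AlarmError("first alarm: assertion failed in first test environment")
    resp_second = send_second(request)
    if not assertion(resp_second):
        raise AlarmError("second alarm: assertion failed in second test environment")
    return resp_first, resp_second

ok = lambda resp: resp.get("code") == "00"
r1, r2 = gated_run({"txn": "transfer"},
                   lambda req: {"code": "00", "env": "first"},
                   lambda req: {"code": "00", "env": "second"},
                   ok)
print(r1["env"], r2["env"])   # first second
```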
Optionally, on the basis of the foregoing technical solution, the response result obtaining module 220 specifically includes:
and the matching execution unit is used for determining the matching database instance according to the environment identifier and/or the database name.
Optionally, on the basis of the foregoing technical solution, the comparison executing module 230 includes:
a response message comparison table obtaining unit, configured to obtain a response message comparison table according to a response message of the first test environment for the interface request message and a response message of the second test environment for the interface request message;
and the database record comparison table acquiring unit is used for acquiring a database record comparison table according to the database record of the first test environment aiming at the interface request message and the database record of the second test environment aiming at the interface request message.
Optionally, on the basis of the above technical solution, the response message comparison table includes a first predetermined meaningless field; and/or the database record comparison table includes a second predetermined meaningless field.
Optionally, on the basis of the above technical solution, the response message comparison table includes entries with the same field name and the same value, entries with the same field name but different values, and entries with different field names; and/or the database record comparison table includes entries with the same field name and the same value, entries with the same field name but different values, and entries with different field names.
Optionally, on the basis of the above technical solution, the comparison executing module 230 further includes:
a response result comparison table obtaining unit, configured to obtain a response result comparison table according to the response message comparison table and the database record comparison table, where the response result comparison table includes a response message comparison summary and/or a database record comparison summary.
The device can execute the interface test result comparison method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method. For technical details not described in detail in this embodiment, reference may be made to the method provided in any embodiment of the present invention.
EXAMPLE III
Fig. 3 is a schematic structural diagram of an electronic device according to a third embodiment of the present invention. Fig. 3 illustrates a block diagram of an exemplary device 12 suitable for use in implementing embodiments of the present invention. The device 12 shown in fig. 3 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present invention.
As shown in FIG. 3, device 12 is in the form of a general purpose computing device. The components of device 12 may include, but are not limited to: one or more processors or processing units 16, a memory 28, and a bus 18 that couples various system components including the memory 28 and the processing unit 16.
The memory 28 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 30 and/or cache memory 32. Device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 3, and commonly referred to as a "hard drive"). Although not shown in FIG. 3, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
The processing unit 16 executes programs stored in the memory 28 to perform various functional applications and data processing, for example to implement the interface test result comparison method provided by the embodiments of the present invention, namely: acquiring data to be tested from the baseline data of a first test environment according to the pre-screening condition of each interface automation test case in the interface automation test case library, and constructing an interface request message according to the data to be tested and the interface automation test case; sending the interface request message to the first test environment and a second test environment respectively, and acquiring the response results of the interface request message in the first test environment and the second test environment, where a response result includes a response message and a database record; and acquiring an interface test comparison result of the first test environment and the second test environment according to the response results of the interface request message in the first test environment and the second test environment.
Example four
The fourth embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the interface test result comparison method according to any embodiment of the present invention. The method includes:
acquiring data to be tested from the baseline data of a first test environment according to the pre-screening condition of each interface automation test case in the interface automation test case library, and constructing an interface request message according to the data to be tested and the interface automation test case;
sending the interface request message to the first test environment and a second test environment respectively, and acquiring the response results of the interface request message in the first test environment and the second test environment, where a response result includes a response message and a database record;
and acquiring an interface test comparison result of the first test environment and the second test environment according to the response results of the interface request message in the first test environment and the second test environment.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
Claims (15)
1. A method for comparing interface test results is characterized by comprising the following steps:
acquiring data to be tested from the baseline data of a first test environment according to the pre-screening condition of each interface automation test case in the interface automation test case library, and constructing an interface request message according to the data to be tested and the interface automation test case;
sending the interface request message to the first test environment and a second test environment respectively, and acquiring the response results of the interface request message in the first test environment and the second test environment, wherein a response result comprises a response message and a database record;
and acquiring an interface test comparison result of the first test environment and the second test environment according to the response results of the interface request message in the first test environment and the second test environment.
2. The method of claim 1, wherein the pre-screening criteria include account properties, account type, account balance, and/or contract status.
3. The method of claim 1, wherein constructing an interface request message according to the data to be tested and the interface automation test case comprises:
and performing inverse parameterization operation on the interface automation test case according to the data to be tested to construct an interface request message.
4. The method according to claim 1, wherein the sending the interface request packet to a first test environment and a second test environment respectively, and obtaining a response result of the interface request packet in the first test environment and the second test environment, comprises:
sending the interface request message to a first test environment, acquiring a response result of the first test environment for the interface request message, and judging whether an assertion result of the first test environment is correct or not according to the response result of the first test environment;
and if the assertion result of the first test environment is judged to be correct, sending the interface request message to a second test environment, and acquiring a response result of the second test environment for the interface request message.
5. The method of claim 4, after determining whether the assertion result of the first test environment is correct, further comprising:
and if the assertion result of the first test environment is judged to be wrong, sending a first alarm prompt.
6. The method according to claim 4 or 5, wherein after sending the interface request packet to a second test environment and obtaining a response result of the second test environment for the interface request packet, the method further comprises:
judging whether the assertion result of the second test environment is correct or not according to the response result of the second test environment;
the acquiring an interface test comparison result of the first test environment and the second test environment according to the response results of the interface request message comprises:
if the assertion result of the second test environment is judged to be correct, acquiring an interface test comparison result of the first test environment and the second test environment according to the response results of the interface request message in the first test environment and the second test environment.
7. The method of claim 6, after determining whether the assertion result of the second test environment is correct, further comprising:
if the assertion result of the second test environment is judged to be wrong, a second alarm prompt is sent out; wherein the alarm content of the second alarm prompt is different from the alarm content of the first alarm prompt.
8. The method according to claim 1, wherein the acquiring the response result of the interface request message in the first test environment and the second test environment comprises:
determining the matching database instance according to the environment identifier and/or the database name.
9. The method according to claim 1, wherein the obtaining, according to the first test environment and the second test environment, an interface test comparison result of the first test environment and the second test environment with respect to a response result of the interface request packet includes:
acquiring a response message comparison table according to the response message of the first test environment aiming at the interface request message and the response message of the second test environment aiming at the interface request message;
and acquiring a database record comparison table according to the database record of the first test environment aiming at the interface request message and the database record of the second test environment aiming at the interface request message.
10. The method of claim 9, wherein the response message comparison table comprises a first predetermined meaningless field;
and/or the database record comparison table comprises a second predetermined meaningless field.
11. The method of claim 9, wherein the response message comparison table comprises entries with the same field name and the same value, entries with the same field name but different values, and entries with different field names;
and/or the database record comparison table comprises entries with the same field name and the same value, entries with the same field name but different values, and entries with different field names.
12. The method according to claim 9, wherein the obtaining, according to the first test environment and the second test environment, an interface test comparison result of the first test environment and the second test environment with respect to a response result of the interface request packet includes:
obtaining a response result comparison table according to the response message comparison table and the database record comparison table; wherein the response result comparison table comprises a response message comparison summary and/or a database record comparison summary.
13. An interface test result comparison device, comprising:
the interface request message acquisition module is used for acquiring data to be tested from the baseline data of the first test environment according to the pre-screening condition of each interface automation test case in the interface automation test case library, and for constructing an interface request message according to the data to be tested and the interface automation test case;
the response result obtaining module is used for sending the interface request message to a first test environment and a second test environment respectively, and for acquiring the response results of the interface request message in the first test environment and the second test environment, wherein a response result comprises a response message and a database record;
and the comparison execution module is used for acquiring an interface test comparison result of the first test environment and the second test environment according to the response results of the interface request message in the first test environment and the second test environment.
14. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of comparing interface test results of any of claims 1-12.
15. A storage medium containing computer executable instructions which when executed by a computer processor are for performing the method of interface test result alignment of any one of claims 1-12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110518099.4A CN113238940B (en) | 2021-05-12 | 2021-05-12 | Interface test result comparison method, device, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110518099.4A CN113238940B (en) | 2021-05-12 | 2021-05-12 | Interface test result comparison method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113238940A true CN113238940A (en) | 2021-08-10 |
CN113238940B CN113238940B (en) | 2023-06-02 |
Family
ID=77133645
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110518099.4A Active CN113238940B (en) | 2021-05-12 | 2021-05-12 | Interface test result comparison method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113238940B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115774990A (en) * | 2023-02-10 | 2023-03-10 | 成都萌想科技有限责任公司 | RESTful API comparison method, system, equipment and storage medium based on configuration file |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101271423A (en) * | 2008-05-19 | 2008-09-24 | 中兴通讯股份有限公司 | Software interface test approach and system |
CN110096429A (en) * | 2019-03-18 | 2019-08-06 | 深圳壹账通智能科技有限公司 | Test report generation method, device, equipment and storage medium |
CN111274154A (en) * | 2020-02-19 | 2020-06-12 | 北京蜜莱坞网络科技有限公司 | Automatic testing method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN113238940B (en) | 2023-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108763076A (en) | A kind of Software Automatic Testing Method, device, equipment and medium | |
CN108683562B (en) | Anomaly detection positioning method and device, computer equipment and storage medium | |
CN106550038B (en) | Data configuration diagnosis system and method of digital control system | |
WO2018107812A1 (en) | Error detection method and apparatus for transaction system, storage medium and computer device | |
CN108111364B (en) | Service system testing method and device | |
CN113836014A (en) | Interface testing method and device, electronic equipment and storage medium | |
US10558557B2 (en) | Computer system testing | |
CN111159040A (en) | Test data generation method, device, equipment and storage medium | |
CN110688111A (en) | Configuration method, device, server and storage medium of business process | |
CN117632710A (en) | Method, device, equipment and storage medium for generating test code | |
CN114185791A (en) | Method, device and equipment for testing data mapping file and storage medium | |
CN113238940B (en) | Interface test result comparison method, device, equipment and storage medium | |
CN111858377A (en) | Quality evaluation method and device for test script, electronic device and storage medium | |
US11954014B2 (en) | Automated unit testing in a mainframe CICS environment | |
CN116340172A (en) | Data collection method and device based on test scene and test case detection method | |
CN113392024B (en) | Method, device, equipment and medium for testing storage process | |
US11816022B2 (en) | Snapshot simulation of service module responses | |
CN113419738A (en) | Interface document generation method and device and interface management equipment | |
CN113760696A (en) | Program problem positioning method and device, electronic equipment and storage medium | |
CN111767222A (en) | Data model verification method and device, electronic equipment and storage medium | |
CN115640236B (en) | Script quality detection method and computing device | |
CN116755684B (en) | OAS Schema generation method, device, equipment and medium | |
CN111881128B (en) | Big data regression verification method and big data regression verification device | |
CN118152293A (en) | Interface testing method, device, electronic equipment and computer readable storage medium | |
CN118193390A (en) | SQL sentence detection method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||