CN112612703A - Automatic testing method and device related to multi-system interactive interface - Google Patents

Automatic testing method and device related to multi-system interactive interface

Info

Publication number
CN112612703A
CN112612703A
Authority
CN
China
Prior art keywords
data
test
correct
unit
judging whether
Prior art date
Legal status
Pending
Application number
CN202011559860.0A
Other languages
Chinese (zh)
Inventor
王鹏
王乐
江洪
罗弯
楚五斌
Current Assignee
Aisino Corp
Original Assignee
Aisino Corp
Priority date
Filing date
Publication date
Application filed by Aisino Corp filed Critical Aisino Corp
Priority to CN202011559860.0A priority Critical patent/CN112612703A/en
Publication of CN112612703A publication Critical patent/CN112612703A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Abstract

The invention discloses an automated testing method and device for multi-system interactive interfaces. The method initializes the databases of all associated systems through an automated test program, then calls the local interface under test to perform service-logic processing; the test program compares the data-processing results of the associated systems with the service-logic results of the local interface under test, thereby completing verification and accurately locating problems. Because testing is organized around the whole service flow, the gaps between individual systems are bridged and the test quality of the full-link system is improved. The invention has broad applicability to full-flow testing of service-data circulation in large information systems; the scheme is easy to maintain, runs efficiently with low resource consumption, and fills the testing gap in data-flow circulation among multiple associated systems during full-flow testing.

Description

Automatic testing method and device related to multi-system interactive interface
Technical Field
The invention relates to the field of system-interface testing, and in particular to an automated testing method and device for multi-system interactive interfaces.
Background
With the rapid development of information-system construction, ensuring the correctness of service-data transfer between subsystems has long been a difficult testing problem for large information systems. In particular, when several development organizations are each responsible for a different subsystem, blind spots often arise when testing the circulation of service data across subsystem interfaces.
The invention aims to connect the interfaces and service-data stores of multiple subsystems: an automated test program first initializes each subsystem's database, then calls the local interface under test to perform service-logic processing; finally, the program compares the data-processing results of the subsystems with the service-logic results of the local interface under test, completing verification and accurately locating problems.
The invention is convenient for testers to use, has broad applicability to full-flow testing of service-data circulation in large information systems, is easy to maintain, runs efficiently with low resource consumption, and fills the testing gap in data-flow circulation among subsystems during full-flow testing.
Disclosure of Invention
To address the difficulty in the prior art, noted in the background above, of automating full-flow tests across multi-system interactive interfaces, the invention provides an automated testing method involving multi-system interactive interfaces, comprising the following steps:
initializing each associated system database, and inserting original test data into the first system;
the first system receives and processes the original-test-data request, and stores the processed original test data into the second system;
judging whether the processed original test data stored in the second system is correct; if not, returning second-system data-storage failure information and ending the test;
if yes, calling a local interface to send a service request to the second system and receiving a returned result;
judging, according to the returned result, whether the processing result of the local interface is correct; if not, returning interface data-processing failure information and ending the test;
if yes, judging whether the data-state update of the second system is correct; if not, returning second-system data-update failure information and ending the test;
if yes, returning verification-success information for the second system's data processing and the local interface's processing function, thereby completing the full-flow test.
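The branching step sequence above can be sketched as a short test driver. This is a hypothetical Python sketch, not the patent's implementation: every callable passed in is a stand-in the tester would supply for the concrete systems under test.

```python
# Hypothetical sketch of the claimed full-flow test; all five callables are
# stand-ins supplied by the tester, not the patented implementation itself.

def run_full_flow_test(seed_first_system, second_system_store_ok,
                       call_local_interface, interface_result_ok,
                       second_system_update_ok):
    seed_first_system()                   # init DBs, insert raw data into the first system
    if not second_system_store_ok():      # processed data landed in the second system?
        return "second-system data storage failed"
    result = call_local_interface()       # service request against the second system
    if not interface_result_ok(result):   # returned result correct?
        return "interface data processing failed"
    if not second_system_update_ok():     # second-system data-state update correct?
        return "second-system data update failed"
    return "verification succeeded"
```

Each early return corresponds to one of the failure messages in the claimed method, so a failing check both ends the test and identifies which link of the flow broke.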
Further, before initializing each associated system database, the method further includes: raw test data is prepared.
Further, the preparing the raw test data includes:
preparing original test data for the relevant table names and non-null fields of the first system's database.
Further, the associated system databases include Oracle and MySQL.
Further, initializing each associated system database and inserting original test data into the first system, judging whether the processed original test data stored in the second system is correct, calling the local interface to send a service request to the second system, judging whether the local interface's processing result is correct, and judging whether the second system's data-state update is correct are all completed by an automated test program.
An automated testing apparatus involving multi-system interaction interfaces, the apparatus comprising:
an initialization unit, one end of which is connected to the judgment-and-test unit; the initialization unit is used to initialize each associated system database and insert original test data into the first system;
a judgment-and-test unit, one end of which is connected to the output unit; the judgment-and-test unit is used to judge whether the original test data received and processed by the first system and then stored in the second system is correct; it is also used to call the local interface to send a service request to the second system and receive the returned result, to judge according to the returned result whether the local interface's processing result is correct, to judge whether the second system's data-state update is correct, and to send the judgment information to the output unit;
and an output unit, used to output the test result to the user according to the judgment information.
Further, the apparatus further comprises:
a data preparation unit, one end of which is connected to the initialization unit; the data preparation unit is used to prepare original test data for the relevant table names and non-null fields of the first system's database and send it to the initialization unit.
Further, the associated system databases include Oracle and MySQL.
Further, initializing each associated system database and inserting original test data into the first system, judging whether the processed original test data stored in the second system is correct, calling the local interface to send a service request to the second system, judging whether the local interface's processing result is correct, and judging whether the second system's data-state update is correct are all completed by an automated test program.
The beneficial effects of the invention are as follows: the technical scheme provides an automated testing method and device for multi-system interactive interfaces. The method initializes the databases of all associated systems through an automated test program, then calls the local interface under test to perform service-logic processing; the test program compares the data-processing results of the associated systems with the service-logic results of the local interface under test, thereby completing verification and accurately locating problems. Because testing is organized around the whole service flow, the gaps between individual systems are bridged and the test quality of the full-link system is improved. The invention has broad applicability to full-flow testing of service-data circulation in large information systems; the scheme is easy to maintain, runs efficiently with low resource consumption, and fills the testing gap in data-flow circulation among multiple associated systems during full-flow testing.
Drawings
A more complete understanding of exemplary embodiments of the present invention may be had by reference to the following drawings in which:
FIG. 1 is a flow chart of an automated testing method involving multiple system interaction interfaces according to an embodiment of the present invention;
fig. 2 is a structural diagram of an automatic testing apparatus relating to a multi-system interactive interface according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will now be described with reference to the accompanying drawings; however, the present invention may be embodied in many different forms and is not limited to the embodiments described herein, which are provided for a full and complete disclosure that fully conveys the scope of the invention to those skilled in the art. The terminology used in the exemplary embodiments illustrated in the accompanying drawings is not intended to limit the invention. In the drawings, the same units/elements are denoted by the same reference numerals.
Unless otherwise defined, terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Further, it will be understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense.
Fig. 1 is a flowchart of an automated testing method involving a multi-system interactive interface according to an embodiment of the present invention. As shown in fig. 1, the method includes:
step 110, preparing original test data; specifically, in order to reduce the test difficulty of the full-flow service data, the test is performed around the local interface in this example; starting with the data preparation of the upstream system (system) of the local interface, codes are written by using a python language, the preparation link of original test data is simplified, and a tester only needs to prepare the original data of table names and non-empty fields related to the database of the upstream system (system), and can insert various original test data of different test scenes into the database of the upstream system (system) of the local interface through a program, so that the process of preparing the original test data is realized.
Step 120: initialize each associated system database, and insert the original test data prepared in step 110 into the upstream system (the first system). Specifically, in the normal flow the upstream system's service data is generated within the upstream system itself; by creating data directly in the upstream system's database, the original test data can be supplied to the local interface.
Further, the associated system databases include Oracle and MySQL; initializing each associated system database and inserting the original test data into the upstream system (the first system) is completed by the automated test program.
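The patent names Oracle and MySQL; as a portable stand-in, the initialize-and-seed step can be sketched with Python's DB-API against SQLite from the standard library. The table name, schema, and helper are assumptions for illustration.

```python
import sqlite3

def init_and_seed(conn, table, rows):
    """Wipe the table (the 'initialize' step), then insert the raw test rows."""
    conn.execute(f"DELETE FROM {table}")
    for row in rows:
        cols = ", ".join(row)
        marks = ", ".join("?" for _ in row)
        conn.execute(f"INSERT INTO {table} ({cols}) VALUES ({marks})",
                     tuple(row.values()))
    conn.commit()

# in-memory stand-in for the upstream (first) system's database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t_invoice (invoice_no TEXT NOT NULL, amount TEXT NOT NULL)")
init_and_seed(conn, "t_invoice",
              [{"invoice_no": "INV001", "amount": "100"},
               {"invoice_no": "INV002", "amount": "200"}])
```

Because the helper uses only DB-API calls, pointing it at an Oracle or MySQL driver instead of SQLite mainly changes the connection line and the placeholder style.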
Step 130: the upstream system (the first system) receives and processes the original-test-data request, and stores the processed original test data into the downstream system (the second system).
Step 140: the automated test program judges whether the processed original test data stored in the downstream system (the second system) is correct; if not, it returns downstream-system data-storage failure information and the test ends; if yes, proceed to step 150.
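The storage check of step 140 amounts to comparing the rows actually persisted in the downstream database with the expected processed rows. A minimal sketch, again using SQLite as a stand-in; the table, schema, and ordering convention are assumptions.

```python
import sqlite3

def second_system_storage_correct(conn, table, expected_rows):
    """Compare persisted rows (as ordered tuples) with the expected result."""
    actual = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    return actual == expected_rows

# stand-in downstream (second) system database holding already-processed data
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t_processed (invoice_no TEXT, status TEXT)")
conn.executemany("INSERT INTO t_processed VALUES (?, ?)",
                 [("INV001", "STORED"), ("INV002", "STORED")])
ok = second_system_storage_correct(
    conn, "t_processed", [("INV001", "STORED"), ("INV002", "STORED")])
```

A mismatch here is what triggers the "downstream-system data storage failed" branch and ends the test early.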
Step 150: the automated test program calls the local interface to send a service request to the downstream system (the second system), and receives the returned result.
Step 160: judge, according to the returned result, whether the local interface's processing result is correct; if not, return interface data-processing failure information and end the test; if yes, proceed to step 170.
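Step 160's correctness check depends on the interface's response format, which the patent does not specify. Assuming a JSON body with a result code and a data payload (both the "0000" success code and the envelope fields are invented for illustration), the check might look like:

```python
import json

def interface_result_correct(response_body, expected_data):
    """Judge the local interface's returned result. The '0000' success code
    and the {'code': ..., 'data': ...} envelope are assumptions, not part of
    the patent's disclosure."""
    try:
        payload = json.loads(response_body)
    except json.JSONDecodeError:
        return False  # an unparseable body is treated as a failed check
    return payload.get("code") == "0000" and payload.get("data") == expected_data

# sample returned result from a hypothetical local interface
body = json.dumps({"code": "0000", "data": {"invoice_no": "INV001"}})
```

Failing this check corresponds to returning interface data-processing failure information and ending the test.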
Step 170: the automated test program judges whether the data-state update of the downstream system (the second system) is correct; if not, it returns downstream-system data-update failure information and the test ends; if yes, proceed to step 180.
Step 180: return verification-success information for the downstream system's (the second system's) data processing and the local interface's processing function, completing the full-flow test.
The method initializes each associated system's database through an automated test program, then calls the local interface under test to perform service-logic processing; the test program compares the data-processing results of the associated systems with the service-logic results of the local interface under test, thereby completing verification and accurately locating problems. Because testing is organized around the whole service flow, the gaps between individual systems are bridged and the test quality of the full-link system is improved. The method has broad applicability to full-flow testing of service-data circulation in large information systems, is easy to maintain, runs efficiently with low resource consumption, and fills the testing gap in data-flow circulation among multiple associated systems during full-flow testing.
The method has been explored and practiced on a value-added-tax comprehensive service platform; it is easy to use, configure, and rerun for regression, improves working efficiency, offers reference and guidance for full-flow service-data testing in other large information-system projects, and has high practical value.
Fig. 2 is a structural diagram of an automatic testing apparatus relating to a multi-system interactive interface according to an embodiment of the present invention. As shown in fig. 2, the apparatus includes:
a data preparation unit 110, wherein one end of the data preparation unit 110 is connected with the initialization unit 120; the data preparing unit 110 is configured to prepare original test data of a table name and a non-empty field associated with a database of the system, and send the original test data to the initializing unit 120.
Specifically, to reduce the difficulty of testing full-flow service data, testing in this example is organized around the local interface. Starting from data preparation for the upstream system (the first system) of the local interface, code is written in Python to simplify the preparation of original test data: a tester only needs to prepare original data for the table names and non-null fields involved in the upstream system's database, and the program can then insert original test data for various test scenarios into the database of the local interface's upstream system, completing the data-preparation process.
An initialization unit 120, one end of which is connected to the judgment-and-test unit 130; the initialization unit 120 is used to initialize each associated system database and insert original test data into the upstream system (the first system).
Further, the associated system databases include Oracle and MySQL; initializing each associated system database and inserting the original test data into the upstream system (the first system) is completed by the automated test program.
A judgment-and-test unit 130, one end of which is connected to the output unit 140; the judgment-and-test unit 130 is used to judge whether the original test data received and processed by the upstream system (the first system) and then stored in the downstream system (the second system) is correct; it is also used to call the local interface to send a service request to the downstream system (the second system) and receive the returned result, to judge according to the returned result whether the local interface's processing result is correct, to judge whether the data-state update of the downstream system (the second system) is correct, and to send the judgment information to the output unit 140.
Further, judging whether the processed original test data stored in the downstream system (the second system) is correct, calling the local interface to send a service request to the downstream system (the second system), judging whether the local interface's processing result is correct, and judging whether the data-state update of the downstream system (the second system) is correct are all completed by an automated test program.
An output unit 140, used to output the test result to the user according to the judgment information.
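The unit wiring described above can be sketched as plain classes. The class names translate the patent's unit names, but the exact call protocol between the units is an assumption of this sketch.

```python
class OutputUnit:
    """Collects judgment information and presents the test result to the user."""
    def __init__(self):
        self.results = []

    def output(self, verdict):
        self.results.append(verdict)


class JudgmentTestUnit:
    """Runs the ordered checks of the claimed flow and forwards the verdict
    to the connected output unit. The (name, callable) check protocol is a
    stand-in, not the patent's disclosed interface."""
    def __init__(self, output_unit, checks):
        self.output_unit = output_unit
        self.checks = checks  # ordered (name, callable) pairs

    def run(self):
        for name, check in self.checks:
            if not check():
                self.output_unit.output(f"{name} failed")
                return
        self.output_unit.output("all checks passed")


out = OutputUnit()
JudgmentTestUnit(out, [("storage", lambda: True),
                       ("interface", lambda: True),
                       ("state update", lambda: True)]).run()
```

The first failing check short-circuits the run, mirroring how each failure branch in the claimed method ends the test and names the failed link.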
The device initializes each associated system's database through an automated test program, then calls the local interface under test to perform service-logic processing; the test program compares the data-processing results of the associated systems with the service-logic results of the local interface under test, thereby completing verification and accurately locating problems. Because testing is organized around the whole service flow, the gaps between individual systems are bridged and the test quality of the full-link system is improved. The device has broad applicability to full-flow testing of service-data circulation in large information systems, is easy to maintain, runs efficiently with low resource consumption, and fills the testing gap in data-flow circulation among multiple associated systems during full-flow testing.
The device has been explored and practiced on a value-added-tax comprehensive service platform; it is easy to use, configure, and rerun for regression, improves working efficiency, offers reference and guidance for full-flow service-data testing in other large information-system projects, and has high practical value.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the disclosure may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Reference to step numbers in this specification is only for distinguishing between steps and is not intended to limit the temporal or logical relationship between steps, which includes all possible scenarios unless the context clearly dictates otherwise.
Moreover, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the disclosure and form different embodiments. For example, any of the embodiments claimed in the claims can be used in any combination.
Various component embodiments of the disclosure may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. The present disclosure may also be embodied as device or system programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present disclosure may be stored on a computer-readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the disclosure, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The disclosure may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several systems, several of these systems may be embodied by one and the same item of hardware.
The foregoing is directed to embodiments of the present disclosure, and it is noted that numerous improvements, modifications, and variations may be made by those skilled in the art without departing from the spirit of the disclosure, and that such improvements, modifications, and variations are considered to be within the scope of the present disclosure.

Claims (9)

1. An automated testing method involving multi-system interactive interfaces, the method comprising:
initializing each associated system database, and inserting original test data into the first system;
the first system receives and processes the original-test-data request, and stores the processed original test data into the second system;
judging whether the processed original test data stored in the second system is correct; if not, returning second-system data-storage failure information and ending the test;
if yes, calling a local interface to send a service request to the second system and receiving a returned result;
judging, according to the returned result, whether the processing result of the local interface is correct; if not, returning interface data-processing failure information and ending the test;
if yes, judging whether the data-state update of the second system is correct; if not, returning second-system data-update failure information and ending the test;
if yes, returning verification-success information for the second system's data processing and the local interface's processing function, thereby completing the full-flow test.
2. The method of claim 1, further comprising, prior to said initializing each associated system database: raw test data is prepared.
3. The method of claim 2, wherein the preparing raw test data comprises:
preparing original test data for the relevant table names and non-null fields of the first system's database.
4. The method of claim 1, wherein the associated system databases comprise Oracle and MySQL.
5. The method of claim 1, wherein initializing each associated system database and inserting original test data into the first system, judging whether the processed original test data stored in the second system is correct, calling the local interface to send a service request to the second system, judging whether the local interface's processing result is correct, and judging whether the second system's data-state update is correct are all completed by an automated test program.
6. An automated testing apparatus relating to multi-system interactive interfaces, the apparatus comprising:
an initialization unit, one end of which is connected to the judgment-and-test unit; the initialization unit is used to initialize each associated system database and insert original test data into the first system;
a judgment-and-test unit, one end of which is connected to the output unit; the judgment-and-test unit is used to judge whether the original test data received and processed by the first system and then stored in the second system is correct; it is also used to call the local interface to send a service request to the second system and receive the returned result, to judge according to the returned result whether the local interface's processing result is correct, to judge whether the second system's data-state update is correct, and to send the judgment information to the output unit;
and an output unit, used to output the test result to the user according to the judgment information.
7. The apparatus of claim 6, further comprising:
a data preparation unit, one end of which is connected to the initialization unit; the data preparation unit is used to prepare original test data for the relevant table names and non-null fields of the first system's database and send it to the initialization unit.
8. The apparatus of claim 6, wherein the associated system databases comprise Oracle and MySQL.
9. The apparatus of claim 6, wherein initializing each associated system database and inserting original test data into the first system, judging whether the processed original test data stored in the second system is correct, calling the local interface to send a service request to the second system, judging whether the local interface's processing result is correct, and judging whether the second system's data-state update is correct are all completed by an automated test program.
CN202011559860.0A 2020-12-25 2020-12-25 Automatic testing method and device related to multi-system interactive interface Pending CN112612703A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011559860.0A CN112612703A (en) 2020-12-25 2020-12-25 Automatic testing method and device related to multi-system interactive interface


Publications (1)

Publication Number Publication Date
CN112612703A true CN112612703A (en) 2021-04-06

Family

ID=75245600




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination