CN108540351B - Automatic testing method for distributed big data service - Google Patents

Automatic testing method for distributed big data service

Info

Publication number
CN108540351B
CN108540351B
Authority
CN
China
Prior art keywords
interface
data
test
tag
service
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810362027.3A
Other languages
Chinese (zh)
Other versions
CN108540351A (en
Inventor
马春燕
李尚儒
王慧朝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201810362027.3A priority Critical patent/CN108540351B/en
Publication of CN108540351A publication Critical patent/CN108540351A/en
Application granted granted Critical
Publication of CN108540351B publication Critical patent/CN108540351B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L43/00 Arrangements for monitoring or testing data switching networks
    • H04L43/50 Testing arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/02 Standardisation; Integration
    • H04L41/0246 Exchanging or transporting network management information using the Internet; Embedding network management web servers in network elements; Web-services-based protocols
    • H04L41/0266 Exchanging or transporting network management information using the Internet; Embedding network management web servers in network elements; Web-services-based protocols using meta-data, objects or commands for formatting management information, e.g. using eXtensible markup language [XML]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/34 Network arrangements or protocols for supporting network services or applications involving the movement of software or configuration parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention provides an automated testing method for distributed big data services. It uses XML Schema to describe the test interface, provides a corresponding specification, and can effectively describe the APIs of a web application or cloud platform. From the XML description of the APIs, the invention automatically generates test sequences, test scripts, and test data. A user therefore only needs to describe the APIs of the web application or cloud platform to automatically construct test sequences and generate test data, achieving the goal of automated testing: the user's workload is greatly reduced, test efficiency and accuracy are improved, and test cost is lowered. In addition, testing that is difficult to complete manually, such as stress testing, can be performed on the web application or cloud platform.

Description

Automatic testing method for distributed big data service
Technical Field
The invention relates to the technical field of automatic testing of distributed big data services, in particular to an automatic testing method of distributed big data services.
Background
The internet has grown at an astonishing rate, and web applications have penetrated many aspects of daily life. Currently, cloud computing platforms (e.g., Amazon, Azure, Oracle, and Huawei) provide relatively stable application programming interfaces (APIs) for various distributed services, including infrastructure services, storage services, and data services.
Cloud computing attracts more and more developers to migrate their web applications to cloud platforms. Under the cloud model, web application development differs from that of traditional small and medium standalone applications: web applications are becoming cross-platform distributed applications with large-scale data storage, large server backends, and multiple user terminals (PC, Android, iOS, WeChat, and so on). Stable application programming interfaces (APIs) are therefore gradually provided at the web application control layer to accommodate the different terminal platforms and to make the control layer and backend services cross-platform.
Cloud computing platforms and web application control layers build their applications and services around data storage and access, so most of the APIs they provide to clients are centered on data services; these are called data service APIs. For example, in a web travel information system, the product service of the control layer provides APIs for adding, modifying, finding, and deleting products; in a cloud computing platform offering platform- and application-level services, a web conferencing service provides APIs for adding, modifying, finding, and deleting conferences. In general, the data service APIs for each data entity can be divided into 4 types: add data, find data, modify data, and delete data.
With the emergence and wide application of data service APIs in cloud computing platforms and web application control layers, a great deal of routine, repeated, and distributed concurrent testing must be carried out on schedule. For example, during the integration, maintenance, and evolutionary upgrading of cloud platforms and web applications, the data service APIs themselves remain unchanged, yet their tests must be run repeatedly. Because the number of data service APIs is large, the application systems are huge, the terminal systems are diverse, and service deployments are updated and iterated frequently, manual testing of cloud platforms and web applications is burdensome, inefficient, and costly, and may not even achieve the expected goals of distributed testing; automated testing of distributed big data service APIs has therefore become increasingly important. Experts, scholars, and engineers at home and abroad have achieved some results in software test automation, such as automatic execution of test scripts and test data generation for a single API of a single service, but the following challenges remain:
1. Test data sequences and test predictions (expected outputs) for the APIs of single or multiple service applications cannot be generated automatically;
2. Test scripts cannot be generated automatically.
Disclosure of Invention
Because the API interfaces of data services are stable and can be divided into the 4 operation types of adding, modifying, finding, and deleting, and because the business logic of data services is simple and highly repetitive while the test volume is large, distributed data services are well suited to automated testing. Targeting these characteristics of distributed big data service APIs, the invention provides an automated testing method for distributed big data services, which comprises generating test interface description specification instances and then generating test sequences, test data, test statements, and test scripts based on those instances.
With the invention, a user can automatically generate test data sequences and test predictions for a single service API, as well as for multiple associated service APIs. The test data and test predictions can be reused many times during the integration and evolutionary upgrading of cloud platforms and web applications. From the API test sequences and data, the invention also provides a method to automatically generate test scripts, according to which distributed concurrent testing of the data services can be performed. The method can be applied in the development of cloud computing platforms, web systems, and other data service systems to test data services automatically with little manual intervention, continuously improving the quality control of these systems during integration and evolution.
The technical scheme of the invention is as follows:
the automatic testing method of the distributed big data service is characterized in that: the method comprises the following steps:
step 1: establishing an instance based on the XML Schema test interface description specification; the test interface description specification describes interface information through XML tags, comprising a service information tag, operation tags, an interface tag, a request tag, a response tag, a dependency tag, data tags, and a constraint type tag of the data tags; the specification supports 7 data types: the 3 data types supported by XML Schema (built-in, simple, and complex data types) plus 4 additional data types (file, image, audio, and video data types);
step 2: taking the example document based on the description specification of the XML Schema test interface obtained in the step 1 as an input, respectively analyzing each document, and storing information in the description document in the following variables in the analyzing process:
variable OperaterTypeMap: the type is a key-value set, the key value is the identification id of the service instance, and the value is an operation tag contained in the service instance; this variable is used to store all operation types of the service;
variables AddInterfaceSetMap, DeleteInterfaceSetMap, UpdateInterfaceSetMap, FindInterfaceSetMap: the type is a key-value set, the key value is the identification id of the service instance, and the value is an interface set; the elements in the interface set are: an interface identifier id, a parameter list of request data and response data; each parameter in the parameter list comprises two attributes of a parameter type and a parameter name; the variable is used for storing specific operation instances in various operation types;
the variable ParameterConstrainsMap: the type is a key-value set, the key value is an identification id of the service instance, and the value is a constraint set of the interface; the elements in the constraint set are: interface identification id, parameter constraint list; the parameter constraint list comprises parameter names and constraint conditions; the variable is used for storing the constraint of the parameter of each specific operation;
variables Premises: the type is a set, and the element value in the set is the service identification id of the precursor service of the current service instance;
step 3: automatically generating an operation test mode:
taking the OperaterTypeMap obtained in step 2 as input, the operations contained in the OperaterTypeMap variable are combined according to the identification id of the service instance and the values in the OperaterTypeMap to obtain a regular expression set that conforms to users' usage habits, used as the operation test mode corresponding to the service instance;
step 4: automatically generating an API test sequence and a test script based on the operation test mode:
generating an actual operation sequence according to the operation test mode obtained in the step 3; deleting the operation sequences which do not meet the requirements according to the constraint rules; the constraint rule is as follows:
in an operation sequence, before each update the number of add operations should be larger than the number of delete operations; otherwise the operation sequence is deleted; in an operation sequence, the total number of add operations should be larger than the number of delete operations; otherwise the operation sequence is deleted;
after the operation sequence is determined, arranging API interface sets under all operation labels in the operation sequence according to the operation sequence to obtain an API test sequence;
writing each API test sequence into a test script respectively;
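The constraint-based filtering of operation sequences can be illustrated with a small sketch (Python; the bounded expansion of the repeated part of the mode and the concrete repetition count are illustrative assumptions, not part of the specification):

```python
from itertools import product

def generate_sequences(middle_ops, n):
    """Expand a mode of the form add(...)*find, with the repeated part
    instantiated exactly n times (a simplification for illustration)."""
    for mid in product(middle_ops, repeat=n):
        yield ("add",) + mid + ("find",)

def satisfies_constraints(seq):
    """The two deletion rules: before every update, adds so far must
    outnumber deletes; over the whole sequence, adds must outnumber deletes."""
    adds = deletes = 0
    for op in seq:
        if op == "update" and adds <= deletes:
            return False
        if op == "add":
            adds += 1
        elif op == "delete":
            deletes += 1
    return adds > deletes

# Keep only the operation sequences that satisfy both constraint rules.
valid = [s for s in generate_sequences(("add", "update", "delete", "find"), 2)
         if satisfies_constraints(s)]
```

A sequence such as add, delete, delete, find is discarded because its deletes outnumber its adds, while add, add, find, find survives.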
step 5: automatically generating test data:
taking the test script obtained in step 4 and the variable ParameterConstrainsMap obtained in step 2 as input, test data for each interface is output; the specific steps are as follows:
step 5.1: generating a static table, and generating columns 1 and 2 of the dynamic table according to the test script obtained in the step 4;
the static table enumerates, for each operation type, the operation types on which it depends; the operation types on which find depends are add, update, and delete; the operation types on which update depends are add and find; the operation types on which delete depends are add, find, and update;
the dynamic table comprises 5 columns, wherein the 1 st column represents the sequential position of the interface in the script, the 2 nd column represents the type of the interface, the 3 rd column represents the object identification and the related object attribute in the test input data, the 4 th column represents the output of the interface, the 5 th column represents the test prediction of the interface, and the expected output of the interface is recorded;
step 5.2: and iterating the dynamic table according to the interface sequence in the test script, wherein the iteration rule is as follows:
step a: starting from the first interface, generate the data in the dynamic table interface by interface until all data are generated; judge the current interface I_i: if its operation type is add, go to step b, otherwise go to step c;
step b: for the current interface I_i, generate test data according to the black-box testing principle, generate a unique key for each data set and fill it into column 3 of the dynamic table, fill the interface identifier into column 4, and leave column 5 empty; then return to step a to generate the data of the next interface;
step c: if the operation type of the current interface I_i is not add, the input data of the interface depends on the input data of a previous interface I_j, j < i; the input-data dependency rule is:
following the principle of proximity, the interface I_j nearest to I_i is considered first; by consulting the static table, judge whether I_j is an interface on which I_i can depend; if not, traverse forwards until a dependable interface is found; if so, compare the input parameters of I_i with the input and return parameters of I_j; if I_j can satisfy the input-parameter requirements of I_i, establish a data dependency between I_j and I_i, fill the data of I_j into column 3 of I_i's row in the dynamic table, fill the actual output of I_i into column 4, and fill the expected result into column 5; then return to step a to generate the data of the next interface; if the input or return parameters of I_j do not satisfy the condition, continue traversing forwards until the next dependable interface is found;
furthermore, if the operation type of the current interface I_i is delete and its input data identifier is id1, the data of all interfaces before I_i whose data identifier is id1 must be set to empty; if the operation type of the current interface I_i is update and its input data identifier is id1, the corresponding data of all interfaces before I_i whose input data identifier is id1 must be changed to the latest data;
step d: finally, test data in the JSON format is generated.
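The iteration over the dynamic table in steps a to d can be sketched as follows (a simplified Python illustration: black-box data generation is stubbed with synthetic unique keys, parameter matching is reduced to an operation-type lookup in the static table, and the delete/update bookkeeping described above is omitted):

```python
# Static table from step 5.1: for each non-add operation type,
# the operation types on which it may depend.
STATIC_TABLE = {
    "find":   {"add", "update", "delete"},
    "update": {"add", "find"},
    "delete": {"add", "find", "update"},
}

def fill_dynamic_table(script):
    """script: list of (operation_type, interface_id) pairs in test order.
    Returns the dynamic table as a list of rows with the 5 columns of
    step 5.1 (position, type, input, output, expected)."""
    table = []
    for i, (op, iface) in enumerate(script):
        row = {"pos": i, "type": op, "input": None, "output": None, "expected": None}
        if op == "add":
            # Step b: black-box data with a unique key; column 5 stays empty.
            row["input"] = f"key-{i}"
            row["output"] = iface
        else:
            # Step c: nearest-first search for a dependable predecessor.
            for j in range(i - 1, -1, -1):
                if table[j]["type"] in STATIC_TABLE[op]:
                    row["input"] = table[j]["input"]      # column 3
                    row["output"] = iface                 # column 4 (placeholder)
                    row["expected"] = table[j]["input"]   # column 5
                    break
        table.append(row)
    return table

table = fill_dynamic_table([("add", "I1"), ("find", "I2"), ("delete", "I3")])
```

Here the find and delete rows inherit the unique key generated for the preceding add, mirroring the proximity-based dependency rule.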
Further, in a preferred embodiment, the automated testing method for distributed big data services is characterized in that:
the XML tag in the step 1 is specifically divided into:
service information tag < resources >: describing the content as service information; the attributes comprise a service identifier id, a service name, a domain name base and a precursor service identifier premise; the priority of the service information label is 0, and the service information label is the root of all the labels;
operation tags, including <add>, <delete>, <update>, <find>: these describe the add, delete, update, and find operations; the priority is 1; behavior of the <add> operation: adding a number of parameters to the service; behavior of the <delete> operation: deleting all parameters added by the corresponding <add> operation; behavior of the <update> operation: modifying some or all of the parameters of the corresponding <add>; behavior of the <find> operation: querying the information of one or more parameters in the stored service;
interface tag < resource >: information describing the content as an interface; the attribute comprises an interface identifier id, an interface name and a requested actual address path; the priority is 2;
request tag < request > and response tag < response >: describing the user's request and response; the attribute comprises a data type dataType; the priority is 3;
dependency tag < dependency >: describing the operational interface on which the service depends; the attribute comprises id of service corresponding to the parameter name resources and id of interface corresponding to the resource;
data tag < param >: information describing the request parameters; the attribute comprises a parameter name, a corresponding attribute name, a parameter type and whether the parameter is required or not; the priority is 4.
Data tag < data >: describing data from the response message; the attribute comprises name data name and type data type; the priority is 4;
data tag <element>: describing a container-type element; the attributes comprise the data name and the data type; the priority is 5; it appears under the <param> and <data> tags or on its own;
constraint type tag of data tag < restriction >: describing a constraint condition; under the < param >, < data >, < element > tags.
Further, in a preferred embodiment, the automated testing method for distributed big data services is characterized in that:
the test script in step 4 includes 4 tags and the following attribute values:
the < script > tag contains an attribute resourcesID, and the attribute value is the value of the corresponding service identification id; < script > tag priority is 0;
the < step > tag comprises a path attribute and an operation attribute, wherein the path attribute is a url path of a corresponding interface, and the operation attribute is a corresponding interface operation type; < step > tag priority 1;
the < param > tag comprises three attributes of name, attribute and value, wherein the attribute name records the name of the parameter, the attribute records the corresponding attribute of the parameter in the class to which the attribute belongs, and the attribute value records the corresponding expected result; < param > tag priority 2;
< response > tag, no attribute, priority 1.
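A test script following the tag layout above could be assembled, for instance, with Python's standard ElementTree (the concrete step, path, and parameter values here are illustrative assumptions, not taken from the patent):

```python
import xml.etree.ElementTree as ET

def build_test_script(resources_id, steps):
    """steps: list of (path, operation, params), where params is a list of
    (name, attribute, value) triples. Tag and attribute names follow the
    <script>/<step>/<param>/<response> layout described in the text."""
    script = ET.Element("script", resourcesID=resources_id)
    for path, operation, params in steps:
        step = ET.SubElement(script, "step", path=path, operation=operation)
        for name, attribute, value in params:
            ET.SubElement(step, "param", name=name, attribute=attribute, value=value)
        ET.SubElement(step, "response")   # attribute-free response tag
    return ET.tostring(script, encoding="unicode")

xml_text = build_test_script(
    "Product_Service",
    [("addProduct", "add", [("feature", "product.feature", "42")])])
```

The resulting string nests one <step> per interface call under the <script> root, with the expected result carried in the param value attribute as the text describes.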
Further, in a preferred embodiment, the automated testing method for distributed big data services is characterized in that: in step 3, the OperaterTypeMap obtained in step 2 is used as input, and the operations contained in the OperaterTypeMap variable are combined according to the identification id of the service instance and the value in the OperaterTypeMap by combining 4 operation combination conditions to obtain a regular expression set which accords with the use habit of a user and is used as an operation test mode corresponding to the service instance;
the 4 operation combination cases are as follows: the service only has two operations of add and find; the service has three operations of add, delete and find; the service has three operations of add, update and find; the service has four operations of add, update, delete and find;
the test mode of operation for each combination of operations is:
when the service only has the two operations add and find, the operation mode is add(add|find)*find;
when the service has the three operations add, delete and find, the operation modes are add(add|delete|find)*find and add(add|find)*find;
when the service has the three operations add, update and find, the operation modes are add(add|update|find)*find and add(add|find)*find;
when the service has the four operations add, update, delete and find, the operation modes are add(add|update|delete|find)*find, add(add|update|find)*find and add(add|find)*find.
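The mapping from a service's operation-type set to its mode set can be captured in a small sketch (Python; this assumes, as the description argues, that every meaningful service has at least the add and find operations):

```python
def operation_test_modes(ops):
    """Return the operation test modes (regular expressions) for a
    service's set of operation types, following the four cases above."""
    ops = set(ops)
    modes = []
    if {"update", "delete"} <= ops:
        modes.append("add(add|update|delete|find)*find")
    if "update" in ops:
        modes.append("add(add|update|find)*find")
    elif "delete" in ops:
        modes.append("add(add|delete|find)*find")
    modes.append("add(add|find)*find")   # present in every case
    return modes
```

For example, a service with only add and find yields the single mode add(add|find)*find, while a service with all four operations yields the three modes listed in the fourth case.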
Further, in a preferred embodiment, the automated testing method for distributed big data services is characterized in that: when the actual operation sequence is generated from the operation test mode in step 4, the repetition count of the starred sub-expression in the operation test mode is determined according to the user's requirement; if the user has no requirement, the count is (the number of service interfaces contained in the add operation + the number of service interfaces contained in the update operation + the number of service interfaces contained in the delete operation) × 2 − 2.
Further, in a preferred embodiment, the automated testing method for distributed big data services is characterized in that: the API test sequence generation rule in the step 4 is as follows: taking the Cartesian product of the API interface set corresponding to each operation label in the operation sequence as a set of the API test sequence, and sequencing the APIs in the API test sequence according to corresponding positions in the operation sequence; the operating API interface set is derived from the variables AddInterfaceSetMap, DeleteInterfaceSetMap, UpdateInterfaceSetMap, FindInterfaceSetMap in step 2.
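The Cartesian-product rule for turning an operation sequence into API test sequences can be sketched as follows (Python; the interface names reuse the illustrative travel-system example from the description):

```python
from itertools import product

def api_test_sequences(op_sequence, interface_sets):
    """op_sequence: operation labels in order, e.g. ["add", "find"];
    interface_sets: operation type -> set of interface ids (as held in
    AddInterfaceSetMap, FindInterfaceSetMap, etc. from step 2).
    The Cartesian product over the per-position interface sets yields
    every API test sequence, ordered as in the operation sequence."""
    pools = [sorted(interface_sets[op]) for op in op_sequence]
    return [list(combo) for combo in product(*pools)]

seqs = api_test_sequences(
    ["add", "find"],
    {"add": {"addProduct"}, "find": {"displayProducts", "showProductDetail"}})
```

With one add interface and two find interfaces, the operation sequence add, find expands into two API test sequences, each preserving the position of its operation label.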
Advantageous effects
The invention uses XML Schema to describe the test interface, provides a corresponding specification, and can effectively describe the APIs of a web application or cloud platform. From the XML description of the APIs, the invention automatically generates test sequences, test scripts, and test data.
With the invention, a user only needs to describe the APIs of the web application or cloud platform to automatically construct test sequences and generate test data, achieving the goal of automated testing: the user's workload is greatly reduced, test efficiency and accuracy are improved, and test cost is lowered. In addition, testing that is difficult to complete manually, such as stress testing, can be performed on the web application or cloud platform.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Detailed Description
The following detailed description of embodiments of the invention is intended to be illustrative, and not to be construed as limiting the invention.
The invention solves the problem of automatic generation of a data service API test data sequence, a test prediction and a test script, and comprises the following contents and technologies:
1) An XML Schema is adopted to define the data service test interface description specification. The specification comprises the basic information of the service and the names of the services it depends on, the service's API set, the operation type of each API, the required input parameter information, and the returned data information.
2) For the 4 operation types of data services (adding, modifying, finding, and deleting), a test model based on test interface description instances and corresponding test modes is proposed; the test interface description instances can be parsed automatically to generate the test modes, which are described by regular expressions.
3) Based on the test modes, the test sequences, test data, test statements, and test scripts of the data service APIs are generated automatically by the related algorithms.
The specific technical scheme is as follows:
step 1: establishing an example based on an XML Schema test interface description specification;
the invention defines the test interface description specification; the specification describes interface information through XML tags, comprising a service information tag, operation tags, an interface tag, a request tag, a response tag, a dependency tag, data tags, and a constraint type tag of the data tags; the specification supports 7 data types: the 3 data types supported by XML Schema (built-in, simple, and complex data types) plus 4 additional data types (file, image, audio, and video data types);
the XML tags in the description specification of the test interface based on the XML Schema are specifically divided into:
service information tag <resources>: the described content is service information; the attributes comprise a service identifier id, a service name, a domain name base, and a precursor service identifier premise (indicating that the service depends on the service whose identifier equals the premise value; an empty value indicates that the service does not depend on any service); the priority of the service information tag is 0, and it is the root of all tags; for example: <resources id="Product_Service" name="Product controller" base="http://" premise="">.
An operation tag, including < add > < delete > < update > < find >: the description contents are operations of adding, deleting, updating and searching, and each API of the data service is classified into one of 4 operation types of < add >, < delete >, < update > and < find >; the operation label priority is 1; < add > behavior of operation: adding a number of parameters to the service; < delete > behavior of operation: deleting all parameters added by the corresponding < add > operation; action of < update > operation: modifying some or all of the parameters of the corresponding < add >; < find > operation behavior: one or more parameter information in the stored service is queried.
Interface tag <resource>: the described content is interface information; the attributes comprise an interface identifier id, an interface name, and the requested actual address path; the priority is 2. For example: <resource id="Product_New" name="addProduct" path="addProduct"> … </resource>.
Request tag <request> and response tag <response>: describing the user's request and response; the attribute comprises the data type dataType; the priority is 3. For example: <response dataType="JSON"> … </response>.
Dependency tag <dependency>: describing the operation interface on which the service depends; the attributes comprise resources (the id of the service depended on) and resource (the id of the interface depended on); (if a service A depends on another service B, modifying or deleting service B may affect service A, so the dependency relationship is stated explicitly). For example: <dependency resources="Product_Service" resource="Product_New"/> indicates that the service depends on the interface identified as Product_New in the service identified as Product_Service.
Data tag <param>: describing the information of a request parameter; the attributes comprise the parameter name (the name of the request parameter), the corresponding attribute name attribute (the attribute in the class to which the parameter belongs), the parameter type, and required (true/false), describing whether the parameter is mandatory; the priority is 4. For example: <param name="feature" attribute="product.feature" type="String"/>.
Data tag <data>: describing data from the response message; the attributes comprise the data name and the data type; the priority is 4; for example: <data name="userName" type="String"/>.
Data tag <element>: describing a container-type element; the attributes comprise the data name and the data type; the priority is 5; it appears under the <param> and <data> tags or on its own; for example: <element name="productInfo" type="String"/>.
Constraint type tag of data tag < restriction >: describing a constraint condition; under the < param >, < data >, < element > tags.
And establishing an example of the XML Schema-based test interface description specification of the service based on the interface description specification. For example, some travel information system product service instances include:
1) the service identifier is "Product_Service", the service name is "Product controller", the domain name is "http://", and there is no precursor service;
2) there are 7 API interfaces, 4 operation types
a) The add operation type contains 2 API interfaces, namely addProduct and addProductBasicInformation.
b) The delete operation type contains 1 API interface, namely, deleteProduct.
c) The update operation type contains 2 API interfaces, namely editProductBasicInformation and editProduct.
d) The find operation type contains 2 API interfaces, namely displayProducts and showProductDetail.
Assume the add operation type, where the interface id is Product_New, the name is addProduct, and the address is addProduct; assume further that there is a parameter in its request whose corresponding attribute name is product.feature. From this, the following example can be obtained.
[Figure: example instance document of the test interface description specification (drawing not reproduced)]
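The original drawing is not reproduced here; assembling the per-tag examples given above, an instance document for the product service's add interface might plausibly look like the following (the required and dataType attribute values are assumptions based on the attribute lists in the text, not content taken from the figure):

```xml
<resources id="Product_Service" name="Product controller" base="http://" premise="">
  <add>
    <resource id="Product_New" name="addProduct" path="addProduct">
      <request dataType="JSON">
        <param name="feature" attribute="product.feature" type="String" required="true"/>
      </request>
      <response dataType="JSON">
        <data name="productInfo" type="String"/>
      </response>
    </resource>
  </add>
</resources>
```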
Step 2: preprocessing the test interface description example:
taking the example document based on the description specification of the XML Schema test interface obtained in the step 1 as an input, respectively analyzing each document, and storing information in the description document in the following variables in the analyzing process:
variable OperaterTypeMap: the type is a key-value set, the key value is the identification id of the service instance, and the value is an operation tag contained in the service instance. The operation tag specifically includes: <add>, <delete>, <update>, <find>; this variable is used to store all operation types of the service.
Variables AddInterfaceSetMap, DeleteInterfaceSetMap, UpdateInterfaceSetMap, FindInterfaceSetMap: the type is a key-value set; the key is the identification id of the service instance, and the value is an interface set. The elements in the interface set are: an interface identifier id and the parameter lists of the request data and response data; each parameter in a parameter list comprises two attributes, the parameter type and the parameter name; these variables are used to store the specific operation instances of each operation type;
Variable ParameterConstrainsMap: the type is a key-value set; the key is the identification id of the service instance, and the value is a constraint set of the interfaces. The elements in the constraint set are: an interface identification id and a parameter constraint list; the parameter constraint list comprises parameter names and constraint conditions; this variable is used to store the constraints on the parameters of each specific operation;
Variable Premises: the type is a set, and each element is the service identification id of a precursor service of the current service instance.
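The parsing described above can be sketched as follows; this is a minimal illustration assuming simplified tag and attribute spellings (`resource`, `param`, `premise`), not the full specification:

```python
# Sketch of the Step-2 preprocessing: parse one instance document into
# OperaterTypeMap, the four interface-set maps, and Premises.
# Tag/attribute spellings here are assumptions based on the description.
import xml.etree.ElementTree as ET

OPERATION_TAGS = ("add", "delete", "update", "find")

def preprocess(xml_text, OperaterTypeMap, interface_set_maps, Premises):
    """Store the information of one description document in the variables."""
    root = ET.fromstring(xml_text)                # <resources id="...">
    sid = root.get("id")                          # service instance id (the key)
    OperaterTypeMap[sid] = []
    for op in OPERATION_TAGS:
        for op_node in root.findall(op):
            OperaterTypeMap[sid].append(op)       # record the operation type
            for res in op_node.findall("resource"):     # each API interface
                params = [(p.get("type"), p.get("name"))
                          for p in res.iter("param")]   # (type, name) pairs
                interface_set_maps[op].setdefault(sid, []).append(
                    {"id": res.get("id"), "params": params})
    if root.get("premise"):                       # precursor service, if any
        Premises.add(root.get("premise"))
```

A call site would pass one shared dictionary per interface-set map and accumulate results over all documents in the file set.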
Step 3: automatically generating the operation test modes:
The OperaterTypeMap obtained in step 2 is taken as input; according to the identification id of the service instance and its value in OperaterTypeMap, the operations contained in the variable are combined to obtain a set of regular expressions that conforms to users' usage habits, which serves as the operation test modes of the service instance.
After analyzing the operation types of a large number of data service APIs, two characteristics were observed: 1. if a service has no find operation, it is relied upon by another service that provides the search for it; 2. a service without an add operation is meaningless. The present invention therefore proposes the 4 combinations of operation types in Table 1, where each operation type of a service may correspond to multiple APIs.
TABLE 1 combinations of operations
     add   delete   update   find
1     1      1        1       1
2     1      1                1
3     1               1       1
4     1                       1
According to the method, for each case, after the operation types and the operation order are determined in combination with users' usage habits, the operation test mode is expressed as a regular expression. The principles of the operation test modes of the present invention are as follows:
1) considering the expected result, under the premise that the operations include add and find, the mode add(add|find)*find is proposed;
2) to ensure that a fault can be located accurately, on the basis of 1) the delete or update operation is added, yielding the modes add(add|delete|find)*find and add(add|update|find)*find;
3) on the basis of the first two, the mode add(add|update|delete|find)*find is proposed.
For the 4 different operation combinations in Table 1, the 4 kinds of operation modes automatically generated by the present invention are respectively:
1) when the service has only the two operations add and find, the operation mode is add(add|find)*find;
2) when the service has the three operations add, delete and find, the operation modes are add(add|delete|find)*find and add(add|find)*find;
3) when the service has the three operations add, update and find, the operation modes are add(add|update|find)*find and add(add|find)*find;
4) when the service has the four operations add, update, delete and find, the operation modes are add(add|update|delete|find)*find, add(add|update|find)*find and add(add|find)*find.
Step 4: automatically generating API test sequences and test scripts based on the operation test modes:
In this step, the operation test modes obtained in step 3 are taken as input, and the output is a set of test scripts.
An actual operation sequence is generated according to the operation test modes obtained in step 3. Here the priority of a test pattern is taken as the basis for determining the execution order of the scripts; the priority rule is based on the number of operation types: the fewer the operation types, the higher the priority.
In the step, a specific regular expression is converted into an operation command, but an operation sequence which does not meet the requirement is deleted in advance according to a constraint rule; the constraint rule is as follows:
in an operation sequence, the add number before update should be larger than the delete number, otherwise, the operation sequence is deleted; in an operation sequence, the add number should be larger than the delete number, otherwise, the operation sequence is deleted;
and meanwhile, the value of the number in the operation test mode is determined according to the user requirement, and if the user does not have the requirement, the specific value of the number is (the number of service interfaces contained in add operation + the number of service interfaces contained in update operation + the number of service interfaces contained in delete operation) × 2-2.
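The expansion of the * repetition into concrete operation sequences, filtered by the two constraint rules, can be sketched as follows; the brute-force enumeration strategy is an assumption for illustration, not the patent's stated algorithm:

```python
# Sketch of expanding a pattern add(middle)*find with repetition count n
# into concrete operation sequences, applying the two constraint rules.
from itertools import product

def valid(seq):
    adds = deletes = 0
    for op in seq:
        if op == "update" and not adds > deletes:
            return False        # rule 1: adds before an update must exceed deletes
        if op == "add":
            adds += 1
        elif op == "delete":
            deletes += 1
    return adds > deletes       # rule 2: adds must exceed deletes overall

def expand(middle_ops, n):
    """All valid sequences of the form: add, n middle operations, find."""
    return [["add", *mid, "find"]
            for mid in product(middle_ops, repeat=n)
            if valid(["add", *mid, "find"])]
```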
After the operation sequence is determined, arranging the API interface sets under each operation label in the operation sequence according to the operation sequence to obtain an API test sequence: the specific API test sequence generation rule is as follows: taking the Cartesian product of the API interface set corresponding to each operation label in the operation sequence as a set of the API test sequence, and sequencing the APIs in the API test sequence according to corresponding positions in the operation sequence; the operating API interface set is derived from the variables AddInterfaceSetMap, DeleteInterfaceSetMap, UpdateInterfaceSetMap, FindInterfaceSetMap in step 2.
Following the example in step 1, the add operation type includes 2 API interfaces (addProduct and addProductBasicInformation) and the find operation type includes 2 API interfaces (displayProducts and showProductDetail). Thus, a test sequence set containing 4 elements is obtained through the "add, find" operation mode. The set elements are: "addProduct, displayProducts", "addProduct, showProductDetail", "addProductBasicInformation, displayProducts" and "addProductBasicInformation, showProductDetail".
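The Cartesian-product rule behind this example can be sketched in a few lines; the function and variable names are illustrative:

```python
# Sketch of the API test-sequence rule: take the Cartesian product of
# the API interface sets bound to each operation label in the sequence;
# APIs keep the positions of their operation labels.
from itertools import product

def api_test_sequences(op_sequence, interface_sets):
    pools = [interface_sets[op] for op in op_sequence]
    return [list(combo) for combo in product(*pools)]
```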
Writing each API test sequence into a test script respectively; the test script includes 4 tags and the following attribute values:
the < script > tag contains an attribute resourcesID, and the attribute value is the value of the corresponding service identification id; < script > tag priority is 0.
The < step > tag comprises a path attribute and an operation attribute, wherein the path attribute is a url path of a corresponding interface, and the operation attribute is a corresponding interface operation type; < step > tag priority 1; the tag may comprise a plurality of < param > tags. The order of the tags in the script is consistent with the order in the test sequence for guiding the test sequence.
The < param > tag comprises three attributes: name, attribute and value; name records the parameter name, attribute records the parameter's corresponding attribute in the class it belongs to, and value records the corresponding expected result (the value attribute exists only when the tag is under the < response > tag); < param > tag priority is 2.
< response > tag, no attribute, priority 1; multiple < param > tags may be included.
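As an illustration of the script layout described by these four tags, a script for the sequence "addProduct, displayProducts" might look like the sketch below; the concrete script in the patent is only shown as an image, so the attribute spellings and the placement of < response > under the find step are assumptions:

```xml
<script resourcesID="Product_Service">
  <step path="addProduct" operation="add">
    <param name="product" attribute="product"/>
  </step>
  <step path="displayProducts" operation="find">
    <response>
      <param name="product" attribute="product" value="expectedProduct"/>
    </response>
  </step>
</script>
```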
And 5: automatically generating test data:
The test scripts obtained in step 4 and the ParameterConstrainsMap obtained in step 2 are taken as input, and the test data of each interface are output. For each test script, there is an interface sequence for which data are generated. Before the algorithm is executed, 2 reference tables are needed.
1. A static table enumerating, for each operation type, the operation types it can rely on. As shown in Table 2, the left column gives the operation type (OperaterType) and the right column, from the test-data-generation perspective, records the operation types on which it depends (dependedOperaterType); for example, the argument of a find operation may be the id returned by an add operation or an attribute of the object added by an add operation. That is, a dependedOperaterType provides the OperaterType with some or all of the data it requires.
TABLE 2 static table
Operation type    Dependent operation types
find              add, update, delete
update            add, find
delete            add, find, update
2. A dynamic table used for generating dynamic data according to the script. As shown in Table 3, the table includes 5 columns: column 1 gives the sequential position of the interface in the script, column 2 the interface type, column 3 the object identification and related object attributes in the test input data, column 4 the output of the interface, and column 5 the test oracle of the interface, recording its expected output.
Table 3 dynamic table example
[Table 3: dynamic table example; rendered as an image in the original patent.]
The steps for generating the data are as follows:
step 5.1: generating a static table, and generating columns 1 and 2 of the dynamic table according to the test script obtained in the step 4;
the static table is used for enumerating operation types which are correspondingly depended by the operation types; wherein the operation types on which find depends are add, update, delete; the operation types of update dependence are add and find; the operation types of delete dependence are add, find, update;
the dynamic table comprises 5 columns, wherein the 1 st column represents the sequential position of the interface in the script, the 2 nd column represents the type of the interface, the 3 rd column represents the object identification and the related object attribute in the test input data, the 4 th column represents the output of the interface, the 5 th column represents the test prediction of the interface, and the expected output of the interface is recorded;
step 5.2: and iterating the dynamic table according to the interface sequence in the test script, wherein the iteration rule is as follows:
Step a: starting from the first interface, the data in the dynamic table are generated interface by interface until all data are generated; if the operation type of the current interface I_i is add, go to step b, otherwise go to step c;
Step b: the current interface I_i generates test data according to the black-box testing principle; a unique key is generated for each data set and filled into column 3 of the dynamic table, the interface identifier is filled into column 4, and column 5 is left empty; then return to step a to generate the data of the next interface;
Step c: the operation type of the current interface I_i is not add, so the input data of the interface depend on the input data of a previous interface I_j, j < i; the input-data dependency rule is:
following the principle of proximity, priority is given to the interface I_j nearest to the current interface I_i; by looking up the static table, it is judged whether interface I_j can be relied on by interface I_i; if not, traverse forwards until a dependable interface is found; if it can, the input parameters of interface I_i are compared with the input and return parameters of interface I_j; if interface I_j can satisfy the input-parameter requirements of interface I_i, a data dependency is established between I_j and I_i: the data of interface I_j are filled into column 3 of interface I_i's row of the dynamic table, the actual output of interface I_i is filled into column 4, and the expected result is filled into column 5; then return to step a to generate the data of the next interface; if the input and return parameters of interface I_j do not meet the condition, continue traversing forwards until the next dependable interface is found;
wherein, if the operation type of the current interface I_i is delete and its input data identifier is id_1, the data of all interfaces before I_i whose data identifier is id_1 must be set to empty; if the operation type of the current interface I_i is update and its input data identifier is id_1, the data of all interfaces before I_i whose input data identifier is id_1 must be changed to the latest data;
Step d: finally, the test data are generated in JSON format. The test data are stored in the database; the data in the table change dynamically and are always kept up to date. In addition, a log file is kept throughout the testing process, mainly recording changes of the test data.
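Steps a to c can be sketched as a simplified iteration over the dynamic table; here each script entry is a dict with an operation type and id, black-box data generation and parameter matching are reduced to placeholders, and the delete/update bookkeeping on earlier rows is omitted for brevity:

```python
# Simplified sketch of the Step-5 dynamic-table iteration (steps a-c).
STATIC_TABLE = {"find": ("add", "update", "delete"),   # from Table 2
                "update": ("add", "find"),
                "delete": ("add", "find", "update")}

def fill_dynamic_table(script):
    table, next_key = [], 1
    for i, iface in enumerate(script):
        row = {"pos": i, "type": iface["op"],           # columns 1-2
               "input": None, "output": None, "expected": None}
        if iface["op"] == "add":                        # step b
            row["input"] = f"key{next_key}"             # unique key, column 3
            next_key += 1
            row["output"] = iface["id"]                 # identifier, column 4
        else:                                           # step c
            for j in range(i - 1, -1, -1):              # principle of proximity
                if table[j]["type"] in STATIC_TABLE[iface["op"]]:
                    row["input"] = table[j]["input"]    # column 3: depended data
                    row["output"] = iface["id"]         # column 4
                    row["expected"] = table[j]["input"] # column 5: expected result
                    break
        table.append(row)
    return table
```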
The description type, attributes and priority of each XML tag are given below (a tag with a smaller priority value has higher priority and contains the tags with larger priority values).
TABLE 4 description type, Attribute, and priority of individual tags
[Table 4: description type, attributes and priority of the individual tags; rendered as an image in the original patent.]
The specific file preprocessing algorithm is given:
Input: fileSet, a set of test interface description files
Output:OperaterTypesMap
AddInterfaceSetMap
DeleteInterfaceSetMap
UpdateInterfaceSetMap
FindInterfaceSetMap
ParameterConstrainsMap
premises
[Algorithm: file preprocessing; rendered as an image in the original patent.]
The specific test pattern generation algorithm is given:
Input:OperaterTypesMap
Output: Model, a set of regular expressions
[Algorithm: test pattern generation; rendered as an image in the original patent.]
The specific test script generation algorithm is given:
Input:
AddInterfaceSetMap
DeleteInterfaceSetMap
UpdateInterfaceSetMap
FindInterfaceSetMap
Model, the value of *
Output:A series of script files
[Algorithm: test script generation; rendered as an image in the original patent.]
The specific test data generation algorithm is given:
Input:
a series of script files
ParameterConstrainsMap
Output:
script variables containing testData
[Algorithm: test data generation; rendered as an image in the original patent.]
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art without departing from the principle and spirit of the present invention.

Claims (5)

1. An automatic test method for distributed big data service is characterized in that: the method comprises the following steps:
step 1: establishing an instance based on the XML Schema test interface description specification; the test interface description specification describes interface information through XML tags, the content comprising a service information tag, operation tags, an interface tag, a request tag, a response tag, a dependency tag, data tags and a constraint type tag of the data tags; the 7 data types supported by the description specification are the 3 data types supported by XML Schema, namely the built-in, simple and complex data types, plus 4 additional data types, namely the file, image, audio and video data types;
step 2: taking the example document based on the description specification of the XML Schema test interface obtained in the step 1 as an input, respectively analyzing each document, and storing information in the description document in the following variables in the analyzing process:
variable OperaterTypeMap: the type is a key-value set, the key is the identification id of the service instance, and the value is the set of operation tags contained in the service instance; this variable is used to store all operation types of the service;
variables AddInterfaceSetMap, DeleteInterfaceSetMap, UpdateInterfaceSetMap, FindInterfaceSetMap: the type is a key-value set, the key is the identification id of the service instance, and the value is an interface set; the elements in the interface set are: an interface identifier id and the parameter lists of the request data and response data; each parameter in a parameter list comprises two attributes, the parameter type and the parameter name; these variables are used to store the specific operation instances of each operation type;
the variable ParameterConstrainsMap: the type is a key-value set, the key value is an identification id of the service instance, and the value is a constraint set of the interface; the elements in the constraint set are: interface identification id, parameter constraint list; the parameter constraint list comprises parameter names and constraint conditions; the variable is used for storing the constraint of the parameter of each specific operation;
variables Premises: the type is a set, and the element value in the set is the service identification id of the precursor service of the current service instance;
step 3: automatically generating an operation test mode:
taking the OperaterTypeMap obtained in the step 2 as an input, combining the operations contained in the OperaterTypeMap variable according to the identification id of the service instance and the value in the OperaterTypeMap, and taking the obtained regular expression set which accords with the use habit of the user as a corresponding operation test mode of the service instance;
step 4: automatically generating an API test sequence and a test script based on the operation test mode:
generating an actual operation sequence according to the operation test mode obtained in the step 3; deleting the operation sequences which do not meet the requirements according to the constraint rules; the constraint rule is as follows:
in an operation sequence, the add number before update should be larger than the delete number, otherwise, the operation sequence is deleted; in an operation sequence, the add number should be larger than the delete number, otherwise, the operation sequence is deleted;
after the operation sequence is determined, arranging API interface sets under all operation labels in the operation sequence according to the operation sequence to obtain an API test sequence;
writing each API test sequence into a test script respectively;
step 5: automatically generating test data:
the test scripts obtained in step 4 and the ParameterConstrainsMap obtained in step 2 are taken as input, and the test data of each interface are output; the specific steps are as follows:
step 5.1: generating a static table, and generating columns 1 and 2 of the dynamic table according to the test script obtained in the step 4;
the static table is used for enumerating operation types which are correspondingly depended by the operation types; wherein the operation types on which find depends are add, update, delete; the operation types of update dependence are add and find; the operation types of delete dependence are add, find, update;
the dynamic table comprises 5 columns, wherein the 1 st column represents the sequential position of the interface in the script, the 2 nd column represents the type of the interface, the 3 rd column represents the object identification and the related object attribute in the test input data, the 4 th column represents the output of the interface, the 5 th column represents the test prediction of the interface, and the expected output of the interface is recorded;
step 5.2: and iterating the dynamic table according to the interface sequence in the test script, wherein the iteration rule is as follows:
step a: starting from the first interface, the data in the dynamic table are generated interface by interface until all data are generated; if the operation type of the current interface I_i is add, go to step b, otherwise go to step c;
step b: the current interface I_i generates test data according to the black-box testing principle; a unique key is generated for each data set and filled into column 3 of the dynamic table, the interface identifier is filled into column 4, and column 5 is left empty; then return to step a to generate the data of the next interface;
step c: the operation type of the current interface I_i is not add, so the input data of the interface depend on the input data of a previous interface I_j, j < i; the input-data dependency rule is:
following the principle of proximity, priority is given to the interface I_j nearest to the current interface I_i; by looking up the static table, it is judged whether interface I_j can be relied on by interface I_i; if not, traverse forwards until a dependable interface is found; if it can, the input parameters of interface I_i are compared with the input and return parameters of interface I_j; if interface I_j can satisfy the input-parameter requirements of interface I_i, a data dependency is established between I_j and I_i: the data of interface I_j are filled into column 3 of interface I_i's row of the dynamic table, the actual output of interface I_i is filled into column 4, and the expected result is filled into column 5; then return to step a to generate the data of the next interface; if the input and return parameters of interface I_j do not meet the condition, continue traversing forwards until the next dependable interface is found;
wherein, if the operation type of the current interface I_i is delete and its input data identifier is id_1, the data of all interfaces before I_i whose data identifier is id_1 must be set to empty; if the operation type of the current interface I_i is update and its input data identifier is id_1, the data of all interfaces before I_i whose input data identifier is id_1 must be changed to the latest data;
step d: finally, test data in the JSON format is generated.
2. The automated testing method of distributed big data services according to claim 1, characterized in that:
the XML tag in the step 1 is specifically divided into:
service information tag < resources >: describing the content as service information; the attributes comprise a service identifier id, a service name, a domain name base and a precursor service identifier premise; the priority of the service information label is 0, and the service information label is the root of all the labels;
an operation tag, including < add > < delete > < update > < find >: describing the operation of adding, deleting, updating and searching; the priority is 1; < add > behavior of operation: adding a number of parameters to the service; < delete > behavior of operation: deleting all parameters added by the corresponding < add > operation; action of < update > operation: modifying some or all of the parameters of the corresponding < add >; < find > operation behavior: querying one or more parameter information in the stored service;
interface tag < resource >: information describing the content as an interface; the attribute comprises an interface identifier id, an interface name and a requested actual address path; the priority is 2;
request tag < request > and response tag < response >: describing the user's request and response; the attribute comprises a data type dataType; the priority is 3;
dependency tag < dependency >: describing the operational interface on which the service depends; the attribute comprises id of service corresponding to the parameter name resources and id of interface corresponding to the resource;
data tag < param >: information describing the request parameters; the attribute comprises a parameter name, a corresponding attribute name, a parameter type and whether the parameter is required or not; the priority is 4;
data tag < data >: describing data from the response message; the attribute comprises name data name and type data type; the priority is 4;
data tag < element >: describes a container-type element; the attributes comprise a data name and a data type; the priority is 5; it appears under the < param > or < data > tags, or stands alone;
constraint type tag of data tag < restriction >: describing a constraint condition; under the < param >, < data >, < element > tags.
3. The automated testing method of distributed big data services according to claim 2, characterized in that:
the test script in step 4 includes 4 tags and the following attribute values:
the < script > tag contains an attribute resourcesID, and the attribute value is the value of the corresponding service identification id; < script > tag priority is 0;
the < step > tag comprises a path attribute and an operation attribute, wherein the path attribute is a url path of a corresponding interface, and the operation attribute is a corresponding interface operation type; < step > tag priority 1;
the < param > tag comprises three attributes of name, attribute and value, wherein the attribute name records the name of the parameter, the attribute records the corresponding attribute of the parameter in the class to which the attribute belongs, and the attribute value records the corresponding expected result; < param > tag priority 2;
< response > tag, no attribute, priority 1.
4. The automated testing method of distributed big data services according to claim 2 or 3, characterized in that: when the actual operation sequence is generated according to the operation test mode in step 4, the repetition count * in the operation test mode is determined according to user requirements, and if the user has no requirement, its specific value is (the number of service interfaces under the add operation + the number of service interfaces under the update operation + the number of service interfaces under the delete operation) × 2 − 2.
5. The automated testing method of distributed big data services according to claim 4, characterized in that: the API test sequence generation rule in the step 4 is as follows: taking the Cartesian product of the API interface set corresponding to each operation label in the operation sequence as a set of the API test sequence, and sequencing the APIs in the API test sequence according to corresponding positions in the operation sequence; the operating API interface set is derived from the variables AddInterfaceSetMap, DeleteInterfaceSetMap, UpdateInterfaceSetMap, FindInterfaceSetMap in step 2.
CN201810362027.3A 2018-04-20 2018-04-20 Automatic testing method for distributed big data service Expired - Fee Related CN108540351B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810362027.3A CN108540351B (en) 2018-04-20 2018-04-20 Automatic testing method for distributed big data service

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810362027.3A CN108540351B (en) 2018-04-20 2018-04-20 Automatic testing method for distributed big data service

Publications (2)

Publication Number Publication Date
CN108540351A CN108540351A (en) 2018-09-14
CN108540351B true CN108540351B (en) 2021-06-25

Family

ID=63479106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810362027.3A Expired - Fee Related CN108540351B (en) 2018-04-20 2018-04-20 Automatic testing method for distributed big data service

Country Status (1)

Country Link
CN (1) CN108540351B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109460366A (en) * 2018-11-16 2019-03-12 郑州云海信息技术有限公司 A kind of software stability test method, device, equipment and medium
CN111488267B (en) * 2019-01-25 2024-03-12 北京搜狗科技发展有限公司 Interface test script generation method and device and electronic equipment
CN110071844A (en) * 2019-05-14 2019-07-30 广东电网有限责任公司 A kind of detection script creation system, method and relevant apparatus
CN110765667B (en) * 2019-11-28 2020-06-16 深圳市金城保密技术有限公司 Simulation method and system of laser printer
CN114968689B (en) * 2022-08-01 2022-11-01 北京数字光芯集成电路设计有限公司 FPGA device, MIPI protocol layer testing device and method based on FPGA device

Citations (7)

Publication number Priority date Publication date Assignee Title
CN102222042A (en) * 2011-06-28 2011-10-19 北京新媒传信科技有限公司 Automatic software testing method based on cloud computing
CN103064788A (en) * 2012-12-24 2013-04-24 清华大学 Web service modeling and test method based on interface semantic contract model
CN104866422A (en) * 2015-05-20 2015-08-26 中国互联网络信息中心 Web Service automatic test system and method
CN107124326A (en) * 2017-04-05 2017-09-01 烽火通信科技股份有限公司 A kind of automated testing method and system
WO2018004892A1 (en) * 2016-07-01 2018-01-04 Mcafee, Inc. Cloud assisted behavioral automated testing
CN107643981A (en) * 2017-08-29 2018-01-30 顺丰科技有限公司 A kind of automatic test platform and operation method of polynary operation flow
CN107704395A (en) * 2017-10-24 2018-02-16 武大吉奥信息技术有限公司 One kind is based on cloud platform automatic test implementation and system under Openstack

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8893087B2 (en) * 2011-08-08 2014-11-18 Ca, Inc. Automating functionality test cases

Non-Patent Citations (1)

Title
Research on Software Interface Testing Based on XML Description; Li Yahui; China Excellent Doctoral and Master's Dissertations Full-text Database, Information Science and Technology volume; 2005-06-15; full text *

Also Published As

Publication number Publication date
CN108540351A (en) 2018-09-14

Similar Documents

Publication Publication Date Title
CN108540351B (en) Automatic testing method for distributed big data service
US10803029B2 (en) Generating javascript object notation (JSON) schema from JSON payloads
US11663033B2 (en) Design-time information based on run-time artifacts in a distributed computing cluster
CN108304201B (en) Object updating method, device and equipment
US9519701B2 (en) Generating information models in an in-memory database system
US9280568B2 (en) Zero downtime schema evolution
US9110686B2 (en) Web client command infrastructure integration into a rich client application
US20200034750A1 (en) Generating artificial training data for machine-learning
US20080034015A1 (en) System and method for automated on demand replication setup
US20200134081A1 (en) Database systems and applications for assigning records to chunks of a partition in a non-relational database system with auto-balancing
Holzschuher et al. Querying a graph database–language selection and performance considerations
US9830385B2 (en) Methods and apparatus for partitioning data
US10394805B2 (en) Database management for mobile devices
CN106951231B (en) Computer software development method and device
WO2019100635A1 (en) Editing method and apparatus for automated test script, terminal device and storage medium
CN110347375B (en) Resource combination type virtual comprehensive natural environment framework and method for virtual test
CN112860777B (en) Data processing method, device and equipment
US11880740B2 (en) Facilitating machine learning configuration
US10262055B2 (en) Selection of data storage settings for an application
Näsholm Extracting data from nosql databases-a step towards interactive visual analysis of nosql data
CN110955801B (en) Knowledge graph analysis method and system for cognos report indexes
CN115329011A (en) Data model construction method, data query method, data model construction device and data query device, and storage medium
US11615061B1 (en) Evaluating workload for database migration recommendations
CN116263659A (en) Data processing method, apparatus, computer program product, device and storage medium
CN104040537A (en) Systems and methods of automatic generation and execution of database queries

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210625