CN109634837B - Automatic test method, device, equipment and storage medium - Google Patents

Automatic test method, device, equipment and storage medium

Info

Publication number
CN109634837B
Authority
CN
China
Prior art keywords
parameter
case
data
test
interface
Prior art date
Legal status
Active
Application number
CN201811238787.XA
Other languages
Chinese (zh)
Other versions
CN109634837A (en)
Inventor
张鹏 (Zhang Peng)
谢鹏 (Xie Peng)
Current Assignee
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd
Priority to CN201811238787.XA
Publication of CN109634837A
Application granted
Publication of CN109634837B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to automated software testing and discloses an automated testing method, apparatus, device, and storage medium. The method comprises the following steps: receiving an automated test request and reading the input-parameter data it contains; searching a preset spreadsheet for the output-parameter data and the case identifier corresponding to the input-parameter data; searching a test case library for the corresponding target test case according to the case identifier, and executing the target test case to obtain a case execution result; and matching the output-parameter data against the case execution result. Because the corresponding output-parameter data is looked up in a pre-configured spreadsheet from the acquired input-parameter data, and the case execution result of the test case is then matched against that output-parameter data, there is no need to acquire output parameters from an external system through a firewall-opening operation and match them. The automated test thus runs smoothly while test efficiency is improved.

Description

Automatic test method, device, equipment and storage medium
Technical Field
The present invention relates to the field of software testing technologies, and in particular, to an automatic testing method, apparatus, device, and storage medium.
Background
With the rapid development of internet technology and computer software technology, automated testing is widely applied in the software services industry for its advantages of improving test efficiency, shortening regression-test time, and freeing up manpower. In general, during a project's development cycle, comprehensive tests, including automated tests, need to be performed on a program's call interfaces before the application is released online.
Because existing enterprise-level projects and programs are largely developed through distributed collaboration, the firewalls of the various associated systems may be present in the test and development environments during testing. A firewall-opening operation is then required; without it, the current system cannot access or test against the associated systems. However, the firewall-opening operation is complex to carry out, so test efficiency cannot be effectively guaranteed. Therefore, how to avoid the firewall-opening operation during automated testing, ensure that the automated test proceeds smoothly, and improve test efficiency has become an urgent problem to be solved.
The foregoing is provided merely for the purpose of facilitating understanding of the technical solutions of the present invention and is not intended to represent an admission that the foregoing is prior art.
Disclosure of Invention
The invention mainly aims to provide an automated testing method, apparatus, device, and storage medium, so as to solve the technical problem that the test flow of existing automated testing technology is complex and test efficiency cannot be effectively guaranteed.
To achieve the above object, the present invention provides an automated testing method comprising the steps of:
Receiving an automated test request, and reading the input-parameter data contained in the automated test request;
searching a preset spreadsheet for the output-parameter data and the case identifier corresponding to the input-parameter data;
searching a test case library for the corresponding target test case according to the case identifier, and executing the target test case to obtain a case execution result;
and matching the output-parameter data against the case execution result to obtain a matching result.
Preferably, before the step of receiving an automated test request and reading the input-parameter data contained in the automated test request, the method includes:
receiving a parameter configuration instruction, and acquiring the corresponding spreadsheet template according to the template identifier contained in the parameter configuration instruction;
receiving the table entries to be configured that are input based on the spreadsheet template, and the configuration parameters corresponding to those entries, wherein the entries to be configured include a case identifier, an input-parameter column, an output-parameter column, and interface information;
and associating the entries to be configured with the configuration parameters, and saving the association result to the spreadsheet template to obtain the preset spreadsheet.
Preferably, the step of receiving an automated test request and reading the input-parameter data contained in the automated test request includes:
receiving an automated test request, and reading the request parameters contained in the automated test request;
detecting, according to the input-parameter configuration parameters corresponding to the input-parameter columns of the preset spreadsheet, whether the request parameters contain target parameters identical to the input-parameter configuration parameters;
and if the target parameters exist, acquiring the target parameters and selecting a preset number of parameters from them as the input-parameter data.
Preferably, the step of searching the preset spreadsheet for the output-parameter data and the case identifier corresponding to the input-parameter data includes:
searching the preset spreadsheet for the target input-parameter configuration parameters containing the input-parameter data;
and determining the case identifier and the target interface information corresponding to the input-parameter data according to the target input-parameter configuration parameters, and querying the preset spreadsheet for the corresponding output-parameter data according to the target interface information.
Preferably, the step of searching the test case library for the corresponding target test case according to the case identifier and executing the target test case to obtain a case execution result includes:
searching the test case library for the corresponding target test case according to the case identifier, wherein the target test case comprises several sections of interface code to be tested;
reading the interface information corresponding to the case identifier from the preset spreadsheet, and determining the test order of the interface code sections according to the interface information read;
and testing the interface code sections in that test order, taking the test result as the case execution result.
Preferably, the step of reading the interface information corresponding to the case identifier from the preset spreadsheet and determining the test order of the interface code sections according to the interface information read includes:
reading the interface information corresponding to the case identifier from the preset spreadsheet, and obtaining the interface identifiers contained in the interface information;
sorting the interface identifiers according to the character codes carried in the interface identifiers to obtain an interface identifier sequence;
and determining the test order of the interface code sections corresponding to the interface identifiers according to the interface identifier sequence.
Preferably, the step of matching the output-parameter data against the case execution result to obtain a matching result includes:
acquiring the test data corresponding to the case execution result, and detecting whether all the data contained in the output-parameter data exist in the test data;
if so, judging that the test data and the output-parameter data match successfully; if not, judging that the match between the test data and the output-parameter data has failed.
In addition, to achieve the above object, the present invention also proposes an automated testing apparatus, the apparatus comprising:
a request response module, used to receive an automated test request and read the input-parameter data contained in the automated test request;
a data query module, used to search a preset spreadsheet for the output-parameter data and the case identifier corresponding to the input-parameter data;
a case execution module, used to search the test case library for the corresponding target test case according to the case identifier, and to execute the target test case to obtain a case execution result;
and a result matching module, used to match the output-parameter data against the case execution result and obtain a matching result.
In addition, to achieve the above object, the present invention also proposes an automated test apparatus comprising: a memory, a processor, and an automated test program stored on the memory and executable on the processor, the automated test program configured to implement the steps of the automated test method as described above.
In addition, to achieve the above object, the present invention also proposes a storage medium having stored thereon an automated test program which, when executed by a processor, implements the steps of the automated test method as described above.
According to the embodiments of the invention, an automated test request is received and the input-parameter data it contains is read; the output-parameter data and the case identifier corresponding to the input-parameter data are looked up in a preset spreadsheet; the corresponding target test case is found in the test case library according to the case identifier and executed to obtain a case execution result; and the output-parameter data is matched against that result. Because the output-parameter data stored in the preset spreadsheet is looked up from the acquired input-parameter data, and the case execution result of the test case is then matched against it, there is no need to acquire output parameters from an external system through a firewall-opening operation and match them; the automated test runs smoothly while test efficiency is improved.
Drawings
FIG. 1 is a schematic structural diagram of the automated test equipment of the hardware running environment to which embodiments of the present invention relate;
FIG. 2 is a flow chart of a first embodiment of an automated testing method of the present invention;
FIG. 3 is a flow chart of a second embodiment of the automated testing method of the present invention;
FIG. 4 is a flow chart of a third embodiment of an automated testing method of the present invention;
FIG. 5 is a block diagram of the automated testing apparatus according to its first embodiment.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an automated test equipment of a hardware running environment according to an embodiment of the present invention.
As shown in fig. 1, the automated test equipment may include: a processor 1001, such as a central processing unit (CPU); a communication bus 1002; a user interface 1003; a network interface 1004; and a memory 1005. The communication bus 1002 is used to enable communication between these components. The user interface 1003 may include a display and an input unit such as a keyboard, and may optionally further include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wireless Fidelity (Wi-Fi) interface). The memory 1005 may be a high-speed random access memory (RAM) or a stable non-volatile memory (NVM) such as disk storage, and may optionally be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in FIG. 1 does not limit the automated test equipment, which may include more or fewer components than shown, combine certain components, or arrange the components differently.
As shown in fig. 1, the memory 1005, as one type of storage medium, may include an operating system, a data storage module, a network communication module, a user interface module, and an automated test program.
In the automated test equipment shown in fig. 1, the network interface 1004 is mainly used for data communication with a network server, and the user interface 1003 is mainly used for data interaction with a user. In the automated test equipment of the present invention, the equipment invokes, through the processor 1001, the automated test program stored in the memory 1005 and executes the automated test method provided by the embodiments of the present invention.
An embodiment of the present invention provides an automated testing method, and referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the automated testing method of the present invention.
In this embodiment, the automated test method includes the following steps:
Step S10: receiving an automated test request, and reading the input-parameter data contained in the automated test request;
It should be noted that the execution body of the method of this embodiment may be a virtual test system (hereinafter referred to as the mock system) built by developers according to actual test requirements; the mock system is used to simulate, during an actual test, the data interaction between the associated system and the interface-calling system.
It should be understood that during the automated test the mock system may receive an automated test request sent by an interface-calling system (e.g., a client or a mobile terminal), and then parse the request to read the input parameters transmitted by the interface-calling system and contained in the request (generally called "entries" in software testing), i.e., the input-parameter data.
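The request-parsing step above can be sketched in Python (the patent specifies no implementation language). The `{"entries": {...}}` JSON layout is an illustrative assumption; the patent does not define the request's wire format.

```python
import json

def read_entry_params(request_body: str) -> dict:
    """Parse an automated-test request body and return the input-parameter
    (entry) data it carries. The JSON layout is an assumed example."""
    request = json.loads(request_body)
    return request.get("entries", {})

params = read_entry_params('{"entries": {"name": "zhangsan", "age": 18}}')
# params == {"name": "zhangsan", "age": 18}
```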
Step S20: searching a preset spreadsheet for the output-parameter data and the case identifier corresponding to the input-parameter data;
It should be noted that before this step is executed, developers may build in advance an automated test database that stores the field data corresponding to different test cases in spreadsheet form, that is, the preset spreadsheet. In this embodiment and the following embodiments, the preset spreadsheet is preferably an Excel table, and the field data corresponding to each case in it at least includes the following columns: case number, interface information (such as interface address, interface description, and interface type), input-parameter column, and output-parameter column. Developers can configure corresponding parameters for each column, for example configuring the input-parameter column with the input data {"entry1": "value1", "entry2": "value2"} and the output-parameter column with the output data [{"key1": "value1"}, {"key2": "value2"}].
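One row of such a preset spreadsheet can be pictured as a plain dictionary. The field names and values below are illustrative assumptions built from the examples in the description, not the patent's exact schema.

```python
# A minimal in-memory stand-in for one row of the preset spreadsheet.
ROW = {
    "case_id": "101001000000",
    "interface": {
        "address": "case entry address",
        "description": "interface - normally obtains a parameter",
        "type": "first type",
    },
    "entries": {"entry1": "value1", "entry2": "value2"},   # input-parameter column
    "exits": [{"key1": "value1"}, {"key2": "value2"}],     # output-parameter column
}
```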
Specifically, when building the automated test database, developers can set the case identifier according to the application scenario of each test case so as to distinguish mutually independent test cases. The case identifier may take the form of a case number or a label, where the case number format may be defined as "interface number + scenario number"; for example, in case number "101001000000", the interface number is "10100" and the scenario number is "1000000", the scenario number characterizing the specific test scenario the test case targets.
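Splitting such a case number is a fixed-width slice. The 5-character interface-number length below matches the example "101001000000" and is an assumption, not a rule stated by the patent.

```python
def split_case_id(case_id: str, iface_len: int = 5) -> tuple[str, str]:
    """Split a case number into (interface number, scenario number).

    iface_len=5 matches the worked example in the description and is
    an assumed convention.
    """
    return case_id[:iface_len], case_id[iface_len:]

print(split_case_id("101001000000"))  # ('10100', '1000000')
```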
In a specific implementation, after obtaining the input-parameter data carried in the automated test request, the mock system can perform inclusion matching between that data and all the input data preset in the preset spreadsheet, so as to detect whether input data that match successfully exist in the spreadsheet; if so, it searches the spreadsheet for the output-parameter data and the case identifier corresponding to the successfully matched input data.
It should be understood that inclusion matching means detecting whether at least one of the data items contained in the input-parameter data appears in any input-parameter column of the Excel table; if so, the match succeeds. For example, suppose the input-parameter data contains the three items a, b, and c, and the input data contained in the z-th input-parameter column of the Excel table is [a, e, f]; since item a of the input-parameter data appears in the column data [a, e, f], the match is judged successful.
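The inclusion-matching rule above reduces to a set intersection, as in this sketch (real column data would come from the Excel sheet):

```python
def inclusion_match(entries, column):
    """Return True when at least one input-parameter item appears in the
    given input-parameter column, per the inclusion-matching rule."""
    return bool(set(entries) & set(column))

print(inclusion_match(["a", "b", "c"], ["a", "e", "f"]))  # True: 'a' is shared
print(inclusion_match(["b", "c"], ["e", "f"]))            # False: nothing shared
```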
Step S30: searching a test case library for the corresponding target test case according to the case identifier, and executing the target test case to obtain a case execution result;
It should be appreciated that the test case library may be a data storage space for storing various types of test cases. When developers build the test case library, the case identifier corresponding to each test case can be associated with the test case's storage path, so that when the mock system acquires a case identifier it can quickly obtain the storage path of the target test case from that association, then retrieve and execute the target test case via the storage path to obtain the case execution result.
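The identifier-to-path association can be sketched as a lookup table. The mapping and file names below are hypothetical; in practice the association would be built when the case library is created.

```python
from pathlib import Path

# Hypothetical case-identifier -> storage-path association.
CASE_PATHS = {"101001000000": Path("cases/login_ok.py")}

def find_case_path(case_id: str) -> Path:
    """Resolve a case identifier to the target test case's storage path."""
    try:
        return CASE_PATHS[case_id]
    except KeyError:
        raise LookupError(f"no test case registered for {case_id!r}")
```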
It can be understood that a test case is a description of a test task to be performed on a specific software product, embodying the test scheme, method, technique, and strategy.
Step S40: matching the output-parameter data against the case execution result to obtain a matching result.
In a specific implementation, the mock system acquires the test data corresponding to the case execution result, and then detects whether every data item contained in the output-parameter data exists in the test data; if so, it judges that the test data and the output-parameter data match successfully (i.e., the test passed); if not, it judges that the match failed (i.e., the test failed).
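Step S40 can be sketched as a containment check. Both arguments are flat dictionaries here as a simplifying assumption; a real case execution result could be nested JSON.

```python
def match_result(exit_params: dict, test_data: dict) -> bool:
    """Pass when every expected output-parameter item appears, with the
    same value, in the test data from the case execution result."""
    return all(test_data.get(k) == v for k, v in exit_params.items())

print(match_result({"key1": "value1"}, {"key1": "value1", "extra": 0}))  # True
print(match_result({"key1": "value1"}, {"key1": "other"}))               # False
```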
Further, before step S10, the automated testing method of this embodiment further includes the following steps:
The steps are as follows: receiving a parameter configuration instruction, and acquiring a corresponding spreadsheet template according to a template identifier contained in the parameter configuration instruction;
It is appreciated that the spreadsheet template may be a "blank" spreadsheet (e.g., an Excel form) for parameter configuration, and the template identification may be character information capable of distinguishing between different document formats (Word, excel, TXT) of the electronic templates.
In a specific implementation, the mock system responds to the received parameter configuration instruction, analyzes and acquires a template identifier contained in the parameter configuration instruction, and then acquires a corresponding spreadsheet template from a template database according to the template identifier.
The steps are as follows: receiving a to-be-configured table item input based on the electronic form template and configuration parameters corresponding to the to-be-configured table item, wherein the to-be-configured table item comprises a case identifier, a parameter entering list, a parameter exiting list and interface information;
After the electronic form template is obtained, the mock system can receive the to-be-configured list items input by the developer based on the electronic form template and configuration parameters corresponding to the to-be-configured list items. In this embodiment, the table entry to be configured includes at least four types of table entries including case identifier, input parameter sequence, output parameter sequence, and interface information, and of course, in this embodiment, the interface information may be further divided into sub-entries including interface address, interface description, interface type, and the like. Correspondingly, the configuration parameters, that is, the parameter data corresponding to each to-be-configured table item, for example, the configuration parameters corresponding to the "case identification" table item are "10100100, 101002000, 101003000"; the configuration parameters corresponding to the list entry are "{" enter 1":" value1"," enter 2":" value2"}, {" enter 3":" value3"," enter 4":" value4"}, etc.
Associating the entries to be configured with the configuration parameters, and saving the association result to the spreadsheet template to obtain the preset spreadsheet.
It should be noted that associating the entries to be configured with the configuration parameters and saving the association result to the spreadsheet template means that, after each entry to be configured has been placed in one-to-one correspondence with its configuration parameters, the mock system writes the configuration parameters into the spreadsheet template to obtain the preset spreadsheet.
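The association-and-save step can be sketched with a CSV stand-in for the Excel template (the patent's actual file format is Excel; CSV is used here only to keep the sketch self-contained). Header cells are the entries to be configured and the data row holds their configuration parameters.

```python
import csv
import io

def build_preset_sheet(entries_to_params: dict) -> str:
    """Associate each table entry with its configuration parameters and
    write them as rows of a CSV sheet, a stand-in for the Excel template."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(entries_to_params.keys())    # entries to be configured
    writer.writerow(entries_to_params.values())  # their configuration parameters
    return buf.getvalue()

sheet = build_preset_sheet({
    "case_id": "101001000000",
    "entries": '{"entry1": "value1"}',
    "exits": '[{"key1": "value1"}]',
})
```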
In this embodiment, an automated test request is received and the input-parameter data it contains is read; the output-parameter data and the case identifier corresponding to the input-parameter data are looked up in a preset spreadsheet; the corresponding target test case is found in the test case library according to the case identifier and executed to obtain a case execution result; and the output-parameter data is matched against that result to obtain the matching result. Because the output-parameter data stored in the preset spreadsheet is looked up from the acquired input-parameter data, and the case execution result of the test case is then matched against it, there is no need to obtain output parameters from an external system through a firewall-opening operation and match them; the automated test runs smoothly while test efficiency is improved.
Referring to fig. 3, fig. 3 is a flow chart of a second embodiment of the automated testing method of the present invention.
Based on the first embodiment, in this embodiment, the step S10 may include:
Step S101: receiving an automated test request, and reading the request parameters contained in the automated test request;
It should be understood that the request parameters may be field data, similar to "input parameters", carried in an automated test request. Taking an automated "user login" test as an example, the request parameters acquired by the mock system may include the four fields {"name": "zhangsan", "gender": "man", "age": 18, "addr": "China"}.
Further, to facilitate building, updating, and maintaining the automated test database (i.e., the preset spreadsheet), in this embodiment, when the Excel-based automated test case library is built, the request parameters actually received by the mock system are not all written into the input-parameter configuration of the Excel table's input-parameter columns; instead, a certain number of parameters are selected from the received request parameters and written into the Excel table as input-parameter configuration parameters.
Step S102: detecting, according to the input-parameter configuration parameters corresponding to the input-parameter columns of the preset spreadsheet, whether the request parameters contain target parameters identical to the input-parameter configuration parameters;
In a specific implementation, after obtaining the request parameters, the mock system may search the input-parameter configuration parameters corresponding to the input-parameter columns of the preset spreadsheet for those containing the request parameters, so as to detect whether the request parameters contain target parameters identical to the input-parameter configuration parameters. For example, suppose the request parameters input by the user include the six parameters A, B, C, D, E, F; the input-parameter configuration of the x-th input-parameter column of the Excel table is [M, N, R, X, Y], and that of the y-th column is [B, C, R, D, Y]. It can then be detected that the request parameters contain the target parameters B, C, D, which are identical to entries of the y-th column's input-parameter configuration.
Step S103: if the target parameters exist, acquiring the target parameters, and selecting a preset number of parameters from them as the input-parameter data.
It should be understood that, to reduce the mock system's workload when looking up the corresponding output-parameter data from the input parameters and to improve test efficiency, in this embodiment, when the mock system detects that the request parameters contain target parameters identical to the input-parameter configuration parameters, it acquires those target parameters and selects a preset number of them as the input-parameter data, for example selecting parameters B and C from the target parameters B, C, D.
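Steps S102 and S103 together amount to an intersection followed by a bounded selection, as in this sketch. Sorting the intersection keeps the selection deterministic, which is a choice of this sketch rather than a requirement of the patent.

```python
def select_entry_params(request_params, column_config, preset_count=2):
    """Detect the target parameters shared between the request parameters
    and a column's input-parameter configuration (S102), then keep a
    preset number of them as the input-parameter data (S103)."""
    targets = sorted(set(request_params) & set(column_config))
    return targets[:preset_count]

print(select_entry_params(list("ABCDEF"), ["B", "C", "R", "D", "Y"]))  # ['B', 'C']
```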
Further, the step S20 in this embodiment may include:
Step S201: searching the preset spreadsheet for the target input-parameter configuration parameters containing the input-parameter data;
In a specific implementation, after a preset number of input-parameter data have been selected from the target parameters, the mock system may search the preset spreadsheet for the target input-parameter configuration parameters containing that data; of course, the target configuration parameters may also have been determined by the mock system from the detection result when executing step S102.
Step S202: determining the case identifier and the target interface information corresponding to the input-parameter data according to the target input-parameter configuration parameters, and querying the preset spreadsheet for the corresponding output-parameter data according to the target interface information.
It should be understood that in the preset spreadsheet several different case numbers are generally configured under the "case identifier" entry, and the row of each case number carries the parameters configured for the other entries (such as input parameters, output parameters, and interface information). For example, for the case identified as "101001000000", the corresponding input parameters are '{"entry1": "value1", "entry2": "value2"}', the corresponding output parameters are '[{"key1": "value1"}, {"key2": "value2"}]', and the corresponding interface information is configured as the interface address "case entry address", the interface description "interface - normally obtains a parameter", the interface type "first type", and so on.
In a specific implementation, the mock system determines, from the target input-parameter configuration parameters found in the preset spreadsheet, the case identifier and target interface information corresponding to those parameters, and then queries the preset spreadsheet for the corresponding output-parameter data according to the interface information (for example, the interface description).
It should be noted that in practice several sets of output-parameter data may correspond to the same case identifier; this is why, in this embodiment, the output-parameter data is queried from the preset spreadsheet via the interface information, ensuring that the output-parameter data is acquired accurately.
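Querying by interface information when a case identifier is ambiguous can be sketched as below. The two rows share a case identifier, so the interface description disambiguates them; all field values are illustrative assumptions.

```python
# Rows of the preset spreadsheet as dicts (hypothetical values).
ROWS = [
    {"case_id": "101001000000", "iface_desc": "normal", "exits": [{"key1": "value1"}]},
    {"case_id": "101001000000", "iface_desc": "timeout", "exits": [{"err": "timeout"}]},
]

def query_exit_params(rows, iface_desc):
    """Query output-parameter data by interface description (step S202)."""
    for row in rows:
        if row["iface_desc"] == iface_desc:
            return row["exits"]
    return None

print(query_exit_params(ROWS, "timeout"))  # [{'err': 'timeout'}]
```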
In this embodiment, an automated test request is received and the request parameters it contains are read; according to the input-parameter configuration parameters corresponding to the input-parameter columns of the preset spreadsheet, it is detected whether the request parameters contain target parameters identical to those configuration parameters; if so, the target parameters are acquired and a preset number of them are selected as the input-parameter data. The preset spreadsheet is then searched for the target input-parameter configuration parameters containing the input-parameter data; the case identifier and target interface information corresponding to the input parameters are determined from the target configuration parameters; and the corresponding output-parameter data is queried from the preset spreadsheet according to the target interface information. This reduces the mock system's workload while ensuring that the output-parameter data is acquired accurately.
Referring to fig. 4, fig. 4 is a flow chart of a third embodiment of the automated testing method of the present invention.
Based on the above embodiments, in this embodiment, the step S30 specifically includes:
Step S301: searching for a corresponding target test case in a test case library according to the case identifier, wherein the target test case comprises a plurality of interface codes to be tested;
In a specific implementation, the mock system obtains the storage path of the target test case according to a pre-established association between case identifiers and test case storage paths, and then acquires the target test case from that storage path. The target test case comprises a plurality of interface codes to be tested, and each section of interface code to be tested carries an interface identifier that distinguishes it from the other sections.
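The path-based lookup can be sketched like this. The names and the JSON on-disk format of a test case are assumptions made for the example, not taken from the patent.

```python
import json
import os
import tempfile

def load_target_case(case_id, path_index):
    """Resolve the storage path registered for a case identifier, then load the case."""
    path = path_index[case_id]  # pre-established association: case id -> storage path
    with open(path, encoding="utf-8") as fh:
        return json.load(fh)

# Demo: register one test case on disk and fetch it by its identifier.
case = {"case_id": "101001000000",
        "segments": [{"iface_id": "jk-0a01", "code": "..."}]}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as tmp:
    json.dump(case, tmp)
loaded = load_target_case("101001000000", {"101001000000": tmp.name})
os.unlink(tmp.name)
```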
Step S302: reading interface information corresponding to the case identifier from the preset spreadsheet, and determining the test order of the interface codes to be tested according to the read interface information;
It should be noted that, to ensure the automation of code testing, a developer may write in advance, into the interface identifier contained in the interface information, a character code representing the test order, so that the mock system can subsequently determine the test order of each section of interface code to be tested from the character code carried in its interface identifier.
In a specific implementation, the mock system can read the interface information corresponding to the case identifier from the preset spreadsheet to obtain the interface identifiers contained in the interface information; sort the interface identifiers according to the character codes they carry to obtain an interface identifier sequence; and determine the test order of the corresponding interface codes to be tested from that sequence. For example, suppose the target test case includes three sections of interface code to be tested, a, b, and c, whose interface identifiers are jk-0a01, jk-0b02, and jk-0c03 respectively. The identifiers can then be sorted according to the character codes "0a01", "0b02", and "0c03" to obtain the sequence jk-0a01 → jk-0b02 → jk-0c03, from which the test order of the three sections a, b, and c is determined.
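The ordering in the example above can be reproduced with a simple sort on the character code that follows the prefix. This is a sketch; the "jk-" prefix and code layout come from the example only, not from any fixed specification.

```python
def order_interfaces(identifiers):
    """Sort interface identifiers by the character code after the prefix."""
    return sorted(identifiers, key=lambda ident: ident.split("-", 1)[1])

# The three identifiers from the example, deliberately out of order.
seq = order_interfaces(["jk-0c03", "jk-0a01", "jk-0b02"])
print(" -> ".join(seq))  # jk-0a01 -> jk-0b02 -> jk-0c03
```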
Step S303: testing the interface codes to be tested according to the test order, and taking the test result as the case execution result.
In a specific implementation, after determining the test order of each interface code to be tested, the mock system tests the interface codes to be tested in that order, obtains the test result, and takes it as the execution result of the whole target test case.
In this embodiment, the corresponding target test case is found in the test case library according to the case identifier, the interface information corresponding to the case identifier is read from the preset spreadsheet, and the test order of the interface codes to be tested is determined from the read interface information; the interface codes to be tested are then tested in that order. This ensures the smooth execution of the test case and prevents the finally obtained test data from being inconsistent with the actual test data.
In addition, an embodiment of the present invention further provides a storage medium on which an automated test program is stored; when executed by a processor, the automated test program implements the steps of the automated testing method described above.
Referring to fig. 5, fig. 5 is a block diagram of a first embodiment of the automated testing apparatus of the present invention.
As shown in fig. 5, an automated testing apparatus according to an embodiment of the present invention includes:
A request response module 501, configured to receive an automated test request and read the input parameter data contained in the automated test request;
A data query module 502, configured to search a preset spreadsheet for the output parameter data and the case identifier corresponding to the input parameter data;
A case execution module 503, configured to find the corresponding target test case in a test case library according to the case identifier, and execute the target test case to obtain a case execution result;
A result matching module 504, configured to perform data matching on the output parameter data according to the case execution result and obtain a matching result.
In this embodiment, an automated test request is received and the input parameter data contained in it is read; the output parameter data and the case identifier corresponding to the input parameter data are searched for in a preset spreadsheet; the corresponding target test case is found in the test case library according to the case identifier and executed to obtain a case execution result; and data matching is performed on the output parameter data according to the case execution result to obtain the matching result. Because the output parameter data stored in the preset spreadsheet is looked up from the acquired input parameter data, and the case execution result of the test case is then matched against that output parameter data, there is no need to obtain the output parameter data from an external system through a firewall-opening operation and match it, which ensures the smooth running of the automated test while improving test efficiency.
Based on the first embodiment of the automated testing apparatus of the present invention, a second embodiment of the automated testing apparatus is provided.
In this embodiment, the automated testing apparatus further includes a form configuration module, configured to: receive a parameter configuration instruction and acquire the corresponding spreadsheet template according to the template identifier contained in the instruction; receive the entries to be configured that are input based on the spreadsheet template and the configuration parameters corresponding to those entries, the entries to be configured including the case identifier, the input parameter entry, the output parameter entry, and the interface information; and associate the entries to be configured with the configuration parameters and store the association result in the spreadsheet template to obtain the preset spreadsheet.
Further, the request response module 501 is further configured to: receive an automated test request and read the request parameters contained in it; detect, according to the input parameter configuration corresponding to the input parameter entry in the preset spreadsheet, whether the request parameters contain target parameters identical to those in the input parameter configuration; and, if so, acquire the target parameters and select a preset number of them as the input parameter data.
Further, the data query module 502 is further configured to: search the preset spreadsheet for the target input parameter configuration containing the input parameter data; determine the case identifier and the target interface information corresponding to the input parameter data according to the target input parameter configuration; and query the corresponding output parameter data in the preset spreadsheet according to the target interface information.
Further, the case execution module 503 is further configured to: find the corresponding target test case in a test case library according to the case identifier, the target test case comprising a plurality of interface codes to be tested; read the interface information corresponding to the case identifier from the preset spreadsheet and determine the test order of the interface codes to be tested from the read interface information; and test the interface codes to be tested in that order, taking the test result as the case execution result.
Further, the case execution module 503 is further configured to: read the interface information corresponding to the case identifier from the preset spreadsheet and obtain the interface identifiers contained in the interface information; sort the interface identifiers according to the character codes they carry to obtain an interface identifier sequence; and determine the test order of the interface codes to be tested corresponding to the interface identifiers from that sequence.
Further, the result matching module 504 is further configured to: acquire the test data corresponding to the case execution result and detect whether all of the data contained in the output parameter data are present in the test data; if so, determine that the test data and the output parameter data match successfully, and otherwise determine that the match fails.
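The containment check described for the result matching module can be sketched as follows. This is a minimal sketch; the data shapes are illustrative assumptions.

```python
def match_output(out_params, test_data):
    """Match succeeds only if every expected output entry appears in the test data."""
    return all(item in test_data for item in out_params)

expected = [{"key1": "value1"}, {"key2": "value2"}]
produced = [{"key1": "value1"}, {"key2": "value2"}, {"extra": 0}]
print(match_output(expected, produced))      # True: all expected entries present
print(match_output(expected, produced[:1]))  # False: the "key2" entry is missing
```

Note that extra entries in the test data do not cause a failure; only a missing expected entry does, which mirrors the "all data contained in the output parameter data exist in the test data" rule above.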
For other embodiments or specific implementations of the automated testing apparatus of the present invention, reference may be made to the above method embodiments; they are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises that element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or of course by hardware alone, although in many cases the former is preferred. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., read-only memory/random-access memory, magnetic disk, optical disk) and comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the method according to the embodiments of the present invention.
The foregoing description covers only the preferred embodiments of the present invention and is not intended to limit its scope; any equivalent structure or equivalent process derived from the contents of this specification and the accompanying drawings, whether used directly or indirectly in other related technical fields, likewise falls within the scope of patent protection of the present invention.

Claims (5)

1. An automated testing method, the method comprising:
receiving an automated test request, and reading input parameter data contained in the automated test request;
searching, in a preset spreadsheet, for output parameter data and a case identifier corresponding to the input parameter data;
searching for a corresponding target test case in a test case library according to the case identifier, and executing the target test case to obtain a case execution result;
performing data matching on the output parameter data according to the case execution result to obtain a matching result;
wherein, before the step of receiving an automated test request and reading the input parameter data contained in the automated test request, the method comprises:
receiving a parameter configuration instruction, and acquiring a corresponding spreadsheet template according to a template identifier contained in the parameter configuration instruction;
receiving entries to be configured that are input based on the spreadsheet template and configuration parameters corresponding to the entries to be configured, wherein the entries to be configured comprise a case identifier, an input parameter entry, an output parameter entry, and interface information;
associating the entries to be configured with the configuration parameters, and storing the association result in the spreadsheet template to obtain the preset spreadsheet;
the step of receiving an automated test request and reading the input parameter data contained in the automated test request comprises:
receiving an automated test request, and reading request parameters contained in the automated test request;
detecting, according to an input parameter configuration corresponding to an input parameter entry in the preset spreadsheet, whether the request parameters contain target parameters identical to those in the input parameter configuration;
if the target parameters exist, acquiring the target parameters, and selecting a preset number of parameters from the target parameters as the input parameter data;
the step of searching, in a preset spreadsheet, for the output parameter data and the case identifier corresponding to the input parameter data comprises:
searching the preset spreadsheet for a target input parameter configuration containing the input parameter data;
determining a case identifier and target interface information corresponding to the input parameter data according to the target input parameter configuration, and querying the corresponding output parameter data in the preset spreadsheet according to the target interface information;
the step of searching for the corresponding target test case in the test case library according to the case identifier and executing the target test case to obtain a case execution result comprises:
searching for a corresponding target test case in a test case library according to the case identifier, wherein the target test case comprises a plurality of interface codes to be tested;
reading interface information corresponding to the case identifier from the preset spreadsheet, and determining a test order of the interface codes to be tested according to the read interface information;
testing the interface codes to be tested according to the test order, and taking the test result as the case execution result;
the step of reading the interface information corresponding to the case identifier from the preset spreadsheet and determining the test order of the interface codes to be tested according to the read interface information comprises:
reading the interface information corresponding to the case identifier from the preset spreadsheet, and obtaining the interface identifiers contained in the interface information;
sorting the interface identifiers according to character codes carried in the interface identifiers to obtain an interface identifier sequence;
and determining the test order of the interface codes to be tested corresponding to the interface identifiers according to the interface identifier sequence.
2. The method of claim 1, wherein the step of performing data matching on the output parameter data according to the case execution result to obtain a matching result comprises:
acquiring test data corresponding to the case execution result, and detecting whether all of the data contained in the output parameter data are present in the test data;
if so, determining that the test data and the output parameter data match successfully, and otherwise determining that the match fails.
3. An automated testing apparatus, the apparatus comprising:
a request response module, configured to receive an automated test request and read input parameter data contained in the automated test request;
a data query module, configured to search a preset spreadsheet for output parameter data and a case identifier corresponding to the input parameter data;
a case execution module, configured to find a corresponding target test case in a test case library according to the case identifier, and execute the target test case to obtain a case execution result;
a result matching module, configured to perform data matching on the output parameter data according to the case execution result and obtain a matching result;
wherein the request response module is further configured to receive a parameter configuration instruction and acquire a corresponding spreadsheet template according to a template identifier contained in the parameter configuration instruction;
receive entries to be configured that are input based on the spreadsheet template and configuration parameters corresponding to the entries to be configured, wherein the entries to be configured comprise a case identifier, an input parameter entry, an output parameter entry, and interface information;
and associate the entries to be configured with the configuration parameters, and store the association result in the spreadsheet template to obtain the preset spreadsheet;
The request response module is specifically configured to:
receive an automated test request, and read request parameters contained in the automated test request;
detect, according to an input parameter configuration corresponding to an input parameter entry in the preset spreadsheet, whether the request parameters contain target parameters identical to those in the input parameter configuration;
if the target parameters exist, acquire the target parameters, and select a preset number of parameters from the target parameters as the input parameter data;
wherein the data query module is specifically configured to:
search the preset spreadsheet for a target input parameter configuration containing the input parameter data;
and determine a case identifier and target interface information corresponding to the input parameter data according to the target input parameter configuration, and query the corresponding output parameter data in the preset spreadsheet according to the target interface information;
wherein the case execution module is specifically configured to:
search for a corresponding target test case in a test case library according to the case identifier, wherein the target test case comprises a plurality of interface codes to be tested;
read interface information corresponding to the case identifier from the preset spreadsheet, and determine a test order of the interface codes to be tested according to the read interface information;
test the interface codes to be tested according to the test order, and take the test result as the case execution result;
read the interface information corresponding to the case identifier from the preset spreadsheet, and obtain the interface identifiers contained in the interface information;
sort the interface identifiers according to character codes carried in the interface identifiers to obtain an interface identifier sequence;
and determine the test order of the interface codes to be tested corresponding to the interface identifiers according to the interface identifier sequence.
4. Automated test equipment, the equipment comprising: a memory, a processor, and an automated test program stored on the memory and executable on the processor, wherein the automated test program is configured to implement the steps of the automated testing method of any one of claims 1 to 2.
5. A storage medium having an automated test program stored thereon, wherein the automated test program, when executed by a processor, implements the steps of the automated testing method of any one of claims 1 to 2.
CN201811238787.XA 2018-10-23 2018-10-23 Automatic test method, device, equipment and storage medium Active CN109634837B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811238787.XA CN109634837B (en) 2018-10-23 2018-10-23 Automatic test method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811238787.XA CN109634837B (en) 2018-10-23 2018-10-23 Automatic test method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109634837A CN109634837A (en) 2019-04-16
CN109634837B true CN109634837B (en) 2024-05-28

Family

ID=66066591

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811238787.XA Active CN109634837B (en) 2018-10-23 2018-10-23 Automatic test method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109634837B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111061637B (en) * 2019-12-13 2023-08-18 广州品唯软件有限公司 Interface testing method, interface testing device and storage medium
CN111190093A (en) * 2020-01-10 2020-05-22 上海知白智能科技有限公司 Chip testing method and device
CN111427778A (en) * 2020-03-18 2020-07-17 中国平安人寿保险股份有限公司 Test method, test device, terminal equipment and storage medium
CN111382081B (en) * 2020-03-27 2023-04-25 中国建设银行股份有限公司 Entry verification test method and device
CN113535538B (en) * 2020-04-21 2023-06-16 网联清算有限公司 Method, device, electronic equipment and storage medium for automatically testing application full link
CN111581083B (en) * 2020-04-26 2024-02-09 抖音视界有限公司 Interface testing method and device, electronic equipment and storage medium
CN112000582A (en) * 2020-08-31 2020-11-27 深圳市奇虎智能科技有限公司 Server-side automatic test early warning method, device, equipment and storage medium
CN112052182A (en) * 2020-09-27 2020-12-08 平安信托有限责任公司 Interface automation test method and device, computer equipment and storage medium
CN112100079B (en) * 2020-11-02 2022-04-12 北京淇瑀信息科技有限公司 Test method and system based on simulation data calling and electronic equipment
CN112711538B (en) * 2020-12-30 2024-03-19 招商局金融科技有限公司 Test system simulation method, device, computer equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9111030B1 (en) * 2008-10-03 2015-08-18 Federal Home Loan Mortgage Corporation Systems and methods for testing a software application
CN106528393A (en) * 2015-09-09 2017-03-22 北京京东尚科信息技术有限公司 Method and device for Mock testing of WebService
CN107665171A (en) * 2017-10-11 2018-02-06 中国民生银行股份有限公司 Automatic regression test method and device
CN107861870A (en) * 2017-11-02 2018-03-30 平安科技(深圳)有限公司 Interface testing and test data generating method, device, terminal and storage medium


Also Published As

Publication number Publication date
CN109634837A (en) 2019-04-16

Similar Documents

Publication Publication Date Title
CN109634837B (en) Automatic test method, device, equipment and storage medium
CN107656874B (en) Interface testing method and device, simulation baffle and system
CN110413506B (en) Test case recommendation method, device, equipment and storage medium
CN109117363B (en) Test case generation method and device and server
CN108563768B (en) Data conversion method, device, equipment and storage medium for different data models
CN108984389B (en) Application program testing method and terminal equipment
CN110955409B (en) Method and device for creating resources on cloud platform
US20140113257A1 (en) Automated evaluation of programming code
CN110321284B (en) Test data entry method, device, computer equipment and storage medium
CN111475390A (en) Log collection system deployment method, device, equipment and storage medium
CN109614325B (en) Method and device for determining control attribute, electronic equipment and storage medium
CN113448862B (en) Software version testing method and device and computer equipment
CN112052169A (en) Test management method, system, device and computer readable storage medium
CN111367531B (en) Code processing method and device
CN113704110A (en) Automatic testing method and device for user interface
CN111736951A (en) Simulation method for automatic driving, computer device, and storage medium
CN111427784A (en) Data acquisition method, device, equipment and storage medium
CN117493188A (en) Interface testing method and device, electronic equipment and storage medium
CN111045720B (en) Code management method, code management system, server and medium
CN112988578A (en) Automatic testing method and device
CN110688173B (en) Positioning method and device of components in cross-platform interface framework and electronic equipment
CN112433935A (en) Test method, test device, electronic equipment and storage medium
CN112114866A (en) Data conversion loading method and device of JSON file and storage medium
CN113158177A (en) Dynamic measurement method, device, equipment and storage medium
CN110717315A (en) System data batch modification method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant