CN115599683A - Automatic testing method, device, equipment and storage medium

Automatic testing method, device, equipment and storage medium

Info

Publication number
CN115599683A
Authority
CN
China
Prior art keywords
test
layer
script
result
tested
Prior art date
Legal status
Pending
Application number
CN202211321519.0A
Other languages
Chinese (zh)
Inventor
Wang Shanshan (王闪闪)
Current Assignee
Ping An Bank Co Ltd
Original Assignee
Ping An Bank Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Bank Co Ltd filed Critical Ping An Bank Co Ltd
Priority to CN202211321519.0A
Publication of CN115599683A
Status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; error correction; monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites
    • G06F 11/07: Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F 11/0703: Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
    • G06F 11/079: Root cause analysis, i.e. error or fault diagnosis
    • G06F 11/3692: Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to an automated testing method and apparatus, an electronic device, and a computer-readable storage medium, wherein the method includes: constructing a test requirement analysis model; identifying, based on the test requirement analysis model and attribute information of an element to be tested, a plurality of test layers corresponding to the element to be tested; configuring a test scheme according to the test layers, wherein the test scheme includes a test script and a test strategy corresponding to each test layer; and calling the test script corresponding to each test layer and running it through that test layer to test the layer and obtain a test result. In this way, the development and maintenance cost of the automated test system is greatly reduced, developers are freed from manual labor, tedious operation steps are eliminated, development time is saved, and test efficiency is greatly improved.

Description

Automatic testing method, device, equipment and storage medium
Technical Field
The invention relates to the field of computer technology, and in particular to an automated testing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
Testing is an important part of the computer software development process: it safeguards software quality, ensures that the software delivered to users runs normally and reliably, and improves the user experience. Because testing usually must be completed within the software's development cycle, improving test efficiency is significant for shortening that cycle.
At present, automating the testing of computer software is a recognized means of improving test efficiency in the computer field. However, test automation currently depends heavily on the software's test environment. For example, a test environment usually involves a plurality of systems; if a functional point spanning those systems is to be tested, a large number of automated test scripts must be written for each system involved in order to complete the functional point's data link before the automated test can be performed. Because automated test scripts can only be developed, maintained, and executed by professional testers, and require much of their time, the development and maintenance cost is high.
Therefore, how to reduce the cost of automated testing has become a technical problem urgently to be solved.
Disclosure of Invention
The invention mainly aims to provide an automated testing method and apparatus, a computer device, and a storage medium, so as to solve the technical problem of the high development and maintenance cost of automated testing.
In order to achieve the above object, the present invention provides an automated testing method, wherein the method includes: constructing a test requirement analysis model; identifying, based on the test requirement analysis model and attribute information of an element to be tested, a plurality of test layers corresponding to the element to be tested; configuring a test scheme according to the test layers, wherein the test scheme includes a test script and a test strategy corresponding to each test layer; and calling the test script corresponding to each test layer, and running the corresponding test script through the test layer to test the test layer and obtain a test result.
In one embodiment, the constructing of the test requirement analysis model includes: acquiring requirement factor data, wherein the requirement factor data includes attribute information and layering results of a plurality of elements; quantizing each piece of attribute information in the requirement factor data to obtain a quantized value of each piece of attribute information; taking the quantized values and the layering results of the attribute information as input to construct a machine learning model; and performing supervised training on the machine learning model with a requirement factor data training sample set, iteratively optimizing the influence factor of each piece of attribute information, to obtain the test requirement analysis model.
In one embodiment, the identifying of a plurality of test layers corresponding to the element to be tested based on the test requirement analysis model and the attribute information of the element to be tested includes: inputting the attribute information of the element to be tested into the test requirement analysis model; and the test requirement analysis model outputting a layering result, wherein the layering result includes a plurality of test layers corresponding to the element to be tested.
In one embodiment, the method further includes: acquiring code data corresponding to a test layer and test data generated in a test; generating header information of a test script template according to the code data corresponding to the test layer and preset request information; generating request information of the test script template according to the code data corresponding to the test layer and the test data generated in the test; and generating a test script according to the header information and the request information of the test script template.
In one embodiment, the test layers sequentially include a unit test layer, an API test layer, and a UI test layer, and the calling of the test script corresponding to the test layer and the running of the corresponding test script through the test layer to test the test layer and obtain a test result includes: calling a unit test script, scanning whether the unit test layer has executed the unit test, executing the unit test if not, and outputting a unit test result; calling an API test script, creating an API interface according to test information in the API test layer, registering the API interface according to a preset interface specification, testing and verifying the registered API interface, and outputting an API test result; and calling a UI test script to perform a UI test on the UI test layer, acquiring an execution result fed back by the UI test layer and comparing it with an expected result, wherein the UI test passes if the comparison is consistent and fails if it is inconsistent, and outputting the UI test result.
In one embodiment, after the calling of the test script corresponding to the test layer and the running of the corresponding test script through the test layer to test the test layer and obtain a test result, the method further includes: acquiring at least one solution corresponding to the vulnerability information in the test result and sending the at least one solution to a user terminal; and receiving a target solution screened by the user terminal from the at least one solution, and repairing the vulnerability according to the target solution.
In order to achieve the above object, the present invention provides an automated testing apparatus, wherein the apparatus comprises a distribution test module, and the distribution test module comprises the following units: a construction unit for constructing a test requirement analysis model; an identification unit for identifying, based on the test requirement analysis model and attribute information of an element to be tested, a plurality of test layers corresponding to the element to be tested; a configuration unit for configuring a test scheme according to the test layers, wherein the test scheme includes a test script and a test strategy corresponding to each test layer; and a calling unit for calling the test script corresponding to each test layer and running the corresponding test script through the test layer to test the test layer and obtain a test result.
In one embodiment, the apparatus further comprises: the basic test management module is used for managing basic data of the whole automatic test process; and the special test management module is used for carrying out targeted test on the elements to be tested, and the targeted test comprises at least one of a performance test and an interface robustness test.
To achieve the above object, the present invention provides a computer device, wherein the computer device includes: a memory storing at least one instruction; and a processor executing the instructions stored in the memory to implement some or all of the steps in the automated testing method in the above method embodiments.
To achieve the above object, the present invention provides a computer-readable storage medium, wherein at least one instruction is stored in the computer-readable storage medium, and the at least one instruction is executed by a processor in a computer device to implement part or all of the steps in the automated testing method in the above method embodiment.
Compared with the prior art, the embodiments of the invention have the following beneficial effects:
According to the embodiments of the invention, by introducing the test requirement analysis model, a plurality of test layers corresponding to the element to be tested can be identified, and automated testing is performed by calling the test scripts corresponding to those layers. In this process, if a test fails, the cause of the failure can be located precisely and with strong pertinence, without having to search the entire element to be tested, which effectively improves the efficiency of locating test failures. Problems are exposed and resolved early, since layered testing guarantees both speed and early feedback. The time spent finding and fixing bugs is shortened, saving time and cost; the test requirement analysis model reduces the difficulty of developing automated tests; and quality assurance automated at different levels achieves automated test coverage. Together, these greatly reduce the development and maintenance cost of the automated test system, free developers from manual labor, eliminate tedious operation steps, save development time, and greatly improve test efficiency.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are merely some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an application environment of an automated testing method according to an embodiment of the present invention;
FIG. 2 is a flowchart of an automated testing method according to a second embodiment of the present invention;
FIG. 3 is a flowchart of an automated testing method according to a third embodiment of the present invention;
FIG. 4 is a flowchart of an automated testing method according to a fourth embodiment of the present invention;
FIG. 5 is a flowchart of an automated testing method according to a fifth embodiment of the present invention;
FIG. 6 is a flowchart of an automated testing method according to a sixth embodiment of the present invention;
FIG. 7 is a flowchart of an automated testing method according to a seventh embodiment of the present invention;
FIG. 8 is a schematic structural diagram of an automated testing apparatus according to an eighth embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a computer device according to a ninth embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining," "in response to determining," "upon detecting [the described condition or event]," or "in response to detecting [the described condition or event]."
Furthermore, in the description of the present invention and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present invention. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather mean "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
It should be understood that, the sequence numbers of the steps in the following embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
The automated testing method provided by the embodiments of the present invention can be applied to the application environment shown in FIG. 1, in which a client communicates with a server. The client includes, but is not limited to, palmtop computers, desktop computers, notebook computers, ultra-mobile personal computers (UMPC), netbooks, cloud terminal devices, personal digital assistants (PDA), and other computer devices. The server may be implemented as an independent server or as a server cluster composed of a plurality of servers.
Referring to FIG. 2, a flowchart of an automated testing method according to a second embodiment of the present invention is shown; the automated testing method can be applied to the client in FIG. 1. As shown in FIG. 2, the automated testing method may include the following steps:
step S201, a test requirement analysis model is constructed.
In this step, the test requirement analysis model is a model obtained through machine learning training; during training it learns the correspondence between the attribute information of an element and the element's layering result. The test requirement analysis model can then predict a layering result from the input attribute information of an element to be tested.
Further, the test requirement analysis model can be constructed as a neural network and trained on a requirement factor data training sample set. The topology of the neural network consists of an input layer, an intermediate layer, and an output layer, where the intermediate layer, also called the hidden layer, may comprise one or more layers. The input layer is the only data entry point of the whole network; it defines the different types of data input and facilitates the quantization processing performed elsewhere. The hidden layer performs nonlinear processing on the data from the input layer; fitting the input nonlinearly on the basis of an activation function effectively ensures the predictive capability of the model. The output layer is the only output of the whole model after the hidden layer and represents the result of the hidden layer's processing.
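For illustration only, a minimal Python sketch of such a topology, assuming scikit-learn; the attribute features and layer labels below are hypothetical:

```python
# Hypothetical quantized attributes per element, e.g.
# [has_backend_logic, code_complexity, has_ui, external_interfaces].
import numpy as np
from sklearn.neural_network import MLPClassifier

X = np.array([
    [0.2, 0.8, 0.0, 0.6],   # e.g. a backend service component
    [0.9, 0.1, 1.0, 0.3],   # e.g. a web page component
])
# Layering results as multi-label targets over
# [unit test layer, API test layer, UI test layer].
y = np.array([
    [1, 1, 0],
    [1, 0, 1],
])

# Input layer -> one hidden (intermediate) layer with a nonlinear
# activation -> output layer, as described above.
model = MLPClassifier(hidden_layer_sizes=(16,),
                      activation="relu",
                      max_iter=2000,
                      random_state=0)
model.fit(X, y)

# Predict the test layers for a new element's quantized attributes.
print(model.predict(np.array([[0.5, 0.5, 0.8, 0.4]])))
```

A multi-label output is used here because, as described above, one element to be tested may map to several test layers at once.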
Step S202, based on the test requirement analysis model and the attribute information of the elements to be tested, a plurality of test layers corresponding to the elements to be tested are identified.
In this embodiment, the element to be tested may be an application program, a user interface (UI), a web page, an operating system, or program code. The test requirement analysis model can divide the element to be tested into a plurality of test layers according to its architecture, and for different test layers different test scripts can be called to perform different functional tests.
Step S203, configuring a test scheme according to the test layer, wherein the test scheme comprises a test script and a test strategy corresponding to the test layer.
In this embodiment, a test script is a series of instructions for a particular test that can be executed by the automated testing tool, that is, computer-readable instructions that automate a test procedure (or a portion of one). The test strategy includes the types and number of test scripts to execute, the trigger condition for executing each test script, and the like.
In this embodiment, the test scripts within each test scheme run concurrently according to their respective test strategies, while the test schemes themselves run sequentially in a preset execution order. The test scripts are called or started according to the execution duration, the script types, and the per-script trigger conditions defined in the test strategy. The execution state of a test script may be: passed, queued, blocked, skipped, failed, alerted, closed, under inspection, allocated and assigned, resolved, ended, non-reproducible, rejected, or not applicable.
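For illustration only, a minimal Python sketch of such a test scheme and strategy, in which scripts inside a scheme run concurrently and the schemes run sequentially; all field and callback names are assumptions:

```python
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class TestScript:
    name: str
    run: Callable[[], str]                       # returns an execution state
    trigger: Callable[[], bool] = lambda: True   # trigger condition

@dataclass
class TestScheme:
    layer: str
    scripts: List[TestScript] = field(default_factory=list)

def run_scheme(scheme: TestScheme) -> Dict[str, str]:
    # Scripts within one scheme run concurrently, per their strategy.
    with ThreadPoolExecutor() as pool:
        futures = {s.name: pool.submit(s.run)
                   for s in scheme.scripts if s.trigger()}
        return {name: f.result() for name, f in futures.items()}

def run_schemes(schemes: List[TestScheme]) -> Dict[str, Dict[str, str]]:
    # The schemes themselves run sequentially in a preset order.
    return {scheme.layer: run_scheme(scheme) for scheme in schemes}
```

The per-script trigger callback stands in for the trigger conditions mentioned above; in practice it could check timing, dependencies, or prior results.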
Step S204, calling the test script corresponding to the test layer, and running the corresponding test script through the test layer to test the test layer and obtain a test result.
In this embodiment, the test verifies each function of the test layer and checks whether the element to be tested provides the functions required by the user, for example: whether each link in a web page leads to its corresponding page, whether pages switch correctly, and whether buttons in the pages for updating, cancelling, deleting, saving, and the like operate normally.
According to the embodiments of the invention, by introducing the test requirement analysis model, a plurality of test layers corresponding to the element to be tested can be identified, and automated testing is performed by calling the test scripts corresponding to those layers. In this process, if a test fails, the cause of the failure can be located precisely and with strong pertinence, without having to search the entire element to be tested, which effectively improves the efficiency of locating test failures. Problems are exposed and resolved early, since layered testing guarantees both speed and early feedback. The time spent finding and fixing bugs is shortened, saving time and cost; the test requirement analysis model reduces the difficulty of developing automated tests; and quality assurance automated at different levels achieves automated test coverage. Together, these greatly reduce the development and maintenance cost of the automated test system, free developers from manual labor, eliminate tedious operation steps, save development time, and greatly improve test efficiency.
Referring to FIG. 3, which is a schematic flowchart of an automated testing method provided by the third embodiment of the present invention, step S201 includes:
step S301, obtaining demand factor data.
Wherein the demand factor data includes attribute information and a layering result of a plurality of elements.
Step S302, performing quantization processing on each attribute information in the demand factor data, and obtaining a quantized value of each attribute information.
Step S303, the quantized values and the layering results of the attribute information are used as input, and a machine learning model is constructed and obtained.
Step S304, supervised training is carried out on the machine learning model by adopting a requirement factor data training sample set, influence factors of each attribute information are iteratively optimized, and the test requirement analysis model is obtained.
In this embodiment, the algorithms for the supervised training of the machine learning model may include: the k-nearest neighbor algorithm, decision trees, naive Bayes, and logistic regression.
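For illustration only, a Python sketch that wraps each of the listed algorithms for the multi-layer targets, assuming scikit-learn; class balance, feature scaling, and the iterative optimization of influence factors are omitted:

```python
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import MultiOutputClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

CANDIDATES = {
    "k-nearest neighbors": KNeighborsClassifier(n_neighbors=3),
    "decision tree": DecisionTreeClassifier(max_depth=5),
    "naive Bayes": GaussianNB(),
    "logistic regression": LogisticRegression(max_iter=1000),
}

def train_candidates(X, y):
    # X: quantized attribute values; y: binary layer-membership matrix.
    # MultiOutputClassifier fits one copy of the estimator per test layer.
    return {name: MultiOutputClassifier(estimator).fit(X, y)
            for name, estimator in CANDIDATES.items()}
```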
Referring to FIG. 4, which is a schematic flowchart of an automated testing method according to a fourth embodiment of the present invention, step S202 includes:
step S401, inputting the attribute information of the element to be tested into the test requirement analysis model.
Step S402, the test requirement analysis model outputs a layering result, and the layering result comprises a plurality of test layers corresponding to the elements to be tested.
In this embodiment, different test scripts can be called through different test layers to perform different functional tests. After a plurality of test layers corresponding to the elements to be tested are identified, the terminal calls test scripts corresponding to the test layers through the automatic test platform. The test scripts corresponding to different test layers are different.
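For illustration only, a minimal Python sketch of such a layer-to-script dispatch; the layer names and script paths are assumptions:

```python
# Hypothetical mapping from predicted test layers to their scripts.
LAYER_SCRIPTS = {
    "unit": "scripts/run_unit_tests.py",
    "api": "scripts/run_api_tests.py",
    "ui": "scripts/run_ui_tests.py",
}

def scripts_for(layering_result):
    """layering_result: iterable of layer names output by the model."""
    return [LAYER_SCRIPTS[layer] for layer in layering_result
            if layer in LAYER_SCRIPTS]

# e.g. scripts_for(["unit", "ui"]) ->
#   ["scripts/run_unit_tests.py", "scripts/run_ui_tests.py"]
```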
Referring to FIG. 5, a schematic flowchart of an automated testing method according to a fifth embodiment of the present invention is shown, where the method further includes:
step S501, obtaining code data corresponding to the test layer and test data generated in the test.
In this embodiment, code data refers to source files written in a language supported by a development tool: a set of rules in which the test flow is expressed discretely as character strings, where the characters may be English characters, symbols, and the like. A development tool is dedicated software, such as a software package, software framework, hardware platform, or operating system, used to build the application software of the test process.
Step S502, generating header information of the test script template according to the code data corresponding to the test layer and preset request information.
In this embodiment, the header information is divided into three parts: user-defined information, request header management information, and request default information. User-defined information is information entered by the user that relates to the performance test, such as the version, the test time, and user-defined variables. Request header management information is the content of the request headers sent when accessing the system during the performance test; it records information about the requesting user terminal, such as the character sets acceptable to the browser and the text length of the test request message, so as to ensure the correctness of the request. Request default information comprises the defaults used for requests during the performance test, such as the request access path, port, and protocol. The requests may be HTTP (HyperText Transfer Protocol) requests.
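For illustration only, a minimal Python sketch that assembles the three parts of the header information; the keys and default values are assumptions:

```python
def build_header(code_meta: dict, preset: dict) -> dict:
    """Assemble the three header parts from code data and preset info."""
    return {
        "user_defined": {                      # version, test time, variables
            "version": code_meta.get("version", "1.0"),
            "test_time": preset.get("test_time"),
            "variables": preset.get("variables", {}),
        },
        "request_headers": {                   # request header management info
            "Accept-Charset": "utf-8",
            "User-Agent": preset.get("user_agent", "perf-test-client"),
        },
        "request_defaults": {                  # path/port/protocol defaults
            "protocol": preset.get("protocol", "http"),
            "host": preset.get("host", "localhost"),
            "port": preset.get("port", 8080),
            "base_path": preset.get("base_path", "/"),
        },
    }
```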
Step S503, generating the request information of the test script template according to the code data corresponding to the test layer and the test data generated in the test.
In this embodiment, the request information refers to the specific content of a test request, for example initiating a login request to the web server while testing the web page under test, or clicking a button to jump to a specified web page. A performance test generates a plurality of test requests, and a given test request may be sent only once or may be sent multiple times.
Step S504, generating a test script according to the header information and the request information of the test script template.
In this embodiment, a test script template is generated from the generated header information and the preset request information, and a test script is then generated from the test script template.
In this embodiment, the header information of the test script template is generated from the code data corresponding to the test flow and the preset request information, and the request information of the test script template is generated from the code data corresponding to the test flow and the test data generated in the test. Since no manual input is needed before each performance test, the time spent writing test scripts is greatly reduced and test efficiency is improved.
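For illustration only, a minimal Python sketch of step S504, rendering the header information and request information into an executable script; the template format is an assumption:

```python
SCRIPT_TEMPLATE = '''\
# Auto-generated test script; user-defined header info: {user!r}
import requests

HEADERS = {headers!r}
REQUESTS = {reqs!r}

def run():
    for req in REQUESTS:
        resp = requests.request(req["method"], req["url"],
                                headers=HEADERS, json=req.get("body"))
        assert resp.status_code == req.get("expect_status", 200)

if __name__ == "__main__":
    run()
'''

def generate_script(header: dict, request_info: list) -> str:
    # header: the three-part header information built earlier; request_info:
    # a list of dicts such as one with keys "method", "url", and "body".
    return SCRIPT_TEMPLATE.format(user=header["user_defined"],
                                  headers=header["request_headers"],
                                  reqs=request_info)
```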
Referring to FIG. 6, which is a schematic flowchart of an automated testing method according to a sixth embodiment of the present invention, where the test layer sequentially includes a unit test layer, an API test layer, and a UI test layer, step S204 includes:
step S601, calling a unit test script, scanning whether the unit test layer executes the unit test, if not, executing the unit test, and outputting a unit test result.
In this embodiment, the unit test is a white-box test intended to detect errors and bugs in the program code, ensuring the quality of the application program at the source by covering each line or block of code to determine whether it is correct. Typically, test cases are used to test and verify the source code, a test report is generated after compilation and execution, and errors and bugs in the source code are judged by interpreting the test report. The test result may be in html, txt, or another format, and its content may include the name of the test package, the name of the test class, the number of test runs, the number of defects, the success rate, the test status, the time spent on the test, detailed defect information, and so on.
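For illustration only, a minimal Python sketch of step S601, assuming pytest and treating a previously written report file as the sign that the unit test has already executed:

```python
import os
import pytest

RESULT_FILE = "unit_test_report.txt"   # hypothetical marker/report path

def run_unit_layer(test_dir: str = "tests/unit") -> str:
    if os.path.exists(RESULT_FILE):            # scan: already executed?
        with open(RESULT_FILE) as f:
            return f.read()                    # reuse the earlier result
    exit_code = int(pytest.main([test_dir, "-q"]))  # execute the unit tests
    result = f"unit tests {'passed' if exit_code == 0 else 'failed'} (exit {exit_code})"
    with open(RESULT_FILE, "w") as f:
        f.write(result)
    return result
```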
Step S602, calling an API test script, creating an API interface according to the test information in the API test layer, registering the API interface according to a preset interface specification, testing and verifying the registered API interface, and outputting an API test result.
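For illustration only, a minimal Python sketch of step S602, assuming the requests library; the registration endpoint and the fields of the interface specification are assumptions:

```python
import requests

REGISTRY_URL = "http://localhost:8080/api/registry"   # hypothetical endpoint

def test_api_layer(test_info: dict) -> str:
    # Create the API interface and register it per a preset specification.
    spec = {"name": test_info["name"],
            "path": test_info["path"],
            "method": test_info.get("method", "GET"),
            "version": "v1"}
    reg = requests.post(REGISTRY_URL, json=spec, timeout=5)
    if reg.status_code != 201:
        return f"API registration failed: {reg.status_code}"
    # Test and verify the registered interface.
    resp = requests.request(spec["method"],
                            f"http://localhost:8080{spec['path']}",
                            timeout=5)
    expected = test_info.get("expect_status", 200)
    return ("API test passed" if resp.status_code == expected
            else f"API test failed: {resp.status_code}")
```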
Step S603, calling a UI test script to perform a UI test on the UI test layer, acquiring the execution result fed back by the UI test layer and comparing it with the expected result; if they are consistent, the UI test passes, otherwise the UI test fails; the UI test result is then output.
In this step, the UI test script performs a UI test on the element to be tested. The UI test can check whether the layout of the functional modules on the interface is reasonable, whether the overall style is consistent, whether the placement of controls conforms to the customers' usage habits and, more importantly, whether operation is convenient; whether navigation is simple and easy to understand; whether the text in the interface is correct; whether naming is uniform; whether the page is attractive; and whether text and images combine well.
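For illustration only, a minimal Python sketch of step S603, assuming Selenium; the page URL, element locator, and expected text are assumptions:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_ui_layer(url: str, element_id: str, expected: str) -> str:
    driver = webdriver.Chrome()
    try:
        driver.get(url)
        # The execution result fed back by the UI layer.
        actual = driver.find_element(By.ID, element_id).text
        # Compare the execution result with the expected result.
        if actual == expected:
            return "UI test passed"
        return f"UI test failed: got {actual!r}, expected {expected!r}"
    finally:
        driver.quit()
```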
Referring to FIG. 7, which is a schematic flowchart of an automated testing method according to a seventh embodiment of the present invention, after step S204, the method further includes:
and S701, acquiring at least one solution corresponding to the vulnerability information in the test result, and sending the at least one solution to the user terminal.
Step S702, receiving a target solution screened by the user terminal based on at least one solution, and repairing the vulnerability according to the target solution.
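For illustration only, a minimal Python sketch of steps S701 to S702; the solution lookup table and the user-terminal interaction are simplified assumptions:

```python
SOLUTION_DB = {   # hypothetical vulnerability-to-solutions lookup
    "SQL_INJECTION": ["parameterize queries", "add input validation"],
}

def resolve_vulnerability(vuln_id: str, pick) -> str:
    solutions = SOLUTION_DB.get(vuln_id, [])
    if not solutions:
        return "no known solution"
    # The user terminal screens the candidate solutions; 'pick' stands in
    # for that interaction.
    target = pick(solutions)
    return f"repairing {vuln_id} via: {target}"

# Usage: the "user terminal" here simply chooses the first solution.
print(resolve_vulnerability("SQL_INJECTION", lambda options: options[0]))
```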
FIG. 8 is a block diagram of an automated testing apparatus according to the eighth embodiment of the present invention, which is applied to a client. For convenience of explanation, only the portions related to the embodiments of the present invention are shown.
Referring to FIG. 8, the automated testing apparatus 80 includes: a distribution test module 81, a basic test management module 82, and a special test management module 83.
And the basic test management module 82 is used for managing basic data of the whole automatic test process.
The special test management module 83 is configured to perform a targeted test on the element to be tested, where the targeted test includes at least one of a performance test and an interface robustness test.
The distribution test module 81 includes:
the building unit 801 is used for building a test requirement analysis model;
the identifying unit 802 is configured to identify a plurality of test layers corresponding to the elements to be tested based on the test requirement analysis model and attribute information of the elements to be tested;
a configuration unit 803, configured to configure a test scheme according to the test layer, where the test scheme includes a test script and a test policy corresponding to the test layer;
the calling unit 804 is configured to call a test script corresponding to the test layer, and run the corresponding test script through the test layer to test the test layer to obtain a test result;
a test script generating unit 805, configured to acquire code data corresponding to a test layer and test data generated in a test; generate header information of a test script template according to the code data corresponding to the test layer and preset request information; generate request information of the test script template according to the code data corresponding to the test layer and the test data generated in the test; and generate a test script according to the header information and the request information of the test script template;
a bug fixing unit 806, configured to acquire at least one solution corresponding to the vulnerability information in the test result and send the at least one solution to the user terminal; and receive a target solution screened by the user terminal from the at least one solution and repair the vulnerability according to the target solution.
The construction unit 801 is further configured to obtain requirement factor data, where the requirement factor data includes attribute information and a layering result of a plurality of elements; quantizing each attribute information in the demand factor data to obtain a quantized value of each attribute information; taking the quantized values and the layering results of the attribute information as input, and constructing and obtaining a machine learning model; and performing supervised training on the machine learning model by adopting a requirement factor data training sample set, and iteratively optimizing the influence factors of each attribute information to obtain the test requirement analysis model.
The identifying unit 802 is further configured to input attribute information of an element to be tested into the test requirement analysis model; the test requirement analysis model then outputs a layering result, wherein the layering result includes a plurality of test layers corresponding to the element to be tested.
The calling unit 804 is further configured to call a unit test script, scan whether the unit test layer has executed the unit test, if not, execute the unit test, and output a unit test result; calling an API test script, creating an API interface according to test information in the API test layer, registering the API interface according to a preset interface specification, testing and verifying the registered API interface, and outputting an API test result; and calling a UI test script to carry out UI test on the UI test layer, acquiring an execution result fed back by the UI test layer, comparing the execution result with an expected result, if the comparison result is consistent, the UI test is passed, if the comparison result is inconsistent, the UI test is not passed, and outputting the UI test result.
It should be noted that, because the information interaction among the above modules and units, their execution processes, and other details are based on the same concept as the foregoing method embodiments, their specific functions and technical effects may be found in the method embodiment section and are not repeated here.
FIG. 9 is a schematic structural diagram of a computer device according to a ninth embodiment of the present invention. As shown in FIG. 9, the computer device of this embodiment includes: at least one processor (only one shown in FIG. 9), a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the automated testing method embodiments described above when executing the computer program.
The computer device may include, but is not limited to, a processor and a memory. Those skilled in the art will appreciate that FIG. 9 is merely an example of a computer device and is not intended to limit it; the computer device may include more or fewer components than shown, combine certain components, or include different components, such as a network interface, a display screen, and input devices.
The processor may be a CPU, or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor.
The memory includes a readable storage medium, an internal memory, and the like, where the internal memory may be the memory of the computer device and provides an environment for the operating system and for the execution of the computer-readable instructions in the readable storage medium. The readable storage medium may be a hard disk of the computer device; in other embodiments it may be an external storage device of the computer device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the computer device. Further, the memory may include both an internal storage unit and an external storage device of the computer device. The memory is used to store the operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of a computer program, and may also be used to temporarily store data that has been or will be output.
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division into functional units and modules is only illustrative; in practical applications, the functions may be allocated to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, may exist alone physically, or two or more of them may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinction and are not used to limit the protection scope of the present invention. For the specific working processes of the units and modules in the above apparatus, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, according to legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
The present invention may also be implemented by a computer program product, which when executed on a computer device, enables the computer device to implement all or part of the processes in the method according to the above embodiments.
In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus/computer device and method may be implemented in other ways. For example, the apparatus/computer device embodiments described above are merely illustrative: the division into modules or units is only a division by logical function, and other divisions are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The automated testing method, apparatus, computer device, and storage medium provided by the present invention have been described in detail above, and specific examples have been used herein to illustrate the principle and implementation of the invention. The above embodiments are only used to illustrate the technical solution of the invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present invention and shall be construed as being included therein.

Claims (10)

1. An automated testing method, the method comprising:
constructing a test requirement analysis model;
identifying, based on the test requirement analysis model and attribute information of an element to be tested, a plurality of test layers corresponding to the element to be tested;
configuring a test scheme according to the test layer, wherein the test scheme comprises a test script and a test strategy corresponding to the test layer;
and calling a test script corresponding to the test layer, and running the corresponding test script through the test layer to test the test layer to obtain a test result.
2. The automated testing method of claim 1, wherein said constructing a test requirements analysis model comprises:
acquiring requirement factor data, wherein the requirement factor data comprises attribute information and layering results of a plurality of elements;
quantizing each attribute information in the demand factor data to obtain a quantized value of each attribute information;
taking the quantized value and the layering result of each attribute information as input, and constructing and obtaining a machine learning model;
and performing supervised training on the machine learning model by adopting a requirement factor data training sample set, and iteratively optimizing the influence factors of each attribute information to obtain the test requirement analysis model.
3. The automated testing method of claim 1, wherein the identifying of a plurality of test layers corresponding to the element to be tested based on the test requirement analysis model and the attribute information of the element to be tested comprises:
inputting attribute information of an element to be tested into the test requirement analysis model;
and the test requirement analysis model outputting a layering result, wherein the layering result comprises a plurality of test layers corresponding to the element to be tested.
4. The automated testing method of claim 1, wherein the method further comprises:
acquiring code data corresponding to a test layer and test data generated in a test;
generating header information of a test script template according to the code data corresponding to the test layer and preset request information;
generating request information of the test script template according to the code data corresponding to the test layer and the test data generated in the test;
and generating a test script according to the header information and the request information of the test script template.
5. The automated testing method of claim 1, wherein the testing layer sequentially comprises a unit testing layer, an API testing layer, and a UI testing layer, and the calling of the testing script corresponding to the testing layer runs the corresponding testing script through the testing layer to test the testing layer to obtain the testing result, comprising:
calling a unit test script, scanning whether the unit test layer executes the unit test or not, if not, executing the unit test, and outputting a unit test result;
calling an API test script, creating an API interface according to test information in the API test layer, registering the API interface according to a preset interface specification, testing and verifying the registered API interface, and outputting an API test result;
and calling a UI test script to perform a UI test on the UI test layer, acquiring an execution result fed back by the UI test layer and comparing it with an expected result, wherein the UI test passes if the comparison is consistent and fails if it is inconsistent, and outputting the UI test result.
6. The automated testing method of claim 1, wherein the invoking of the test script corresponding to the test layer runs the corresponding test script through the test layer to test the test layer, and after obtaining the test result, the method further comprises:
obtaining at least one solution corresponding to the vulnerability information in the test result, and sending the at least one solution to a user terminal;
and receiving a target solution screened by the user terminal based on at least one solution, and repairing the vulnerability according to the target solution.
7. An automated testing apparatus, the apparatus comprising: a distribution test module;
the distributed test module comprises the following units:
the construction unit is used for constructing a test requirement analysis model;
the identification unit is used for identifying a plurality of test layers corresponding to the elements to be tested based on the test requirement analysis model and the attribute information of the elements to be tested;
the configuration unit is used for configuring a test scheme according to the test layer, wherein the test scheme comprises a test script and a test strategy corresponding to the test layer;
and the calling unit is used for calling the test script corresponding to the test layer, and running the corresponding test script through the test layer so as to test the test layer to obtain a test result.
8. The automated testing apparatus of claim 7, wherein the apparatus further comprises:
the basic test management module is used for managing basic data of the whole automatic test process;
and the special test management module is used for carrying out targeted test on the elements to be tested, and the targeted test comprises at least one of a performance test and an interface robustness test.
9. A computer device, characterized in that the computer device comprises: a memory storing at least one instruction; and
a processor executing instructions stored in the memory to implement the automated testing method of any of claims 1 to 6.
10. A computer-readable storage medium having stored therein at least one instruction for execution by a processor in a computer device to implement an automated testing method according to any one of claims 1 to 6.
CN202211321519.0A (filed 2022-10-26): Automatic testing method, device, equipment and storage medium. Pending; published as CN115599683A.

Priority Applications (1)

CN202211321519.0A (priority date 2022-10-26, filing date 2022-10-26): Automatic testing method, device, equipment and storage medium


Publications (1)

CN115599683A (published 2023-01-13)

Family

ID=84850181

Family Applications (1)

CN202211321519.0A: Automatic testing method, device, equipment and storage medium

Country Status (1)

CN: CN115599683A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
CN116931911A * (priority date 2023-06-15, published 2023-10-24, 明物数智科技研究院(南京)有限公司): Intelligent low-code application development platform and development method based on AIGC


Similar Documents

Publication Title
US10108535B2 (en) Web application test script generation to test software functionality
CN108628748B (en) Automatic test management method and automatic test management system
CN112199300B (en) Interface testing method and device, electronic equipment and storage medium
CN110825619A (en) Automatic generation method and device of interface test case and storage medium
US11449414B2 (en) Mapping test parameter data elements during heterogeneous component-based testing in a portable automation framework in both API mode and UI mode
CN106293798B (en) Self-repairing method and system of electronic device and server
CN112241360A (en) Test case generation method, device, equipment and storage medium
CN112052172A (en) Rapid testing method and device for third-party channel and electronic equipment
CN112540924A (en) Interface automation test method, device, equipment and storage medium
CN103649924A (en) Embedded apparatus, program generation apparatus, and program
CN115599683A (en) Automatic testing method, device, equipment and storage medium
CN112988578A (en) Automatic testing method and device
US11269712B1 (en) Customized categorial error handling framework for heterogeneous component-based testing in a portable automation framework
US20220066915A1 (en) Controlling heterogeneous component-based testing in a portable automation framework with test scripts in both api mode and ui mode
CN117632710A (en) Method, device, equipment and storage medium for generating test code
CN117493188A (en) Interface testing method and device, electronic equipment and storage medium
CN111767218A (en) Automatic testing method, equipment and storage medium for continuous integration
US11734134B2 (en) Automatically locating resources using alternative locator expressions during heterogeneous component-based testing in a portable automation framework
CN115934559A (en) Testing method of intelligent form testing system
CN116016270A (en) Switch test management method and device, electronic equipment and storage medium
JP2023000907A (en) Source code correction support device and source code correction support method
CN112182552A (en) Real-name authentication method and device, electronic equipment and storage medium
KR102111392B1 (en) Test unified administration system and Controlling Method for the Same
CN113094281B (en) Test method and device for hybrid App
US11310680B2 (en) Reusing provisioned resources during heterogeneous component-based testing in a portable automation framework

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination