CN109062780B - Development method of automatic test case and terminal equipment

Development method of automatic test case and terminal equipment

Info

Publication number
CN109062780B
CN109062780B (application CN201810658805.3A)
Authority
CN
China
Prior art keywords
test
user
test case
page
data
Prior art date
Legal status
Active
Application number
CN201810658805.3A
Other languages
Chinese (zh)
Other versions
CN109062780A (en)
Inventor
梁砾
Current Assignee
Shenzhen Vispractice Technology Co ltd
Original Assignee
Shenzhen Vispractice Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Vispractice Technology Co ltd filed Critical Shenzhen Vispractice Technology Co ltd
Priority to CN201810658805.3A priority Critical patent/CN109062780B/en
Publication of CN109062780A publication Critical patent/CN109062780A/en
Application granted granted Critical
Publication of CN109062780B publication Critical patent/CN109062780B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention is suitable for the technical field of computers and provides a development method for automated test cases and a terminal device. The method comprises the following steps: if a user logging in to the production system is detected, obtaining the user's characteristic information; if the user performs page operations in the production system, recording the user's operation track and acquiring page input and output data; matching the operation track against page operation metadata to generate page operation code; determining the subset of data involved in the user's page operations from the page input and output data and their associated data; generating a test case from the user's characteristic information, the page operation code, and the data subset according to a preset user operation flow, and determining the data subset corresponding to the test case; and distributing the test cases and data subsets to a plurality of simulation operation ends for execution and receiving the test results. This realizes an automated flow from case generation to completed execution, reduces the working intensity of testers, and improves the test effect.

Description

Development method of automatic test case and terminal equipment
Technical Field
The invention belongs to the technical field of computers, and particularly relates to a development method of an automatic test case and terminal equipment.
Background
At present, traditional software testing mainly consists of testers writing test cases from requirement documents, completing the testing tasks manually, and recording the results. Although automated testing has gradually been introduced as technology advances, and repetitive regression testing can be performed automatically, under current testing frameworks the work still requires a great deal of manual effort from testers, mainly in the following two respects:
First, when formulating test cases, requirement documents must be converted into test cases by hand. Information can be distorted in the process from requirement document to test case, and the requirement document may not reflect users' real operation intentions; to compensate, current test cases try to cover all possible user operations so as to simulate user habits and check system usability, which increases the testers' workload.
Second, when preparing test data, a large amount of data must be simulated for testing. The usual approach is to build a test environment identical to the production environment, which first makes the test data enormous and prevents effective focus on the data required by a specific test case; second, because the data is not isolated, anomalies occur when test cases use data in overlapping ways or when multiple regression tests are run, which affects the test effect. If each test is instead resolved by fully restoring the data, the test time and the testers' workload increase accordingly.
Disclosure of Invention
In view of this, embodiments of the present invention provide a development method for automated test cases and a terminal device, to address the problems in existing test case development that testers must perform many manual operations, the test effect is poor, and the testers' workload is high.
A first aspect of an embodiment of the present invention provides a method for developing an automatic test case, including:
if it is monitored that a user logs in to a production system, acquiring the user's characteristic information according to the user's login information, and monitoring whether the user performs page operations on the production system;
if the user performs page operation on the production system, recording the operation track of the user, and acquiring input and output data of a page in the process of performing page operation by the user;
matching the operation track with prestored page operation metadata to generate a page operation code;
determining a data subset of the page operation performed by the user according to the input and output data of the page and the associated data of the input and output data of the page;
generating a test case according to the characteristic information of the user, the page operation code and the data subset, and a preset user operation process, and determining the data subset corresponding to the test case;
and distributing the test cases and the data subsets corresponding to the test cases to a plurality of managed simulation operation ends to execute the test cases, and receiving test results returned by the simulation operation ends.
Optionally, the distributing the test cases and the data subsets corresponding to the test cases to the managed multiple simulation operation ends to execute the test cases includes:
determining the operational capability and the storage space required by the running of the test case according to the test case and the data subset corresponding to the test case;
and distributing the test cases and the data subsets corresponding to the test cases to the corresponding simulation operation ends to execute the test cases according to the managed operation conditions of the simulation operation ends and the operational capacity and storage space required by the test cases.
Optionally, the data subset corresponding to the test case includes expected output data of the test case;
the development method of the automatic test case further comprises the following steps:
and determining the test cases which pass the test in the test cases according to the expected output data of the test cases and the test results returned by the plurality of simulation operation ends.
Optionally, the method for developing an automatic test case further includes:
and querying the associated data in a database of the production system by taking the input and output data of the page as keywords.
A second aspect of the embodiments of the present invention provides a device for developing an automatic test case, including:
the user characteristic information acquisition unit is configured to, if it is monitored that a user logs in to the production system, acquire the user's characteristic information according to the user's login information and monitor whether the user performs page operations on the production system;
the page operation processing unit is used for recording the operation track of the user and acquiring the input and output data of a page in the page operation process of the user if the user performs the page operation in the production system;
the page operation code generating unit is used for matching the operation track with pre-stored page operation metadata to generate a page operation code;
the data subset determining unit is used for determining the data subset of the page operation performed by the user according to the input and output data of the page and the associated data of the input and output data of the page;
the case generating unit is used for generating a test case according to the characteristic information of the user, the page operation code and the data subset and a preset user operation process, and determining the data subset corresponding to the test case;
and the task distribution unit is used for distributing the test cases and the data subsets corresponding to the test cases to the managed multiple simulation operation ends to execute the test cases and receiving the test results returned by the multiple simulation operation ends.
Optionally, the task distributing unit includes:
the resource determining unit required by the test case is used for determining the operational capability and the storage space required by the running of the test case according to the test case and the data subset corresponding to the test case;
and the test case distribution unit is used for distributing the test cases and the data subsets corresponding to the test cases to the corresponding simulation operation ends to execute the test cases according to the operation conditions of the plurality of managed simulation operation ends and the operational capacity and the storage space required by the operation of the test cases, and receiving the test results returned by the plurality of simulation operation ends.
Optionally, the data subset corresponding to the test case includes expected output data of the test case;
the development device for the automatic test case further comprises:
and the test case detection unit is used for determining the test cases which pass the test in the test cases according to the expected output data of the test cases and the test results returned by the plurality of simulation operation ends.
Optionally, the development apparatus for an automated test case further includes:
and the associated data query unit is used for querying the associated data in a database of the production system by taking the input and output data of the page as a keyword.
A third aspect of the embodiments of the present invention provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the above method when executing the computer program.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method as described above.
Compared with the prior art, embodiments of the present invention have the following beneficial effects. The embodiment effectively collects the characteristic information of all users of the production system, their operation tracks during page operations, and the page input and output data; it processes the operation characteristics of different types of users systematically, generates page operation code in combination with the page operation metadata, and determines the data subsets of the users' page operations. It then simulates user operation flows for different scenarios and different characteristics to form test cases, and deploys the test cases and their corresponding data subsets to a plurality of independent simulation operation ends to complete the test work and obtain the test results. This realizes an automated flow from case generation to completed execution and reduces the working intensity of testers, thereby improving both test efficiency and test effect.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a method for developing an automated test case according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a method for developing an automated test case according to another embodiment of the present invention;
FIG. 3 is a schematic flow chart diagram of a method for developing an automated test case according to yet another embodiment of the present invention;
FIG. 4 is a schematic flow chart diagram of a method for developing an automated test case according to another embodiment of the present invention;
FIG. 5 is a schematic block diagram of an apparatus for developing an automatic test case according to an embodiment of the present invention;
FIG. 6 is a schematic block diagram of an apparatus for developing an automatic test case according to another embodiment of the present invention;
fig. 7 is a schematic block diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Referring to fig. 1, fig. 1 is a schematic flow chart of a development method for automated test cases according to an embodiment of the present invention. In this embodiment, the description is given from the perspective of a big data analysis platform, which is built on a big data architecture and integrates data acquisition, data analysis, and related functions. As shown in fig. 1, the processing procedure of the big data analysis platform may include the following steps:
s101: if the fact that the user logs in the production system is monitored, the characteristic information of the user is obtained according to the login information of the user, and whether the user performs page operation on the production system is monitored.
Here, a real-time monitoring and acquisition channel is first established from the big data analysis platform to the pages and database of the production system, where the production system is the information system that supports daily business operation under normal conditions and may include production data, a production data processing system, a production network, and the like.
The big data analysis platform monitors whether a user logs in to the production system. If a user logs in, the login information of that user is obtained, and the user's characteristic information can be queried in the production system database according to the obtained login information. Here, the production system database can pre-store the correspondence between user login information and user characteristic information, and the characteristic information of the user corresponding to the obtained login information is determined from that correspondence. The user's characteristic information may include user role, user level, user region, user gender, user age, login time, login duration, and the like.
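The lookup described above can be sketched as a simple pre-stored correspondence from login information to characteristic information. The table, field names, and sample values below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical pre-stored correspondence between login information and
# user characteristic information (stands in for a production database table).
USER_FEATURES = {
    "alice": {"role": "operator", "level": "VIP3", "region": "South",
              "gender": "F", "age": 34},
}

def get_user_features(login_info):
    """Return the feature record for a logged-in user, annotated with login time."""
    features = dict(USER_FEATURES.get(login_info["username"], {}))
    features["login_time"] = login_info["login_time"]
    return features

profile = get_user_features({"username": "alice",
                             "login_time": "2018-06-25T09:00"})
```

In a real deployment this would be a query against the production database rather than an in-memory dictionary.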
S102: if the user performs the page operation on the production system, recording the operation track of the user, and acquiring the input and output data of the page in the process of performing the page operation by the user.
Specifically, if it is monitored that the user performs page operations in the production system, the big data analysis platform records the user's operation track in real time, including: entering characters, clicking buttons, dragging the mouse, single clicks, double clicks, right-click menu selections, and so on. The big data analysis platform is well suited to such computation-intensive tasks and is mainly used here to process the large volume of page operation information generated in this step; the extracted data can be flexibly computed, combined, split, and so on.
Meanwhile, while recording the user's operation track in real time, the big data analysis platform can also acquire the input and output data of the page during the user's page operations, including: data filled into the page, checked data, queried data, modified data, deleted data, and so forth.
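The track recording and page I/O capture of S102 can be illustrated with a minimal in-memory recorder. The event fields (`action`, `target`, `value`) are hypothetical names chosen for this sketch; the patent does not prescribe an event schema.

```python
from dataclasses import dataclass, field

@dataclass
class TrackRecorder:
    """Collects a user's page-operation events in order."""
    events: list = field(default_factory=list)

    def record(self, action, target, value=None):
        # `value` carries page input/output data when the action produces any.
        self.events.append({"action": action, "target": target, "value": value})

rec = TrackRecorder()
rec.record("input", "#username", "alice")
rec.record("input", "#password", "****")
rec.record("click", "#login-button")
rec.record("fill", "#amount", 120)  # page I/O data captured alongside the track
```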
S103: and matching the operation track with pre-stored page operation metadata to generate a page operation code.
Here, after obtaining the user's operation track, the big data analysis platform matches it against the page operation metadata in the platform to generate page operation code. Metadata, also called intermediary data or relay data, is data that describes data; here it mainly refers to a set of operation action data obtained by extracting and classifying the features of one or more operations that users' page operations have in common, that is, data or code that describes a user operation. Code here means source written by a programmer in a language supported by a development tool: a set of explicit rules for representing information in discrete form with characters, symbols, or signals.
Specifically, taking a user logging in to the system as an example, the basic operation steps to be completed on the page are:
1. Enter the user name
2. Enter the password
3. Click the login button
First, to realize the automated test, the user's page operations must be simulated in code (written in a development language such as python or ruby); that is, the three steps above must be converted into code. The code is then encapsulated into a function named "login system" whose parameters are the user name and password. This function, one piece of "page operation metadata", can be pre-stored in the system. In practical use, after the user performs these three steps on the page, the system recognizes the user's operation track: the page captures that the user is on the login page URL, has typed into the user name and password input boxes, and has clicked the login button; the system then considers that the user has completed the login operation and matches this operation track with the "login system" page operation metadata. Finally, when the automated test case code is generated, the code simulating a user logging in to the system calls the "login system" page operation metadata, i.e., the code inside the "login system" function; when the automated test case runs, that code is executed accordingly.
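A minimal sketch of the "login system" metadata described above, assuming a registry of named functions and a stand-in driver object. A real implementation would drive an actual browser through an automation library; the identifiers here (`PAGE_OPERATION_METADATA`, `FakeDriver`, the CSS selectors) are all illustrative.

```python
# Registry of pre-stored page operation metadata, keyed by name.
PAGE_OPERATION_METADATA = {}

def metadata(name):
    """Decorator that registers a function as page operation metadata."""
    def register(fn):
        PAGE_OPERATION_METADATA[name] = fn
        return fn
    return register

@metadata("login_system")
def login_system(driver, username, password):
    # The three encapsulated login steps; `driver` is any object
    # exposing type_into/click (a browser driver in practice).
    driver.type_into("#username", username)
    driver.type_into("#password", password)
    driver.click("#login-button")

class FakeDriver:
    """Stand-in driver that logs the actions it is asked to perform."""
    def __init__(self):
        self.log = []
    def type_into(self, selector, text):
        self.log.append(("type", selector))
    def click(self, selector):
        self.log.append(("click", selector))

# Generated test case code calls the metadata by name:
d = FakeDriver()
PAGE_OPERATION_METADATA["login_system"](d, "alice", "secret")
```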
Similarly, other page operations can be extracted into page operation metadata or factored out into function modules, so that different test cases can call them conveniently, reducing the amount of code and easing maintenance. Similar page operation metadata may include:
1. Enter query conditions, check the query results, and edit
2. Enter query conditions, check the query results, and delete
3. Create a new object: on the page, fill in, check, or select from drop-down lists the object's attributes, then click the OK button
4. Read the contents of a multi-page form and check the records matching a keyword
5. Upload a file: click the browse button, select the file to upload, and click the upload button
......
The user's page operations are decomposed into create, delete, modify, and query operations on different objects on different pages, and the corresponding operations of all users can be encapsulated as page operation metadata. A full session, in which the user logs in, completes operations, and logs out, can then be matched to a combination of several pieces of page operation metadata.
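The decomposition into metadata combinations can be sketched as matching a recorded action sequence against per-metadata action signatures. The patent does not specify a matching algorithm, so the greedy matcher and the signatures below are assumptions for illustration; a real matcher would also compare page URLs and element identities.

```python
# Hypothetical action-sequence signatures for each piece of metadata.
METADATA_SIGNATURES = {
    "login_system": ["input", "input", "click"],
    "query_and_edit": ["input", "click", "check", "click"],
}

def match_track(actions):
    """Greedily decompose a session's action list into metadata names."""
    matched, i = [], 0
    while i < len(actions):
        for name, sig in METADATA_SIGNATURES.items():
            if actions[i:i + len(sig)] == sig:
                matched.append(name)
                i += len(sig)
                break
        else:
            i += 1  # no signature starts here; skip the unmatched action
    return matched

session = ["input", "input", "click",          # login
           "input", "click", "check", "click"]  # query and edit
```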
S104: and determining a data subset of the page operation performed by the user according to the input and output data of the page and the associated data of the input and output data of the page.
Specifically, the big data analysis platform obtains the input and output data of the page during the user's page operations and queries the associated data in the production system database using those data as keywords, forming the data subset of the user's page operations.
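S104 can be sketched as keyword lookups over production tables. The in-memory "database", its table names, and its rows are assumptions of this sketch and stand in for a real production database.

```python
# Stand-in for the production database.
PRODUCTION_DB = {
    "orders": [{"order_id": "A1", "customer": "alice", "amount": 120}],
    "customers": [{"customer": "alice", "region": "South"}],
}

def build_data_subset(page_io_values):
    """Collect every row whose values contain one of the page I/O keywords."""
    subset = []
    for table, rows in PRODUCTION_DB.items():
        for row in rows:
            if any(v in row.values() for v in page_io_values):
                subset.append((table, row))
    return subset

# The value "alice" appeared in the page I/O, so its associated rows are pulled in.
subset = build_data_subset(["alice"])
```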
Here, the big data analysis platform may perform structured associated storage on the acquired feature information of the user, the generated page operation code, and the determined data subset of the user performing the page operation, so as to complete processing such as data normalization and duplicate removal, thereby facilitating subsequent processing.
S105: and generating a test case according to the characteristic information of the user, the page operation code and the data subset and a preset user operation process, and determining the data subset corresponding to the test case.
On the basis of the users' characteristic information, page operation codes, and data subsets, the big data analysis platform customizes user operation flows for different scenarios and different feature types, and generates the test cases and data subsets for specific scenarios.
Specifically, the big data analysis platform analyzes the test requirements, for example: the operation characteristics of male users of a certain VIP level in a certain region, or the user operation characteristics under a certain concurrency condition in a specific time period. It simulates user operation flows for these different scenarios and characteristics, forms test cases from the users' characteristic information, page operation codes, and data subsets, and determines the data subset corresponding to each test case. That data subset is based on the collected data subsets and includes the test case's input data, expected output data, and so on; it serves both as the input when the test case runs and as the standard for judging, after the run, whether the test case passes.
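A hedged sketch of S105: filter the collected user records by a scenario's feature criteria, then assemble a test case from the matching users' page operation codes and data subsets. The record layout and field names are illustrative, not from the patent.

```python
def generate_test_case(users, scenario):
    """Build one scenario test case from the users matching its criteria."""
    selected = [u for u in users
                if all(u["features"].get(k) == v
                       for k, v in scenario["criteria"].items())]
    return {
        "name": scenario["name"],
        "operations": [op for u in selected for op in u["op_codes"]],
        "data_subset": [d for u in selected for d in u["data_subset"]],
    }

users = [
    {"features": {"level": "VIP3", "gender": "M"},
     "op_codes": ["login_system"], "data_subset": [("orders", "A1")]},
    {"features": {"level": "VIP1", "gender": "F"},
     "op_codes": ["query_and_edit"], "data_subset": []},
]
case = generate_test_case(users, {"name": "vip3_male",
                                  "criteria": {"level": "VIP3", "gender": "M"}})
```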
S106: and distributing the test cases and the data subsets corresponding to the test cases to a plurality of managed simulation operation ends to execute the test cases, and receiving test results returned by the simulation operation ends.
The big data analysis platform analyzes the resources required by the test cases, acquires the state of each simulation operation end, and judges whether the resources of each simulation operation end meet the test requirements; it allocates resources flexibly using virtual container technology and distributes the test cases to the simulation operation ends. Each simulation operation end receives its test cases and corresponding data subsets, runs the test cases, and returns the test results to the big data analysis platform.
Specifically, after generating a test case and determining its corresponding data subset, the big data analysis platform analyzes the computing power (e.g. CPU core count and memory size) and storage space (e.g. hard disk space) required to run the test case against the running conditions of the managed simulation operation ends, performs resource scheduling, and distributes the test case and its data subset to the corresponding simulation operation end. The logic of each simulation operation end is independent: the test cases and data subsets it runs are exclusive to it and do not affect one another. In a specific embodiment, the simulation operation end need not be a physical server; virtual container technology can be fully exploited to create a docker container in real time, as the simulation operation end, within a cloud computing resource pool formed from multiple servers. The simulation operation end receives the test cases and corresponding data subsets distributed by the big data analysis platform, executes the test cases, and obtains the test results. After each simulation operation end completes all its test cases, the results are returned to the big data analysis platform for aggregation; if necessary, the test cases and data subsets deployed on a simulation operation end can be deleted, the docker resources reclaimed, and the reclaimed resources allocated to other simulation operation ends.
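The resource-scheduling step can be sketched as first-fit assignment of cases to simulation operation ends by free CPU and disk capacity. The patent leaves the scheduling policy open, so first-fit and these resource fields are assumptions; a real deployment would provision containers from the cloud pool instead of static records.

```python
def schedule(cases, runners):
    """Assign each case to the first simulation operation end that can hold it."""
    assignment = {}
    for case in cases:
        for runner in runners:
            if (runner["free_cpu"] >= case["cpu"] and
                    runner["free_disk"] >= case["disk"]):
                runner["free_cpu"] -= case["cpu"]    # reserve the resources
                runner["free_disk"] -= case["disk"]
                assignment[case["name"]] = runner["name"]
                break
    return assignment

runners = [{"name": "r1", "free_cpu": 2, "free_disk": 10},
           {"name": "r2", "free_cpu": 8, "free_disk": 100}]
cases = [{"name": "tc1", "cpu": 4, "disk": 20},   # too big for r1
         {"name": "tc2", "cpu": 2, "disk": 5}]
plan = schedule(cases, runners)
```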
Here, the simulation operation end may include a use case distribution test module and a data distribution storage module. The use case distribution test module receives the test cases distributed by the big data analysis platform and, in cooperation with the data distribution storage module, which receives the corresponding data subsets, completes the execution of the test cases to obtain the test results.
From the above description it can be seen that the development method of the embodiment collects users' page operations and page data and, combined with big data analysis, automatically generates the test cases and corresponding test data subsets for a specific user or scenario, so test cases no longer need to be written manually and the testers' workload is reduced. Multiple simulation operation ends are managed in a distributed manner, and test tasks are automatically distributed to them according to the computing power and storage space the test cases require, achieving unified management and distributed, automated execution of test programs and test data; this improves test efficiency, and the virtual container technology improves resource utilization. Meanwhile, unlike the single set of test data of past test environments, the embodiment provides multiple sets of test data for different test scenarios and test cases, isolated on different simulation operation ends, so tests can run simultaneously without affecting each other, which improves test accuracy.
Referring to fig. 2, fig. 2 is a schematic flow chart of a method for developing an automated test case according to another embodiment of the present invention. This embodiment differs from the embodiment corresponding to fig. 1 in that distributing the test cases and their corresponding data subsets to the managed multiple simulation operation ends to execute the test cases, and receiving the test results returned by the multiple simulation operation ends, may include S206. S201 to S205 are the same as S101 to S105 in the previous embodiment; refer to the description of S101 to S105 there, which is not repeated here. Specifically, S206 may include S2061 to S2062:
S2061: determining the operational capability and the storage space required by the running of the test case according to the test case and the data subset corresponding to the test case.
S2062: according to the managed running conditions of the plurality of simulation running ends, the operational capability and the storage space required by the running test cases, the test cases and the data subsets corresponding to the test cases are distributed to the corresponding simulation running ends to execute the test cases, and test results returned by the plurality of simulation running ends are received.
Here, the test cases and the data subsets corresponding to the test cases are distributed to the corresponding simulation operation ends to execute the test cases according to the resources required by the test cases and the data subsets and the states of the simulation operation ends, so that the test success rate is improved, and the method is suitable for practical application.
Referring to fig. 3, fig. 3 is a schematic flowchart of a method for developing an automatic test case according to still another embodiment of the present invention. The difference between the present embodiment and the above embodiment is S307, where S301 to S306 are the same as S101 to S106 in the previous embodiment, and please refer to the related description of S101 to S106 in the above embodiment, which is not repeated herein. The development method of the automated test case in the embodiment may further include:
S307: the data subset corresponding to the test case comprises expected output data of the test case; the test cases which pass the test are determined according to the expected output data of the test cases and the test results returned by the plurality of simulation operation ends.
And if the test case passes the test, stopping operation, and if the test case does not pass the test, generating a corresponding prompt carrying the test case information which does not pass the test, so that the requirements of various application scenes are met.
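A minimal sketch of this pass/fail determination follows; the dictionary shapes and field names are assumptions, not the patented data format.

```python
def evaluate_results(expected_outputs, returned_results):
    """Compare each test case's expected output data (taken from its
    data subset) with the result returned by the simulation operation
    ends. Failed cases carry enough context to build the prompt."""
    passed, failed = [], []
    for case_id, expected in expected_outputs.items():
        actual = returned_results.get(case_id)
        if actual == expected:
            passed.append(case_id)
        else:
            failed.append({"case": case_id,
                           "expected": expected,
                           "actual": actual})
    return passed, failed
```

A caller would stop on an all-pass result and otherwise emit one prompt per entry in `failed`.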
Referring to fig. 4, fig. 4 is a schematic flowchart of a method for developing an automatic test case according to another embodiment of the present invention. The difference between the present embodiment and the above embodiment is S404, where S401 to S403 are the same as S101 to S103 in the previous embodiment, and S405 to S407 are the same as S104 to S106 in the previous embodiment, and specific reference is made to the description related to S101 to S103 and S104 to S106 in the above embodiment, which is not repeated herein. The development method of the automated test case in the embodiment may further include:
S404: querying the associated data in a database of the production system by taking the input and output data of the page as keywords.
The database of the production system may pre-store the correspondence between the input and output data of the page and the associated data. During the user's page operation, the big data analysis platform obtains the input and output data of the page and queries the database of the production system for the associated data using these data as keywords. If the associated data can be found, the subsequent steps are executed; if not, a corresponding prompt may be generated, so as to meet the requirements of practical application.
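The keyword lookup can be sketched with an in-memory SQLite table standing in for the production database; the table and column names are hypothetical.

```python
import sqlite3

def query_associated_data(conn, page_io_values):
    """Query pre-stored associations keyed on page input/output values.
    Returns the matching (keyword, associated) rows, or None when
    nothing is found so the caller can generate the prompt."""
    placeholders = ",".join("?" * len(page_io_values))
    cur = conn.execute(
        "SELECT keyword, associated FROM page_associations "
        f"WHERE keyword IN ({placeholders})",
        page_io_values,
    )
    rows = cur.fetchall()
    return rows if rows else None
```

Parameter substitution (`?` placeholders) keeps the page data out of the SQL text itself, which matters when the keywords come from recorded user input.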
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Corresponding to the development method of the automated test case described in the above embodiment, fig. 5 shows a schematic block diagram of a development apparatus of an automated test case provided in an embodiment of the present invention. The automatic test case development device 500 of the present embodiment includes units for executing steps in the embodiment corresponding to fig. 1, and please refer to fig. 1 and the related description in the embodiment corresponding to fig. 1 for details, which are not repeated herein. The automatic test case development device 500 of the present embodiment includes a user characteristic information acquisition unit 501, a page operation processing unit 502, a page operation code generation unit 503, a data subset determination unit 504, a use case generation unit 505, and a task distribution unit 506.
The user characteristic information acquiring unit 501 is configured to, if it is monitored that a user logs in a production system, acquire characteristic information of the user according to the login information of the user, and monitor whether the user performs a page operation on the production system. The page operation processing unit 502 is configured to, if the user performs a page operation in the production system, record the operation trajectory of the user and obtain the input and output data of the page during the user's page operation. The page operation code generating unit 503 is configured to match the operation trajectory with pre-stored page operation metadata and generate a page operation code. The data subset determining unit 504 is configured to determine the data subset of the user's page operation according to the input and output data of the page and the associated data of the input and output data of the page. The use case generating unit 505 is configured to generate a test case according to the characteristic information of the user, the page operation code, the data subset, and a preset user operation process, and to determine the data subset corresponding to the test case. The task distribution unit 506 is configured to distribute the test cases and the data subsets corresponding to the test cases to the managed plurality of simulation operation ends to execute the test cases, and to receive the test results returned by the plurality of simulation operation ends.
From the above description, it can be seen that the development apparatus for automated test cases in the embodiments of the present invention effectively collects the characteristic information of all users on the production system, their operation trajectories during page operations, and the page input/output data; normalizes the operation characteristics of different types of users; generates page operation codes in combination with page operation metadata; determines the data subsets of the users' page operations; and simulates user operation flows of different scenes and different characteristics to form test cases. The test cases and their corresponding data subsets are deployed on a plurality of independent simulation operation ends to complete the test work and obtain test results, realizing an automatic flow from case generation to completion of execution and reducing the working intensity of testers, thereby improving both test efficiency and test effect.
Referring to fig. 6, fig. 6 is a schematic block diagram of another device for developing an automated test case according to another embodiment of the present invention. The automatic test case development device 600 of this embodiment includes a user characteristic information acquisition unit 601, a page operation processing unit 602, a page operation code generation unit 603, a data subset determination unit 604, a case generation unit 605, a task distribution unit 606, a test case detection unit 607, and an associated data query unit 608.
Specifically, please refer to fig. 5 and the related description of the user characteristic information obtaining unit 501, the page operation processing unit 502, the page operation code generating unit 503, the data subset determining unit 504, the use case generating unit 505, and the task distributing unit 506 in the embodiment corresponding to fig. 5, and this is not repeated herein.
Further, the task distributing unit 606 includes a test case required resource determining unit 6061 and a test case distributing unit 6062.
The test case required resource determining unit 6061 is configured to determine, according to the test case and the data subset corresponding to the test case, the operational capability and the storage space required for running the test case. The test case distribution unit 6062 is configured to distribute the test cases and the data subsets corresponding to the test cases to the corresponding simulation operation ends to execute the test cases according to the running conditions of the managed plurality of simulation operation ends and the operational capability and storage space required by the test cases, and to receive the test results returned by the plurality of simulation operation ends.
Further, the data subset corresponding to the test case includes expected output data of the test case.
The test case detection unit 607 is configured to determine, according to the expected output data of the test case and the test results returned by the multiple simulation operation ends, a test case that passes the test in the test cases.
Further, the associated data query unit 608 is configured to query the associated data in the database of the production system by using the input and output data of the page as a keyword.
From the above description, the embodiment of the invention collects the page operations and page data of users and, combined with big data analysis, automatically generates test cases and corresponding test data subsets for specific users or specific scenes, so that test cases no longer need to be written manually and the workload of testers is reduced. A plurality of simulation operation ends are managed in a distributed manner, and test tasks are automatically distributed to the simulation operation ends according to the operational capability and storage space required by the test cases, achieving unified management and control, distributed management, and automatic execution of test programs and test data, which improves test efficiency; the adoption of virtual container technology also improves resource utilization. Meanwhile, unlike the single set of test data in past test environments, the embodiment of the invention provides multiple sets of test data for different test scenes and test cases, isolated on different simulation operation ends, so that tests can run simultaneously without affecting one another, improving test accuracy.
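The simultaneous, mutually isolated execution described above can be illustrated with a thread pool in which each worker receives its own private copy of the test data. This is a deliberate simplification of the virtual-container isolation; the case format and the placeholder "result" are assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

def run_case(case_with_data):
    """Execute one test case against its own isolated data subset.
    Here the 'execution' is a placeholder sum over the subset."""
    case, data_subset = case_with_data
    return case["id"], sum(data_subset)

def run_all(cases_with_data, max_runners=4):
    """Run cases concurrently; each case gets a private data subset,
    so simultaneous tests do not interfere with one another."""
    with ThreadPoolExecutor(max_workers=max_runners) as pool:
        return dict(pool.map(run_case, cases_with_data))
```

In the patented scheme the workers would be containerized simulation operation ends rather than threads, but the isolation property being relied on is the same: no shared mutable test data between concurrently running cases.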
Referring to fig. 7, fig. 7 is a schematic block diagram of a terminal device according to an embodiment of the present invention. As shown in fig. 7, the terminal device 70 of this embodiment includes: a processor 700, a memory 701, and a computer program 702, such as an automated test case development program, stored in the memory 701 and executable on the processor 700. The processor 700, when executing the computer program 702, implements the steps in the various automated test case development method embodiments described above, such as the steps 101 to 106 shown in fig. 1. Alternatively, the processor 700, when executing the computer program 702, implements the functions of the units in the above-described device embodiments, such as the functions of the units 601 to 608 shown in fig. 6.
The computer program 702 may be partitioned into one or more modules/units that are stored in the memory 701 and executed by the processor 700 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 702 in the terminal device 70. For example, the computer program 702 may be divided into a user characteristic information obtaining unit, a page operation processing unit, a page operation code generating unit, a data subset determining unit, a use case generating unit, a task distributing unit, a test case detecting unit, and an associated data querying unit, where the specific functions of each unit are as follows:
if the fact that a user logs in a production system is monitored, acquiring characteristic information of the user according to the login information of the user, and monitoring whether the user performs page operation on the production system;
if the user performs page operation on the production system, recording the operation track of the user, and acquiring input and output data of a page in the process of performing page operation by the user;
matching the operation track with prestored page operation metadata to generate a page operation code;
determining a data subset of the page operation performed by the user according to the input and output data of the page and the associated data of the input and output data of the page;
generating a test case according to the characteristic information of the user, the page operation code and the data subset, and a preset user operation process, and determining the data subset corresponding to the test case;
and distributing the test cases and the data subsets corresponding to the test cases to a plurality of managed simulation operation ends to execute the test cases, and receiving test results returned by the simulation operation ends.
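The six steps above can be strung together in a compact sketch; every structure here (the session dictionary, the metadata mapping, the runner list) is a hypothetical stand-in for the units described, not the patented data model.

```python
def develop_test_case(session, op_metadata):
    """Steps 1-5: turn one monitored user session into a test case.
    `session` bundles the login info, the recorded operation
    trajectory, and the captured page input/output data."""
    features = session["login_info"]                       # step 1: user characteristics
    trajectory = session["operations"]                     # step 2: recorded trajectory
    page_io = session["page_io"]
    op_codes = [op_metadata[op] for op in trajectory       # step 3: match against metadata
                if op in op_metadata]
    data_subset = dict(page_io)                            # step 4: associated data omitted
    return {"user": features, "ops": op_codes,             # step 5: assembled test case
            "data": data_subset}

def dispatch(test_case, runners):
    """Step 6, radically simplified: hand the case to the first
    managed simulation operation end and echo its operations back
    as a stand-in 'test result'."""
    runner = runners[0]
    return runner, test_case["ops"]
```

A real dispatcher would select the runner by resource fit and receive asynchronous results, as the distribution step elaborates further below.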
Further, the distributing the test cases and the data subsets corresponding to the test cases to the managed multiple simulation operation ends to execute the test cases includes:
determining the operational capability and the storage space required by the running of the test case according to the test case and the data subset corresponding to the test case;
and distributing the test cases and the data subsets corresponding to the test cases to the corresponding simulation operation ends to execute the test cases according to the managed operation conditions of the simulation operation ends and the operational capacity and storage space required by the test cases.
Further, the data subset corresponding to the test case comprises expected output data of the test case;
the specific functions of each unit further include:
and determining the test cases which pass the test in the test cases according to the expected output data of the test cases and the test results returned by the plurality of simulation operation ends.
Further, the specific functions of each unit further include:
and querying the associated data in a database of the production system by taking the input and output data of the page as keywords.
According to the above scheme, the characteristic information of all users on the production system, their operation trajectories during page operations, and the page input and output data are effectively collected; the operation characteristics of different types of users are normalized; page operation codes are generated in combination with page operation metadata; the data subsets of the users' page operations are determined; and user operation flows of different scenes and different characteristics are simulated to form test cases. The test cases and their corresponding data subsets are deployed on a plurality of independent simulation operation ends to complete the test work and obtain test results, realizing an automatic flow from case generation to completion of execution and reducing the working intensity of testers, thereby improving both test efficiency and test effect.
The terminal device 70 may be a computing device such as a desktop computer, a notebook, a palm computer, or a cloud server. The terminal device may include, but is not limited to, the processor 700 and the memory 701. Those skilled in the art will appreciate that fig. 7 is merely an example of the terminal device 70 and does not constitute a limitation of the terminal device 70, which may include more or fewer components than shown, combine some components, or have different components; for example, the terminal device may also include input-output devices, network access devices, buses, etc.
The processor 700 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 701 may be an internal storage unit of the terminal device 70, such as a hard disk or a memory of the terminal device 70. The memory 701 may also be an external storage device of the terminal device 70, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the terminal device 70. Further, the memory 701 may include both an internal storage unit and an external storage device of the terminal device 70. The memory 701 is used for storing the computer program and other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the methods in the above embodiments may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A development method of an automatic test case is characterized by comprising the following steps:
if the fact that a user logs in a production system is monitored, acquiring characteristic information of the user according to the login information of the user, and monitoring whether the user performs page operation on the production system;
if the user performs page operation on the production system, recording the operation track of the user, and acquiring input and output data of a page in the process of performing page operation by the user;
matching the operation track with prestored page operation metadata to generate a page operation code;
determining a data subset of the page operation performed by the user according to the input and output data of the page and the associated data of the input and output data of the page;
generating a test case according to the characteristic information of the user, the page operation code and the data subset, and a preset user operation process, and determining the data subset corresponding to the test case;
and distributing the test cases and the data subsets corresponding to the test cases to a plurality of managed simulation operation ends to execute the test cases, and receiving test results returned by the simulation operation ends.
2. The method for developing an automated test case according to claim 1, wherein the distributing the test case and the data subset corresponding to the test case to the managed multiple simulation operation terminals to execute the test case comprises:
determining the operational capability and the storage space required by the running of the test case according to the test case and the data subset corresponding to the test case;
and distributing the test cases and the data subsets corresponding to the test cases to the corresponding simulation operation ends to execute the test cases according to the managed operation conditions of the simulation operation ends and the operational capacity and storage space required by the test cases.
3. The method for developing an automated test case according to claim 1, wherein the data subset corresponding to the test case includes expected output data of the test case;
the method further comprises the following steps:
and determining the test cases which pass the test in the test cases according to the expected output data of the test cases and the test results returned by the plurality of simulation operation ends.
4. The method for developing an automated test case of claim 1, further comprising:
and querying the associated data in a database of the production system by taking the input and output data of the page as keywords.
5. An apparatus for developing an automated test case, comprising:
the system comprises a user characteristic information acquisition unit, a production system monitoring unit and a display unit, wherein the user characteristic information acquisition unit is used for acquiring the characteristic information of a user according to the login information of the user and monitoring whether the user performs page operation on the production system if the user logs in the production system;
the page operation processing unit is used for recording the operation track of the user and acquiring the input and output data of a page in the page operation process of the user if the user performs the page operation in the production system;
the page operation code generating unit is used for matching the operation track with pre-stored page operation metadata to generate a page operation code;
the data subset determining unit is used for determining the data subset of the page operation performed by the user according to the input and output data of the page and the associated data of the input and output data of the page;
the case generating unit is used for generating a test case according to the characteristic information of the user, the page operation code and the data subset and a preset user operation process, and determining the data subset corresponding to the test case;
and the task distribution unit is used for distributing the test cases and the data subsets corresponding to the test cases to the managed multiple simulation operation ends to execute the test cases and receiving the test results returned by the multiple simulation operation ends.
6. The apparatus for developing an automated test case according to claim 5, wherein the task distribution unit comprises:
the resource determining unit required by the test case is used for determining the operational capability and the storage space required by the running of the test case according to the test case and the data subset corresponding to the test case;
and the test case distribution unit is used for distributing the test cases and the data subsets corresponding to the test cases to the corresponding simulation operation ends to execute the test cases according to the operation conditions of the plurality of managed simulation operation ends and the operational capacity and the storage space required by the operation of the test cases, and receiving the test results returned by the plurality of simulation operation ends.
7. The apparatus for developing an automated test case according to claim 5, wherein the data subset corresponding to the test case includes expected output data of the test case;
the device further comprises:
and the test case detection unit is used for determining the test cases which pass the test in the test cases according to the expected output data of the test cases and the test results returned by the plurality of simulation operation ends.
8. The apparatus for developing an automated test case of claim 5, further comprising:
and the associated data query unit is used for querying the associated data in a database of the production system by taking the input and output data of the page as a keyword.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 4 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 4.
CN201810658805.3A 2018-06-25 2018-06-25 Development method of automatic test case and terminal equipment Active CN109062780B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810658805.3A CN109062780B (en) 2018-06-25 2018-06-25 Development method of automatic test case and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810658805.3A CN109062780B (en) 2018-06-25 2018-06-25 Development method of automatic test case and terminal equipment

Publications (2)

Publication Number Publication Date
CN109062780A CN109062780A (en) 2018-12-21
CN109062780B true CN109062780B (en) 2021-08-17

Family

ID=64821012

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810658805.3A Active CN109062780B (en) 2018-06-25 2018-06-25 Development method of automatic test case and terminal equipment

Country Status (1)

Country Link
CN (1) CN109062780B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109828914A (en) * 2018-12-28 2019-05-31 宁波瓜瓜农业科技有限公司 Whole process distributed system automated testing method and test macro
CN111258879A (en) * 2019-03-25 2020-06-09 深圳市远行科技股份有限公司 Service test scheduling method and device based on page acquisition and intelligent terminal
CN110597730B (en) * 2019-09-20 2023-08-22 中国工商银行股份有限公司 Automatic test case generation method and system based on scene method
CN110765026B (en) * 2019-10-31 2023-08-01 望海康信(北京)科技股份公司 Automatic test method, device, storage medium and equipment
CN112783754A (en) * 2019-11-07 2021-05-11 北京沃东天骏信息技术有限公司 Method and device for testing page
CN111104324B (en) * 2019-12-17 2023-08-18 广州品唯软件有限公司 Method and device for generating test cases and computer readable storage medium
CN111177623A (en) * 2019-12-23 2020-05-19 北京健康之家科技有限公司 Information processing method and device
CN111367791B (en) * 2020-02-19 2023-08-01 北京字节跳动网络技术有限公司 Method, device, medium and electronic equipment for generating test case
CN112699040B (en) * 2020-12-30 2024-02-23 深圳前海微众银行股份有限公司 Pressure testing method, device, equipment and computer readable storage medium
CN113342629B (en) * 2021-06-08 2023-03-07 微民保险代理有限公司 Operation track restoration method and device, computer equipment and storage medium
CN117312161A (en) * 2023-10-07 2023-12-29 中国通信建设集团有限公司数智科创分公司 Intelligent detection system and method based on automatic login technology

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104123227A (en) * 2014-08-13 2014-10-29 广东电网公司信息中心 Method for automatically generating testing cases
CN107943683A (en) * 2017-10-30 2018-04-20 北京奇虎科技有限公司 A kind of test script generation method, device, electronic equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9734045B2 (en) * 2015-02-20 2017-08-15 Vmware, Inc. Generating test cases
IN2015DE01395A (en) * 2015-05-18 2015-06-26 Hcl Technologies Ltd
US20170192882A1 (en) * 2016-01-06 2017-07-06 Hcl Technologies Limited Method and system for automatically generating a plurality of test cases for an it enabled application

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104123227A (en) * 2014-08-13 2014-10-29 广东电网公司信息中心 Method for automatically generating testing cases
CN107943683A (en) * 2017-10-30 2018-04-20 北京奇虎科技有限公司 A kind of test script generation method, device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A New Method for Automatically Generating Software Test Cases; Dong Xin; Computer Applications and Software (《计算机应用与软件》); 2017-10-31; Vol. 34, No. 10; full text *

Also Published As

Publication number Publication date
CN109062780A (en) 2018-12-21

Similar Documents

Publication Publication Date Title
CN109062780B (en) Development method of automatic test case and terminal equipment
US8719784B2 (en) Assigning runtime artifacts to software components
CN110321113B (en) Integrated assembly line system taking project batches as standards and working method thereof
EP2572294B1 (en) System and method for sql performance assurance services
CN108628748B (en) Automatic test management method and automatic test management system
CN105095059A (en) Method and device for automated testing
CN110737689B (en) Data standard compliance detection method, device, system and storage medium
CN110956269A (en) Data model generation method, device, equipment and computer storage medium
CN108830383B (en) Method and system for displaying machine learning modeling process
CN104993962A (en) Method and system for obtaining use state of terminal
CN105095207A (en) Methods for retrieving and obtaining contents of application software, and devices for retrieving and obtaining contents of application software
CN108763091A (en) Method, apparatus and system for regression test
CN106682910B (en) Information processing method, system and related equipment
US20210124752A1 (en) System for Data Collection, Aggregation, Storage, Verification and Analytics with User Interface
CN111666201A (en) Regression testing method, device, medium and electronic equipment
CN114238048B (en) Automatic testing method and system for Web front-end performance
CN113672497B (en) Method, device and equipment for generating non-buried point event and storage medium
CN113806231A (en) Code coverage rate analysis method, device, equipment and medium
CN113342632A (en) Simulation data automatic processing method and device, electronic equipment and storage medium
CN113448867A (en) Software pressure testing method and device
CN107506299B (en) Code analysis method and terminal equipment
CN111933228A (en) Method and device for realizing project distribution and management system in clinical research
CN111045983A (en) Nuclear power station electronic file management method and device, terminal equipment and medium
CN114691837B (en) Insurance business data processing method and processing system based on big data
US8949819B2 (en) Rationalizing functions to identify re-usable services

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant