CN110764999A - Automatic testing method and device, computer device and storage medium - Google Patents

Automatic testing method and device, computer device and storage medium

Info

Publication number
CN110764999A
CN110764999A (Application CN201910843250.4A)
Authority
CN
China
Prior art keywords
task
parameter value
test
rule information
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910843250.4A
Other languages
Chinese (zh)
Inventor
刘芳
刘丽珍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OneConnect Smart Technology Co Ltd
Original Assignee
OneConnect Smart Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OneConnect Smart Technology Co Ltd filed Critical OneConnect Smart Technology Co Ltd
Priority to CN201910843250.4A priority Critical patent/CN110764999A/en
Publication of CN110764999A publication Critical patent/CN110764999A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; error correction; monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3684: Test management for test design, e.g. generating new test cases
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention provides an automated testing method comprising the following steps: generating a configuration file for inserting task rule information into a database, and executing the configuration file to insert the task rule information into a preset database; executing a configuration file to randomly generate first parameter value data corresponding to each test task, and sending the first parameter value data to a server; selecting a test task, and reading the task rule information corresponding to the selected test task from the database; after the first parameter value data corresponding to the selected test task is obtained from the server, calling a trial calculation interface of the application program under test, and calculating a second parameter value corresponding to the first parameter value data according to the task rule information read from the database; and comparing the calculated second parameter value with a preset value to determine whether the test passes. The invention also provides an automated testing device, a computer device, and a readable storage medium. The invention improves testing efficiency.

Description

Automatic testing method and device, computer device and storage medium
Technical Field
The invention relates to the technical field of computers, in particular to an automatic testing method, an automatic testing device, a computer device and a computer readable storage medium.
Background
After computer software is developed, its defects can only be found through testing, so that the software can be continuously improved. However, some software tests require inputting a large amount of test data and importing it into a predetermined database; performed manually, such tests are time-consuming, the regression cycle becomes too long, and software development efficiency suffers. For example, a performance-graded reward system for business personnel is frequently used by enterprises: testing an application that automatically calculates graded rewards requires entering a large amount of performance-completion data for the business personnel, which is inefficient when done by hand. An automated testing method is therefore needed to improve testing efficiency.
Disclosure of Invention
In view of the foregoing, there is a need for an automated testing method, apparatus, computer apparatus and computer readable storage medium, which can improve the efficiency of automated testing.
A first aspect of the present application provides an automated testing method, the method comprising:
generating a configuration file for inserting task rule information into a database, acquiring task rule information corresponding to at least one test task, importing and executing the configuration file to insert the task rule information into a preset database;
configuring a configuration file for randomly generating first parameter value data corresponding to each test task, and generating the first parameter value data corresponding to each test task by executing the configuration file;
establishing connection with a server, and sending the generated first parameter value data corresponding to each test task to the server;
selecting a test task, and reading task rule information corresponding to the selected test task from the database;
after first parameter value data corresponding to the selected test task is obtained from the server, a trial calculation interface of an application program to be tested is called, and a second parameter value corresponding to the first parameter value data is calculated according to the task rule information read from the database;
and comparing the second parameter value obtained by calculation with a preset value, judging whether the second parameter value is consistent with the preset value, if so, generating test passing confirmation information, and if not, generating test failure confirmation information.
Preferably, the task rule information is a file in Excel format, and the method for inserting the content of the Excel file into the database comprises:
acquiring the path of the local Excel file and opening it;
querying the database object fields, and automatically adding a field row corresponding to the database object fields to the Excel file;
iterating over all worksheets in the Excel file, and parsing the content of each row in turn;
and concatenating SQL statements: building an insert statement from the newly added field row in the Excel file, appending the name of the target table, looping over each row of the file, appending the data parsed from each row, and executing the SQL statements to insert the data into the database.
The test task is selected according to enterprise business data related to each task, and the selection comprises the following steps:
reading the enterprise business data values related to each task from an enterprise system platform;
determining a weight value for each parameter in the enterprise business data, multiplying each parameter value corresponding to a task by its weight, and summing the products to obtain a score for each task;
ranking the tasks by score;
and sequentially reading the task rule information corresponding to the tasks in the ranked order.
Preferably, the test task may also be selected by parsing a configuration file.
Preferably, the test task rule information includes task identification information, and the test task is identified by the task identification information.
Preferably, the first parameter value is data on a worker's work task completion amount, and the second parameter value is the reward value corresponding to that completion amount. The test task rule information further includes a calculation formula and an association rule, the calculation formula being: second parameter value = first parameter value × weight coefficient; and the association rule being the correspondence between different ranges of the first parameter value and different weight coefficients.
Preferably, the method comprises: establishing a Socket connection with the server, and sending the generated first parameter value data corresponding to each test task to the server through the Socket connection.
A second aspect of the present application provides an automated testing apparatus, the apparatus comprising:
the task rule information importing module is used for generating a configuration file for inserting the task rule information into a database, acquiring task rule information corresponding to at least one test task, and importing and executing the configuration file to insert the task rule information into a preset database;
the parameter generating module is used for configuring a configuration file for randomly generating first parameter value data corresponding to each test task and generating the first parameter value data corresponding to each test task by executing the configuration file;
the data transmission module is used for establishing connection with the server and sending the generated first parameter value data corresponding to each test task to the server;
the task selection module is used for selecting a test task and reading task rule information corresponding to the selected test task from the database;
the calling module is used for calling a trial calculation interface of an application program to be tested after first parameter value data corresponding to the selected test task is obtained from the server, and calculating a second parameter value corresponding to the first parameter value data according to the task rule information read from the database;
and the determining module is used for comparing the calculated second parameter value with a preset value, judging whether the second parameter value is consistent with the preset value or not, if so, generating test passing confirmation information, and if not, generating test failure confirmation information.
A third aspect of the application provides a computer apparatus comprising a processor for implementing the automated testing method as described above when executing a computer program stored in a memory.
A fourth aspect of the present application provides a computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the automated testing method described above.
The automated testing method provided by the invention tests the application program automatically, improving testing efficiency. Furthermore, the method for inserting the Excel file into the database inserts the file into the preset database more accurately and quickly, which further improves automated testing efficiency.
Drawings
Fig. 1 is a schematic diagram of an application environment architecture of an automated testing method according to an embodiment of the present invention.
Fig. 2 is a flowchart of an automated testing method according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of an automated testing apparatus according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a computer device according to an embodiment of the present invention.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a detailed description of the present invention will be given below with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention, and the described embodiments are merely a subset of the embodiments of the present invention, rather than a complete embodiment. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Fig. 1 is a schematic diagram of an application environment architecture of an automated testing method according to an embodiment of the present invention. The automatic testing method in the invention is applied to a communication system comprising a computer device 1 and a server 2. The computer device 1 and the server 2 establish communication connection in a wired or wireless mode. In the present embodiment, the computer device 1 may be, but is not limited to, an electronic device such as a desktop computer, a notebook computer, and a kiosk. The server may be a single server, a cluster of servers, a cloud server, or the like.
Fig. 2 is a flowchart of an automated testing method according to an embodiment of the present invention. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
Step S11, generating a configuration file for inserting the task rule information into the database, acquiring the task rule information corresponding to the test task, and importing and executing the configuration file to insert the task rule information into a preset database. The task rule information comprises at least task identification information (hereinafter, the task ID), a task name, and a calculation rule, the calculation rule being used for calculating a second parameter value based on a first parameter value.
Awarding bonuses according to how well employees complete their work tasks is a common practice in enterprise operation. Therefore, in one embodiment of the invention, the application under test may be a reward value calculation system that grades rewards for employees according to their recorded performance, and the method tests whether the calculation results of that reward value calculation system are accurate.
In an embodiment of the present invention, there may be multiple test tasks, each with different content and task rule information. In one embodiment, the task ID is therefore a unique identifier of each task, used for identifying each test task; it may consist of numbers, letters, symbols, or a combination thereof.
In an embodiment of the present invention, the first parameter value is data of a work task completion amount of a worker, and the second parameter value is an award value corresponding to the work task completion amount.
The calculation rule comprises one or more calculation formulas relating the work task completion amount to the reward value. Further, the calculation rule may also include association rules corresponding to the calculation formulas, the association rules defining the value ranges used in the calculation formulas.
Specifically, in one embodiment, the calculation formula included in the calculation rule and the association rule of the calculation formula are as follows:
the calculation formula is as follows: second parameter value = first parameter value × weight coefficient;
the association rule is a corresponding relation between different first parameter value ranges and different weight coefficients, for example, when the first parameter value is in a first preset range, the weight coefficient is a first value; and when the first parameter value is in a second preset range, the weight coefficient is a second value, and so on.
For example, the second parameter value is the monthly reward value of a loan officer, the first parameter value is the total loan amount disbursed, and the weight coefficient is an additional reward coefficient; that is, the formula is: monthly reward value = total loan amount × additional reward coefficient, where the association rule is as follows: when the total loan amount is in the range of 0 to 200,000, the additional reward coefficient is 0; when the total loan amount is in the range of 200,000 to 500,000, the additional reward coefficient is 0.1%; and when the total loan amount exceeds 500,000, the additional reward coefficient is 0.2%.
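The calculation formula and association rule above can be sketched as a small function. This is a sketch only: the tier boundaries and coefficients follow the example in this paragraph, and the treatment of exact boundary values is an assumption, since the text does not specify it.

```python
def monthly_reward(total_loan):
    """Second parameter value = first parameter value x weight coefficient,
    with the weight coefficient chosen by the association rule's value ranges."""
    # Association rule (tiers taken from the example; boundary handling assumed)
    if total_loan <= 200_000:
        coefficient = 0.0
    elif total_loan <= 500_000:
        coefficient = 0.001   # 0.1%
    else:
        coefficient = 0.002   # 0.2%
    # Calculation formula
    return total_loan * coefficient
```

With these tiers, a total loan amount of 400,000 falls in the 0.1% band and yields a monthly reward value of 400.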
In an embodiment, the task rule information is a file in Excel format. The method for inserting the content of the Excel file into the preset database comprises the following steps:
(1) acquiring the path of the local Excel file (e.g. D:\Excel.xls) and opening it;
(2) querying the database object fields, and automatically adding a field row corresponding to the database object fields to the Excel file;
(3) iterating over all worksheets in the Excel file, and parsing the content of each row in turn;
(4) concatenating SQL statements: building an insert statement from the newly added field row in the Excel file, appending the name of the target table, looping over each row of the file, appending the data parsed from each row, and executing the SQL statements to complete the insertion into the database.
With this insertion method, the Excel file can be inserted into the database more quickly and accurately.
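The SQL-building step above can be sketched as follows. This is a sketch under stated assumptions: the worksheet rows are assumed to have already been parsed into Python lists (the text iterates the Excel worksheets for this), and parameter placeholders are used instead of splicing literal values into the statement, which performs the same insertion more safely.

```python
import sqlite3

def insert_rows(conn, table, fields, rows):
    # Build one INSERT statement from the field row added to the worksheet,
    # then loop over the parsed Excel rows and insert each one.
    columns = ", ".join(fields)
    placeholders = ", ".join("?" for _ in fields)
    sql = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
    conn.executemany(sql, rows)
    conn.commit()
```

For example, with a `task_rule` table and the parsed rows of a worksheet, `insert_rows(conn, "task_rule", fields, rows)` inserts every row in one pass.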
Step S12, configuring a configuration file for randomly generating first parameter value data corresponding to each test task, and generating the first parameter value data corresponding to each test task by executing the configuration file.
According to the invention, the first parameter value data corresponding to the test task is randomly generated through the configuration file, and the test data does not need to be manually collected and imported, so that the test efficiency is improved.
In one embodiment, the first parameter value data includes worker identity information, the name of the task the worker performs, and the worker's work task completion amount data. For example, the worker identity information may include the worker's name and staff number, the task name may be loan disbursement, and the work task completion amount data may be the worker's total loan amount.
To cover multiple task situations, the first parameter value data of each test task comprises multiple groups of data. In one embodiment, the groups of first parameter value data are divided by task: the first task corresponds to the first group of parameter value data, the second task to the second group, and so on. For example, the first group may comprise several tables: a first table containing one worker's identity information and work task completion amount data; a second table containing the identity information and completion amount data of several workers; and a third table containing the identity information of several workers, each worker corresponding to completion amount data for multiple tasks.
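Randomly generating one group of first parameter value data can be sketched as below. The field names (`worker_id`, `completion_amount`, etc.) and the value range are illustrative assumptions, not names from the original.

```python
import random

def generate_first_parameter_data(task_name, workers, seed=None):
    # Randomly generate a work-task completion amount for each worker,
    # forming one group of first parameter value data for one test task.
    rng = random.Random(seed)  # seedable for reproducible test runs
    return [
        {"worker_id": wid, "worker_name": name, "task_name": task_name,
         "completion_amount": rng.randint(0, 1_000_000)}
        for wid, name in workers
    ]
```

Generating the data programmatically, rather than collecting it by hand, is exactly what removes the manual import step described above.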
In an embodiment, the first parameter value data is a file in an Excel format.
And step S13, establishing connection with the server, and sending the generated first parameter value data corresponding to each test task to the server.
In some embodiments of the present invention, a Socket connection is established with the server, and the first parameter value corresponding to the generated test task is sent to the server through the Socket connection.
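Sending the generated first parameter value data over a Socket connection might look like the following sketch. JSON is an assumed serialization format; the text does not specify one.

```python
import json
import socket

def send_parameter_data(host, port, records):
    # Serialize the first parameter value data and send it to the server
    # over a plain TCP Socket connection; closing the socket ends the stream.
    payload = json.dumps(records).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)
```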
And step S14, selecting the test task, and reading task rule information corresponding to the selected test task from the database.
In one embodiment, the test task is selected by analyzing the enterprise business data related to each task, which specifically comprises the following steps:
(1) reading the enterprise business data values related to each task from the enterprise system platform, where the data fields read include, but are not limited to, the revenue generated by each task within a preset historical period, the total number of transactions corresponding to each task, a historical growth trend value of each task (for example, the year-on-year growth of monthly revenue), the number of times each task is mentioned in enterprise meeting records or on enterprise publicity pages, and the like;
(2) determining a weight value for each parameter in the enterprise business data, multiplying each parameter value corresponding to a task by its weight, and summing the products to obtain a score for each task;
(3) ranking the tasks by score, for example from high to low;
(4) sequentially reading the task rule information corresponding to the tasks in the ranked order.
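The scoring and ranking steps above can be sketched as a weighted sum per task. The metric and weight names here are illustrative, not from the original.

```python
def rank_tasks(task_metrics, weights):
    # Score each task: multiply every business metric by its weight and sum,
    # then arrange the tasks from the highest score to the lowest.
    scores = {
        task: sum(metrics[name] * weights[name] for name in weights)
        for task, metrics in task_metrics.items()
    }
    return sorted(scores, key=scores.get, reverse=True)
```

The returned order is then used to read each task's rule information in turn, so high-scoring tasks are tested first.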
The task rule information corresponding to the selected test task is then retrieved from the database via the task ID.
The advantage of this task selection method is that the test tasks can be ranked intelligently: by analyzing the various parameters in the enterprise business data, the tasks that bring high revenue, carry large transaction volumes, show good growth trends, and are valued by the enterprise can be identified and tested first, while tasks with low revenue, small volumes, and poor trends are scheduled later. When testing resources are tight and labor costs are high, the high-priority tasks are thus selected for testing automatically.
In another embodiment, the test task is specified in a configuration file and determined by parsing that file. Specifically, the configuration file contains the task ID of the test task; the task ID is read from the configuration file, and the corresponding task rule information is then read from the database.
Wherein the configuration file is a Properties file.
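Reading the task ID from a Properties file can be sketched as a minimal key=value parser. The key name `taskId` is an assumption; the original does not name the key.

```python
def read_task_id(path):
    # Minimal parser for a Java-style .properties file: skip comments and
    # blank lines, split on the first "=", and return the taskId value.
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                if key.strip() == "taskId":
                    return value.strip()
    return None
```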
In other embodiments, the tasks may be read sequentially according to an original arrangement order of the tasks in the database, for example, according to a time order established by the tasks.
Step S15, after the first parameter value data corresponding to the selected test task is obtained from the server, a trial calculation interface of the application program to be tested is called, and a second parameter value corresponding to the first parameter value data is calculated according to the task rule information read from the database.
Specifically, the second parameter value is calculated for each worker according to the identity information of the worker in the work task.
In this embodiment, the method further comprises: summing the data of entries with the same worker identity information across different tasks to obtain the total second parameter value corresponding to that worker.
For example, worker A's identity information and total monthly loan amount of 400,000 are acquired from the server. According to the acquired task rule information (monthly reward value = total loan amount × additional reward coefficient, where the association rule sets the additional reward coefficient to 0.1% for a total loan amount between 200,000 and 500,000), worker A's monthly reward value is calculated as 400.
Step S16, comparing the calculated second parameter value with a preset value, judging whether the second parameter value is consistent with the preset value, if so, executing step S17: test pass confirmation information is generated, and if not, step S18 is executed: test failure confirmation information is generated.
During execution, the second parameter value stored in a preset file is read as the preset value and compared with the calculated second parameter value to determine whether the system passes the test. If the result calculated by the system matches the manually verified result, the second parameter value calculated by the system is correct and the test passes; if not, the system has calculated the data incorrectly and the test fails.
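The comparison step can be sketched as below. The tolerance parameter is an addition for floating-point reward values, not part of the described method, which simply checks consistency.

```python
def check_result(computed, preset, tolerance=1e-9):
    # Compare the second parameter value returned by the trial calculation
    # interface with the preset (manually verified) value.
    if abs(computed - preset) <= tolerance:
        return "test passed"
    return "test failed"
```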
The above-mentioned fig. 2 describes the automatic testing method of the present invention in detail, and the functional modules of the software device for implementing the automatic testing method and the hardware device architecture for implementing the automatic testing method are described below with reference to fig. 3-4.
It is to be understood that the embodiments are illustrative only and that the scope of the claims is not limited to this configuration.
Fig. 3 is a block diagram of an automated testing apparatus according to a preferred embodiment of the present invention.
In some embodiments, the automated testing apparatus 10 runs in a computer device. The apparatus 10 may comprise a plurality of functional modules composed of program code segments. The program code of the various segments in the apparatus 10 may be stored in the memory of the computer device and executed by at least one processor to implement the automated testing functions.
In this embodiment, the automated testing apparatus 10 may be divided into a plurality of functional modules according to the functions it performs. Referring to Fig. 3, the functional modules may include: the task rule information import module 101, the parameter generation module 102, the data transmission module 103, the task selection module 104, the calling module 105, and the determination module 106. A module, as referred to herein, is a series of computer program segments that can be executed by at least one processor to perform a fixed function, and that is stored in memory.
In the present embodiment, the functions of the modules will be described in detail in the following embodiments.
The task rule information import module 101 is configured to generate a configuration file for inserting task rule information into a database, acquire task rule information corresponding to a test task, import and execute the configuration file, and insert the task rule information into a preset database. The task rule information at least comprises task identification information (called task ID below), a task name and a calculation rule, and the calculation rule is used for calculating a second parameter value based on a first parameter value.
Awarding bonuses according to how well employees complete their work tasks is a common practice in enterprise operation. Therefore, in one embodiment of the invention, the application under test may be a reward value calculation system that grades rewards for employees according to their recorded performance, and the apparatus tests whether the calculation results of that reward value calculation system are accurate.
In an embodiment of the present invention, there may be multiple test tasks, each with different content and task rule information. In one embodiment, the task ID is therefore a unique identifier of each task, used for identifying each test task; it may consist of numbers, letters, symbols, or a combination thereof.
In an embodiment of the present invention, the first parameter value is data of a work task completion amount of a worker, and the second parameter value is an award value corresponding to the work task completion amount.
The calculation rule comprises one or more calculation formulas relating the work task completion amount to the reward value. Further, the calculation rule may also include association rules corresponding to the calculation formulas, the association rules defining the value ranges used in the calculation formulas.
Specifically, in one embodiment, the calculation formula included in the calculation rule and the association rule of the calculation formula are as follows:
the calculation formula is as follows: second parameter value = first parameter value × weight coefficient;
the association rule is a corresponding relation between different first parameter value ranges and different weight coefficients, for example, when the first parameter value is in a first preset range, the weight coefficient is a first value; and when the first parameter value is in a second preset range, the weight coefficient is a second value, and so on.
For example, the second parameter value is the monthly reward value of a loan officer, the first parameter value is the total loan amount disbursed, and the weight coefficient is an additional reward coefficient; that is, the formula is: monthly reward value = total loan amount × additional reward coefficient, where the association rule is as follows: when the total loan amount is in the range of 0 to 200,000, the additional reward coefficient is 0; when the total loan amount is in the range of 200,000 to 500,000, the additional reward coefficient is 0.1%; and when the total loan amount exceeds 500,000, the additional reward coefficient is 0.2%.
In an embodiment, the task rule information is a file in Excel format. The method by which the task rule information import module 101 inserts the content of the Excel file into the database comprises the following steps:
(1) acquiring the path of the local Excel file (e.g. D:\Excel.xls) and opening it;
(2) querying the database object fields, and automatically adding a field row corresponding to the database object fields to the Excel file;
(3) iterating over all worksheets in the Excel file, and parsing the content of each row in turn;
(4) concatenating SQL statements: building an insert statement from the newly added field row in the Excel file, appending the name of the target table, looping over each row of the file, appending the data parsed from each row, and executing the SQL statements to complete the insertion into the database.
With this insertion method, the Excel file can be inserted into the database more quickly and accurately.
The parameter generating module 102 is configured to create a configuration file for randomly generating first parameter value data corresponding to each test task, and to generate the first parameter value data for each test task by executing the configuration file.
In the present invention, the first parameter value data corresponding to a test task is randomly generated from the configuration file, so that test data does not need to be manually collected and imported, which improves test efficiency.
In one embodiment, the first parameter value data includes worker identity information, the name of the task executed by the worker, and work task completion amount data corresponding to the worker. For example, the worker identity information in the first parameter value data includes the worker's name and number, the task executed by the worker may be loan issuance, and the work task completion amount data is the worker's total loan payout.
To cover multiple task situations, the first parameter value data of each test task comprises multiple groups of data. In one embodiment, the groups of first parameter value data are divided by task: a first task corresponds to a first group of parameter value data, a second task to a second group, and so on. For example, the first group of first parameter value data may comprise multiple forms: a first form containing the identity information and work task completion amount data of a single worker; a second form containing the identity information and work task completion amount data of multiple workers; and a third form containing multiple workers' identity information, each worker corresponding to completion amount data for multiple tasks.
In an embodiment, the first parameter value data is a file in an Excel format.
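Random generation of the first parameter value data from a configuration can be sketched as below. The configuration keys (`tasks`, `workers_per_task`, `amount_range`) and the record field names are assumptions for illustration; the patent does not specify the configuration file's schema.

```python
import random

def generate_first_parameters(config, seed=None):
    """Randomly generate first parameter value data (worker identity,
    task name, completion amount) for each test task in the config."""
    rng = random.Random(seed)  # seedable for reproducible test runs
    data = {}
    for task in config["tasks"]:
        rows = []
        for i in range(config["workers_per_task"]):
            rows.append({
                "worker_id": f"W{i:03d}",
                "worker_name": f"worker-{i}",
                "task_name": task,
                # completion amount drawn in 10,000 increments
                "completion_amount": rng.randrange(*config["amount_range"], 10_000),
            })
        data[task] = rows
    return data

config = {"tasks": ["loan"], "workers_per_task": 3,
          "amount_range": (0, 1_000_000)}
data = generate_first_parameters(config, seed=42)
```

Seeding the generator makes a failing test reproducible while still exercising varied data across seeds.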
The data transmission module 103 is configured to establish a connection with a server, and send the generated first parameter value data corresponding to each test task to the server.
In some embodiments of the present invention, the data transmission module 103 establishes a Socket connection with the server and sends the generated first parameter value data corresponding to each test task to the server through the Socket connection.
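A minimal stand-in for the Socket transfer is sketched below: a tiny local server receives one payload over a TCP connection. The JSON wire format and the payload fields are assumptions; the patent only specifies that a Socket connection carries the first parameter value data.

```python
import json
import socket
import threading

def serve_once(server_sock, received):
    """Stand-in server: accept one connection, read until EOF,
    and store the decoded payload."""
    conn, _ = server_sock.accept()
    with conn:
        buf = b""
        while chunk := conn.recv(4096):
            buf += chunk
        received.append(json.loads(buf.decode()))

# Hypothetical payload standing in for generated first parameter value data.
payload = {"task": "loan", "rows": [{"worker_id": "W001", "amount": 400000}]}

server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port on localhost
server.listen(1)
received = []
t = threading.Thread(target=serve_once, args=(server, received))
t.start()

# Client side: establish the Socket connection and send the data.
with socket.create_connection(server.getsockname()) as client:
    client.sendall(json.dumps(payload).encode())
t.join()
server.close()
```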
The task selection module 104 is configured to select a test task and read task rule information corresponding to the selected test task from the database.
In one embodiment, the task selection module 104 determines which test task to select by analyzing the enterprise business data related to each task, specifically as follows:
(1) reading the enterprise business data values related to each task from the enterprise system platform, where the data fields read include, but are not limited to, the revenue generated by each task within a preset historical period, the total number of transactions corresponding to each task, a historical growth trend value of each task (for example, the year-on-year growth of monthly revenue), the number of times each task is mentioned in enterprise meeting records or enterprise publicity pages, and the like;
(2) determining a weight for each parameter among the enterprise business data values, multiplying each parameter value corresponding to a task by its weight, and summing the products to obtain a score for that task;
(3) ranking the tasks by score, for example from highest to lowest;
(4) reading the task rule information corresponding to the tasks in that ranked order.
The task rule information corresponding to the selected test task is looked up in the database by task ID.
This selection method intelligently orders the test tasks. By analyzing the various parameters in the enterprise business data, tasks with high revenue, large transaction volume, a good growth trend, and high enterprise attention are identified and tested first, while tasks with low revenue, small volume, and a poor trend are tested later. When the test schedule is tight and labor costs are high, high-priority tasks are thus selected intelligently for testing.
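The weighted-scoring steps above can be sketched as follows. The metric names, weights, and sample values are hypothetical; the patent leaves the concrete parameters and weight values open.

```python
def rank_tasks(task_metrics, weights):
    """Score each task as the weighted sum of its business metrics
    (step 2) and return task IDs ordered from highest to lowest
    score (step 3)."""
    def score(metrics):
        return sum(weights[k] * v for k, v in metrics.items())
    return sorted(task_metrics, key=lambda t: score(task_metrics[t]),
                  reverse=True)

# Hypothetical metrics: revenue, transaction volume, growth trend,
# and mentions in meeting records / publicity pages.
task_metrics = {
    "loan":     {"revenue": 90, "volume": 80, "growth": 70, "mentions": 5},
    "deposit":  {"revenue": 40, "volume": 95, "growth": 20, "mentions": 2},
    "transfer": {"revenue": 10, "volume": 30, "growth": 10, "mentions": 0},
}
weights = {"revenue": 0.4, "volume": 0.3, "growth": 0.2, "mentions": 0.1}
order = rank_tasks(task_metrics, weights)
```

Step 4 would then fetch task rule information from the database in the order returned by `rank_tasks`.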
In another embodiment, the test task is specified in a configuration file, and the task selection module 104 determines the test task by parsing the configuration file. Specifically, the configuration file includes the task ID of the test task; the task ID is obtained from the configuration file, and the task rule information corresponding to that task ID is read from the database.
Wherein the configuration file is a Properties file.
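Reading the task ID from a Properties-style file can be sketched with a minimal parser. This handles only `key=value` lines with `#`/`!` comments (the full Java Properties format also allows `:` separators and escapes); the key name `task.id` and the sample ID are hypothetical.

```python
def parse_properties(text):
    """Minimal .properties parser: key=value lines; lines starting
    with '#' or '!' are comments; blank lines are skipped."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line[0] in "#!":
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# Hypothetical configuration naming the test task to run.
content = """
# automated-test configuration
task.id = T20190906
"""
task_id = parse_properties(content)["task.id"]
```

The task selection module would then use `task_id` to look up the task rule information in the database.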
In other embodiments, the task selection module 104 may instead read the tasks in their original order in the database, for example in the order in which the tasks were created.
The calling module 105 is configured to call a trial calculation interface of an application program to be tested after first parameter value data corresponding to the selected test task is acquired from the server, and calculate a second parameter value corresponding to the first parameter value data according to the task rule information read from the database.
Specifically, the calling module 105 calculates the second parameter value for each worker according to the identity information of the worker in the work task.
In this embodiment, the invoking module 105 is further configured to sum the data having the same worker identity information across different tasks, obtaining the total second parameter value corresponding to that worker's identity information.
For example, the identity information of worker A and a monthly loan total of 400,000 are acquired from the server. According to the acquired task rule information, monthly reward value = total payout amount × additional reward coefficient, where the association rule sets the additional reward coefficient to 0.1% when the total payout amount is in the range of 200,000 to 500,000; the monthly reward value of worker A is therefore calculated as 400,000 × 0.1% = 400.
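The per-worker aggregation across tasks can be sketched as below. The task names, record fields, and amounts are hypothetical; the 0.1% coefficient follows the example association rule in this description.

```python
from collections import defaultdict

def total_per_worker(task_rows):
    """Sum completion amounts that share the same worker identity
    across different tasks, before the reward formula is applied."""
    totals = defaultdict(int)
    for rows in task_rows.values():
        for row in rows:
            totals[row["worker_id"]] += row["amount"]
    return dict(totals)

# Hypothetical per-task data; worker W001 appears in two tasks.
task_rows = {
    "loan-A": [{"worker_id": "W001", "amount": 250_000}],
    "loan-B": [{"worker_id": "W001", "amount": 150_000},
               {"worker_id": "W002", "amount": 100_000}],
}
totals = total_per_worker(task_rows)
# W001 totals 400,000, which falls in the 0.1% tier of the example rule.
reward = totals["W001"] * 0.001
```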
The determining module 106 is configured to compare the calculated second parameter value with a preset value, determine whether the second parameter value is consistent with the preset value, if so, generate a test passing confirmation message, and if not, generate a test failing confirmation message.
During execution, the second parameter value in the preset file is acquired as the preset value and compared with the calculated second parameter value to judge whether the system passes the test: if the result calculated by the system is consistent with the manual calculation result, the second parameter value calculated by the system is correct and the test passes; if not, the system has calculated the data incorrectly and the test fails.
Example four
FIG. 4 is a diagram of a computer device according to a preferred embodiment of the present invention.
The computer device 1 comprises a memory 20, a processor 30, and a computer program 40, such as an automated test program, stored in the memory 20 and executable on the processor 30. When executing the computer program 40, the processor 30 implements the steps of the automated testing method embodiments described above, such as steps S11-S18 shown in fig. 2. Alternatively, when executing the computer program 40, the processor 30 implements the functions of the modules/units in the automated testing device embodiment described above, such as modules 101-106 in fig. 3.
Illustratively, the computer program 40 may be partitioned into one or more modules/units that are stored in the memory 20 and executed by the processor 30 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments describing the execution process of the computer program 40 in the computer apparatus 1. For example, the computer program 40 may be divided into modules in fig. 3.
The computer device 1 may be a desktop computer, a cloud server, or another computing device. Those skilled in the art will appreciate that the schematic diagram is merely an example of the computer device 1 and does not constitute a limitation of it; the device may comprise more or fewer components than shown, combine some components, or use different components. For example, the computer device 1 may further comprise input and output devices, a network access device, a bus, and so on.
The Processor 30 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. The general purpose processor may be a microprocessor or the processor 30 may be any conventional processor or the like, the processor 30 being the control center of the computer device 1, various interfaces and lines connecting the various parts of the overall computer device 1.
The memory 20 may be used to store the computer program 40 and/or the modules/units; the processor 30 implements the various functions of the computer device 1 by running or executing the computer program and/or modules/units stored in the memory 20 and by calling data stored in the memory 20. The memory 20 may mainly include a program storage area and a data storage area: the program storage area may store the operating system and the application programs required by at least one function (such as a sound playing function or an image playing function); the data storage area may store data created through use of the computer device 1 (such as audio data). Further, the memory 20 may include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash memory card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
It is understood that the computer apparatus and method disclosed in the several embodiments of the present invention can be implemented in other ways. For example, the above-described embodiments of the computer apparatus are merely illustrative, and for example, the division of the units is only one logical function division, and there may be other divisions when the actual implementation is performed.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention is described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from their spirit and scope.

Claims (10)

1. An automated testing method for testing an application, the method comprising:
generating a configuration file for inserting task rule information into a database, acquiring task rule information corresponding to at least one test task, importing and executing the configuration file to insert the task rule information into a preset database;
configuring a configuration file for randomly generating first parameter value data corresponding to each test task, and generating the first parameter value data corresponding to each test task by executing the configuration file;
establishing connection with a server, and sending the generated first parameter value data corresponding to each test task to the server;
selecting a test task, and reading task rule information corresponding to the selected test task from the database;
after first parameter value data corresponding to the selected test task is obtained from the server, a trial calculation interface of an application program to be tested is called, and a second parameter value corresponding to the first parameter value data is calculated according to the task rule information read from the database;
and comparing the second parameter value obtained by calculation with a preset value, judging whether the second parameter value is consistent with the preset value, if so, generating test passing confirmation information, and if not, generating test failure confirmation information.
2. The automated testing method according to claim 1, wherein the task rule information is a file in Excel format, and inserting the content of the Excel file into a preset database comprises:
acquiring the path of the local Excel file and opening it;
querying the database object fields and automatically adding a field row corresponding to those fields to the Excel file;
iterating over all worksheets in the Excel file and parsing the content of each row;
and concatenating SQL statements: building an INSERT statement from the newly added field row, concatenating the name of the table to be imported, looping over each row of the Excel file, appending the data parsed from each row, and inserting the data into the database through the SQL statements.
3. The automated testing method of claim 1, wherein the test task is selected based on the enterprise business data associated with each task, the selecting comprising:
reading the enterprise business data values related to each task from an enterprise system platform;
calculating a weight value for each parameter among the enterprise business data values, multiplying each parameter value corresponding to a task by its weight, and summing the products to obtain a score for that task;
arranging the tasks according to the scores;
and sequentially reading task rule information corresponding to the tasks according to the arrangement sequence of the tasks.
4. The automated testing method of claim 1, wherein the test task is selected by parsing a configuration file.
5. The automated testing method of claim 1, wherein the test task rule information includes task identification information by which the test task is identified.
6. The automated testing method of claim 1, wherein the first parameter value is work task completion amount data of a worker, the second parameter value is a reward value corresponding to the work task completion amount, and the test task rule information further includes a calculation formula and an association rule, the calculation formula being: second parameter value = first parameter value × weight coefficient, and the association rule being the correspondence between different value ranges of the first parameter and different weight coefficients.
7. The automated testing method of claim 1, wherein the method further comprises: and establishing Socket connection with the server, and sending the first parameter value corresponding to the generated test task to the server through the Socket connection.
8. An automated testing apparatus, the apparatus comprising:
the task rule information importing module is used for generating a configuration file for inserting the task rule information into a database, acquiring task rule information corresponding to at least one test task, and importing and executing the configuration file to insert the task rule information into a preset database;
the parameter generating module is used for configuring a configuration file for randomly generating first parameter value data corresponding to each test task and generating the first parameter value data corresponding to each test task by executing the configuration file;
the data transmission module is used for establishing connection with the server and sending the generated first parameter value data corresponding to each test task to the server;
the task selection module is used for selecting a test task and reading task rule information corresponding to the selected test task from the database;
the calling module is used for calling a trial calculation interface of an application program to be tested after first parameter value data corresponding to the selected test task is obtained from the server, and calculating a second parameter value corresponding to the first parameter value data according to the task rule information read from the database;
and the determining module is used for comparing the calculated second parameter value with a preset value, judging whether the second parameter value is consistent with the preset value or not, if so, generating test passing confirmation information, and if not, generating test failure confirmation information.
9. A computer device, characterized in that: the computer device comprises a processor configured to implement the automated testing method of any one of claims 1-7 when executing a computer program stored in a memory.
10. A computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program, when executed by a processor, implements an automated testing method according to any one of claims 1-7.
CN201910843250.4A 2019-09-06 2019-09-06 Automatic testing method and device, computer device and storage medium Pending CN110764999A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910843250.4A CN110764999A (en) 2019-09-06 2019-09-06 Automatic testing method and device, computer device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910843250.4A CN110764999A (en) 2019-09-06 2019-09-06 Automatic testing method and device, computer device and storage medium

Publications (1)

Publication Number Publication Date
CN110764999A true CN110764999A (en) 2020-02-07

Family

ID=69330802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910843250.4A Pending CN110764999A (en) 2019-09-06 2019-09-06 Automatic testing method and device, computer device and storage medium

Country Status (1)

Country Link
CN (1) CN110764999A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112597000A (en) * 2020-12-09 2021-04-02 山东浪潮通软信息科技有限公司 WebSocket protocol-based automatic testing method for instant messaging software
CN112783799A (en) * 2021-03-19 2021-05-11 中国工商银行股份有限公司 Software daemon test method and device
CN113592305A (en) * 2021-07-29 2021-11-02 北京百度网讯科技有限公司 Test method, test device, electronic device, and storage medium
CN114338489A (en) * 2021-12-29 2022-04-12 深圳市捷视飞通科技股份有限公司 Automatic testing method, device, equipment and storage medium for multimedia conference system
WO2023045209A1 (en) * 2021-09-27 2023-03-30 Medtrum Technologies Inc. Analyte detection device and detection method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6138112A (en) * 1998-05-14 2000-10-24 Microsoft Corporation Test generator for database management systems
WO2001022214A2 (en) * 1999-09-20 2001-03-29 Lombardi Software, Inc. System, method, signal and software for implementing system undercover agents
WO2011116471A1 (en) * 2010-03-25 2011-09-29 Mga Aq Inc. Method and system for generating updated test data
US20180329812A1 (en) * 2017-05-15 2018-11-15 Bank Of America Corporation Conducting Automated Software Testing Using Centralized Controller And Distributed Test Host Servers
CN109240915A (en) * 2018-08-14 2019-01-18 平安普惠企业管理有限公司 System detection method, device, computer equipment and storage medium
US20190087315A1 (en) * 2017-09-20 2019-03-21 Sap Se Flaky test systems and methods
CN109828903A (en) * 2018-12-14 2019-05-31 中国平安人寿保险股份有限公司 Automated testing method, device, computer installation and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhang Peng; Wang Jian: "Research and Implementation of Back-end Service Test Automation in Web Applications", Computer Technology and Development, no. 04 *
Xu Tao; Xu Chunlei: "Middleware-based Database Access Technology for Submarine Combat Systems", Command Control & Simulation, no. 01 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112597000A (en) * 2020-12-09 2021-04-02 山东浪潮通软信息科技有限公司 WebSocket protocol-based automatic testing method for instant messaging software
CN112597000B (en) * 2020-12-09 2023-10-03 浪潮通用软件有限公司 Instant messaging software automatic test method based on WebSocket protocol
CN112783799A (en) * 2021-03-19 2021-05-11 中国工商银行股份有限公司 Software daemon test method and device
CN112783799B (en) * 2021-03-19 2024-05-17 中国工商银行股份有限公司 Software daemon testing method and device
CN113592305A (en) * 2021-07-29 2021-11-02 北京百度网讯科技有限公司 Test method, test device, electronic device, and storage medium
WO2023045209A1 (en) * 2021-09-27 2023-03-30 Medtrum Technologies Inc. Analyte detection device and detection method
CN114338489A (en) * 2021-12-29 2022-04-12 深圳市捷视飞通科技股份有限公司 Automatic testing method, device, equipment and storage medium for multimedia conference system
CN114338489B (en) * 2021-12-29 2024-03-15 深圳市捷视飞通科技股份有限公司 Automatic test method, device, equipment and storage medium for multimedia conference system

Similar Documents

Publication Publication Date Title
CN110764999A (en) Automatic testing method and device, computer device and storage medium
CN108984418B (en) Software test management method and device, electronic equipment and storage medium
CN109298998B (en) Workload evaluation and model training method, electronic equipment and storage medium
CN109783346B (en) Keyword-driven automatic testing method and device and terminal equipment
CN110060139B (en) Accounting processing method and device
CN112488652A (en) Work order auditing method, system, terminal and storage medium
CN106487603A (en) A kind of response test method and device
CN113554357A (en) Informatization project cost evaluation method based on big data and electronic equipment
CN109472457B (en) Loan application online reviewing method and terminal equipment
CN116502877A (en) Project progress monitoring method and device, electronic equipment and readable storage medium
US20190019120A1 (en) System and method for rendering compliance status dashboard
CN112598228B (en) Enterprise competitiveness analysis method, device, equipment and storage medium
CN110458707B (en) Behavior evaluation method and device based on classification model and terminal equipment
US8255881B2 (en) System and method for calculating software certification risks
US20080195453A1 (en) Organisational Representational System
CN112699014A (en) Method and device for testing and displaying storage performance prediction function
CN111859985A (en) AI customer service model testing method, device, electronic equipment and storage medium
CN113254352A (en) Test method, device, equipment and storage medium for test case
JPH0876992A (en) Device and method for evaluation and management of quality of software
CN111625458A (en) Service system testing method, device and equipment
CN110400041B (en) Risk auditing method, risk auditing device, computer equipment and computer readable storage medium
CN112988555B (en) Interface testing method, device, equipment and storage medium
CN114003494A (en) Automatic test method and device for data model and electronic equipment
CN114782088A (en) Information processing method, device, computing equipment and medium
CN113111073A (en) Abnormal data sorting method and device, computing equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination