CN112463599A - Automatic testing method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN112463599A
CN112463599A (application CN202011298508.6A)
Authority
CN
China
Prior art keywords
test
information
user
case
specified
Prior art date
Legal status
Pending
Application number
CN202011298508.6A
Other languages
Chinese (zh)
Inventor
袁璐
Current Assignee
Ping An Consumer Finance Co Ltd
Original Assignee
Ping An Consumer Finance Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Consumer Finance Co Ltd
Priority to CN202011298508.6A
Publication of CN112463599A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3692 Test management for test results analysis

Abstract

The application relates to the field of testing, and provides an automated testing method and apparatus, a computer device, and a storage medium. The method comprises the following steps: receiving an automatic test request submitted by a user; extracting user information from the request, and judging whether the user has the test permission based on the user information; if so, extracting test case information from the request, and screening out a specified test case from a pre-stored case database based on the test case information; obtaining the test steps contained in the specified test case; generating a test program corresponding to the specified test case according to the test steps; acquiring a preset timing test time and the current time, and judging whether the current time is the same as the timing test time; if so, running the test program to generate a corresponding test result. The method can improve the efficiency of automated testing and reduce testing cost. The method and apparatus can also be applied in the blockchain field, where data such as the specified test cases may be stored on a blockchain.

Description

Automatic testing method and device, computer equipment and storage medium
Technical Field
The application relates to the field of testing, in particular to an automatic testing method, an automatic testing device, computer equipment and a storage medium.
Background
In the existing automated testing flow, a tester needs to write test cases in a programming language and carry out the corresponding test work through third-party test software and the written test cases. However, this approach requires testers to use complex and obscure programming languages, which imposes high professional requirements on them and makes test case writing inefficient, resulting in high test case generation costs and low automated-testing efficiency.
Disclosure of Invention
The present application mainly aims to provide an automated testing method and apparatus, a computer device, and a storage medium, so as to solve the technical problems of the existing automated testing approach: testers must write test cases in complex and obscure programming languages, test case writing is inefficient, and consequently test case generation is costly and automated testing is inefficient.
The application provides an automatic testing method, which comprises the following steps:
receiving an automatic test request submitted by a user, wherein the automatic test request carries user information and test case information;
parsing the automatic test request, extracting the user information, and judging whether the user has the test permission based on the user information;
if the user has the test permission, extracting the test case information from the automatic test request, and screening out a specified test case corresponding to the test case information from a pre-stored case database based on the test case information, wherein the specified test case is generated by writing based on a specified language, and the specified language does not include a programming language;
obtaining the test steps contained in the specified test case;
generating a test program corresponding to the specified test case according to the test step;
acquiring preset timing test time, acquiring current time, and judging whether the current time is the same as the timing test time;
and if the current time is the same as the timing test time, running the test program to generate a corresponding test result.
Optionally, the step of determining whether the user has the test permission based on the user information includes:
acquiring service operation information corresponding to the automatic test request;
matching the service operation information based on a preset service operation safety level table, and judging whether the service operation safety level table has specified service operation information which is the same as the service operation information;
if the designated service operation information exists in the service operation safety level table, screening out a safety level coefficient corresponding to the designated service operation information;
judging whether the safety level coefficient is larger than a preset safety level threshold value or not;
if the security level coefficient is larger than the security level threshold, performing identity authentication on the user according to a preset rule to obtain a corresponding identity authentication result;
if the identity authentication result is that the authentication is passed, acquiring a first service permission score corresponding to the user information from a preset user service permission score table based on the user information; and
acquiring a second service permission score corresponding to the specified service operation information;
judging whether the first service permission score is smaller than the second service permission score or not;
if the first service permission score is not smaller than the second service permission score, judging that the user has a test permission;
and if the first service permission score is smaller than the second service permission score, judging that the user does not have the test permission.
Optionally, the step of authenticating the user according to a preset rule to obtain a corresponding authentication result includes:
acquiring a plurality of pieces of pre-stored question data;
displaying the question data based on the use record of the question data, and generating selection prompt information so that a user can select specified question data from all the question data;
after the user finishes selecting, generating answer reminding information so that the user can input feedback answer voice data corresponding to the specified question data;
recognizing the feedback answer voice data to obtain corresponding text information, and judging whether the text information is the same as preset correct answer data or not;
if the text information is the same as the correct answer data, acquiring the reply response time length of the user;
judging whether the response time length is greater than a normal response time length threshold value corresponding to the question data;
if the response time length is greater than the normal response time length threshold, extracting a corresponding voiceprint feature vector to be verified from the feedback answer voice data; and
obtaining a pre-stored authorized voiceprint feature vector corresponding to the correct answer data;
calculating the voiceprint similarity between the voiceprint feature vector to be verified and the authorized voiceprint feature vector;
judging whether the voiceprint similarity is greater than a preset similarity threshold value or not;
and if the voiceprint similarity is greater than the similarity threshold, generating an identity authentication result which passes the authentication, otherwise, generating an identity authentication result which does not pass the authentication.
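The application does not fix a particular measure for the voiceprint similarity in the steps above. As an illustration only, the comparison might be sketched as a cosine similarity over feature vectors; the function names and the threshold value below are placeholders, not part of the claimed method:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length voiceprint feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def authenticate(candidate_vec, enrolled_vec, threshold=0.8):
    # Generate a passing identity authentication result only when the
    # voiceprint similarity exceeds the preset similarity threshold.
    if cosine_similarity(candidate_vec, enrolled_vec) > threshold:
        return "pass"
    return "fail"
```

Any vector-similarity measure (e.g. Euclidean distance) could serve the same role; cosine similarity is merely a common choice for speaker-verification embeddings.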
Optionally, the test case information includes a test case keyword, and the step of screening out the specified test case corresponding to the test case information from a pre-stored case database based on the test case information includes:
based on SQL query statements, querying the case database according to the test case keywords to obtain case files corresponding to the test case keywords;
copying the use case file to obtain a copied use case file copy;
and taking the use case file copy as the specified test case.
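The claims leave the concrete case database open. A minimal sketch of the keyword lookup, assuming a SQLite database with a hypothetical `test_cases` table (the `keyword` and `case_file` column names are illustrative, not from the application):

```python
import sqlite3

def find_case_file(db, keyword):
    # Query the case database with a parameterized SQL statement and
    # return the use-case file matching the test case keyword, if any.
    row = db.execute(
        "SELECT case_file FROM test_cases WHERE keyword = ?", (keyword,)
    ).fetchone()
    return row[0] if row else None

# In-memory stand-in for the pre-stored case database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE test_cases (keyword TEXT, case_file TEXT)")
db.execute("INSERT INTO test_cases VALUES ('login', 'login_case.txt')")
print(find_case_file(db, "login"))  # → login_case.txt
```

Copying the returned file (e.g. with `shutil.copy`) before use, as the claim describes, keeps the stored original untouched.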
Optionally, the step of running the test program to generate a corresponding test result includes:
running the test program, and obtaining the test duration of the test process of running the test program;
judging whether the test duration is smaller than a preset duration threshold or not;
if the test duration is less than the preset duration threshold, generating a corresponding test result;
and if the test duration is not less than the preset duration threshold, generating test feedback information of test failure.
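The duration check above can be sketched as follows; this is an assumption-laden illustration (the dictionary keys and threshold handling are placeholders), not the claimed implementation:

```python
import time

def run_with_duration_limit(test_fn, duration_threshold):
    # Run the test program, measure the test duration, and emit a test
    # result only when the duration stays below the preset threshold;
    # otherwise emit test-failure feedback information.
    start = time.monotonic()
    result = test_fn()
    elapsed = time.monotonic() - start
    if elapsed < duration_threshold:
        return {"status": "completed", "result": result}
    return {"status": "failed", "feedback": "test duration exceeded threshold"}
```

A stricter variant would run the test program in a subprocess with a hard timeout so a hung test cannot block the apparatus.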
Optionally, after the step of running the test program and generating the corresponding test result if the current time is the same as the timing test time, the method includes:
acquiring a preset expected test result;
comparing the test result with the expected test result, and judging whether the test result meets the expected test result;
if the test result meets the expected test result, generating a comparison test result which passes the test;
and if the test result does not meet the expected test result, generating a comparison test result which fails the test.
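The comparison of the test result with the expected test result might be sketched as a field-by-field check; the dictionary representation is an assumption for illustration:

```python
def compare_with_expected(actual, expected):
    # Compare the generated test result with the preset expected result;
    # a mismatch on any expected field yields a failing comparison result.
    mismatches = {
        key: (actual.get(key), value)
        for key, value in expected.items()
        if actual.get(key) != value
    }
    if not mismatches:
        return ("test passed", {})
    return ("test failed", mismatches)
```

Returning the mismatching fields alongside the verdict makes the later test report more informative than a bare pass/fail flag.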
Optionally, after the step of running the test program and generating the corresponding test result if the current time is the same as the timing test time, the method includes:
acquiring an intermediate file generated in a test process of running the test program;
generating a corresponding test report according to the intermediate file, the test result and the comparison test result;
acquiring preset mail login information and acquiring a designated mail address corresponding to a designated user;
logging in a mail server according to the mail login information;
and sending the test report to the specified mail address through the mail server.
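The report-mailing steps above could be sketched with the standard-library `smtplib` and `email.message` modules; the subject line, host, and credential parameters are placeholders, and the actual delivery call is shown but not executed here:

```python
import smtplib
from email.message import EmailMessage

def build_report_message(report_text, sender, recipient):
    # Assemble the test report as a plain-text e-mail message addressed
    # to the designated mail address of the designated user.
    msg = EmailMessage()
    msg["Subject"] = "Automated test report"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(report_text)
    return msg

def send_test_report(msg, host, port, user, password):
    # Log in to the mail server with the preset mail login information
    # and deliver the report (requires a reachable SMTP server).
    with smtplib.SMTP(host, port) as server:
        server.starttls()
        server.login(user, password)
        server.send_message(msg)
```

Separating message construction from delivery keeps the report easy to inspect or archive even when no mail server is available.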
The application also provides an automated testing apparatus, comprising:
the receiving module is used for receiving an automatic test request submitted by a user, wherein the automatic test request carries user information and test case information;
the first judgment module is used for parsing the automatic test request, extracting the user information, and judging whether the user has the test permission based on the user information;
the screening module is used for extracting the test case information from the automatic test request if the user has the test permission, and screening out a specified test case corresponding to the test case information from a pre-stored case database based on the test case information, wherein the specified test case is generated by compiling based on a specified language, and the specified language does not include a programming language;
the first acquisition module is used for acquiring the test steps contained in the specified test case;
the first generation module is used for generating a test program corresponding to the specified test case according to the test step;
the second acquisition module is used for acquiring preset timing test time, acquiring current time and judging whether the current time is the same as the timing test time or not;
and the second generation module is used for running the test program to generate a corresponding test result if the current time is the same as the timing test time.
The present application further provides a computer device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the above method when executing the computer program.
The present application also provides a computer-readable storage medium having a computer program stored thereon which, when executed by a processor, implements the steps of the above method.
The automated testing method, the automated testing device, the computer equipment and the storage medium have the following beneficial effects:
according to the automatic test method, the automatic test device, the computer equipment and the storage medium, when an automatic test request submitted by a user is received, whether the user has the test authority or not can be verified based on user information carried in the automatic test request, if the user has the test authority, a specified test case generated by compiling a tester in a specified language without a programming language can be quickly extracted from a preset case database based on the test information carried in the automatic test request, and a test program which can be automatically executed by the automatic test device can be quickly and intelligently generated according to the specified test case, so that the tester can finish the automatic test without mastering a complex programming language, and the tester can place the work gravity center in the design of the test case, thereby effectively improving the test efficiency, the test authority and the test safety of the automatic test, The test cost of the automatic test is reduced. In addition, test case data contained in the case database is compiled and generated by testers by using a specified language familiar to the testers, so that the test intention of the users can be intuitively and clearly embodied, the generation efficiency of the test cases is effectively improved, and the working efficiency of the testers is improved.
Drawings
FIG. 1 is a schematic flow chart of an automated testing method according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an automated testing apparatus according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Referring to fig. 1, an automated testing method according to an embodiment of the present application includes:
S1: receiving an automatic test request submitted by a user, wherein the automatic test request carries user information and test case information;
S2: parsing the automatic test request, extracting the user information, and judging whether the user has the test permission based on the user information;
S3: if the user has the test permission, extracting the test case information from the automatic test request, and screening out a specified test case corresponding to the test case information from a pre-stored case database based on the test case information, wherein the specified test case is written in a specified language, and the specified language does not include a programming language;
S4: obtaining the test steps contained in the specified test case;
S5: generating a test program corresponding to the specified test case according to the test steps;
S6: acquiring a preset timing test time, acquiring the current time, and judging whether the current time is the same as the timing test time;
S7: if the current time is the same as the timing test time, running the test program to generate a corresponding test result.
As described in steps S1-S7 above, the execution subject of the method is an automated testing apparatus. In practical applications, the apparatus may be implemented as a virtual device, such as software code, or as a physical device in which the relevant execution code is written or integrated, and it may interact with a user through a keyboard, a mouse, a remote controller, a touch panel, or a voice-controlled device. The automated testing apparatus of this embodiment can generate, from a test case written by a tester in a specified language that does not include a programming language (such as Chinese), a test program that the apparatus can execute automatically, so that the tester can complete automated testing without mastering a complex programming language, effectively improving testing efficiency. Specifically, an automatic test request submitted by a user is received, wherein the automatic test request carries user information and test case information. The automatic test request is generated when the user needs to carry out an automated test; the user information is the information used to verify the user's test permission; and the test case information identifies the specified test case required for the automated test and may include a test case keyword. The automatic test request is then parsed, the user information is extracted, and whether the user has the test permission is judged based on the user information. For example, the permission score corresponding to the user information and the permission score corresponding to the automatic test request may be obtained, and the two scores compared numerically to judge whether the user has the test permission.
If the user has the test permission, the test case information is extracted from the automatic test request, and the specified test case corresponding to the test case information is screened out from the pre-stored case database, wherein the specified test case is written in a specified language that does not include a programming language. Specifically, the specified language may be any natural language, i.e. a language that has evolved naturally with culture, such as Chinese, Japanese, Korean, or English. In this embodiment the specified language is preferably Chinese, and the automated testing method of the present application is described below taking Chinese as an example; however, the specified language is not limited to Chinese, and other natural languages such as Japanese or Korean may also be used. In addition, the test case information includes a test case keyword, and the case database may be queried according to the test case keyword by means of an SQL query statement, so as to obtain the case file corresponding to the keyword and use it as the specified test case. Because the test case data contained in the case database is written by testers in a specified language familiar to them, the user's test intention is embodied intuitively and clearly. After the specified test case is obtained, the test steps contained in it are obtained, and a test program corresponding to the specified test case is generated according to those test steps.
The specified test case is written by a tester in the Chinese language according to the actual test purpose for the target test object; for example, the test purpose may be to verify that the target software functions normally. The specified test case contains at least one test step describing a test behavior, i.e. a test operation to be performed on the target test object. Since the specified test case is written in Chinese, most of its characters, apart from punctuation, are Chinese. In addition, the test steps in the specified test case are written according to a preset writing rule, which facilitates subsequent parsing. The writing rule requires each test step to match a preset step pattern, and can be set flexibly by testers according to actual needs. A step pattern may be a regular expression based on the Chinese language, that is, a regular expression in which, apart from the special characters of regular-expression syntax, all other characters are Chinese; for example, the writing rule may take regular forms such as 'enter …', 'execute …', or 'click …'. After the test steps contained in the specified test case are obtained, they can be converted into a corresponding test program by a conversion tool, and once the test program is obtained it can be run to carry out the automated test corresponding to the specified test case. The conversion tool is not specifically limited and may be selected according to implementation requirements; for example, the Javassist tool or the ASM tool may be used, which will not be described further here.
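The pattern-matching conversion described above can be sketched as follows. This is only an illustration under stated assumptions: the patterns are English stand-ins for the application's Chinese-language regular expressions, and `driver.click`, `driver.type`, and `driver.run` are hypothetical names for whatever actions the generated test program performs:

```python
import re

# Hypothetical step patterns; the writing rules in the application are
# Chinese-language regular expressions such as "点击…" ("click …").
STEP_PATTERNS = [
    (re.compile(r"^click (.+)$"), lambda t: f"driver.click({t!r})"),
    (re.compile(r"^enter (.+) into (.+)$"),
     lambda v, f: f"driver.type({f!r}, {v!r})"),
    (re.compile(r"^execute (.+)$"), lambda c: f"driver.run({c!r})"),
]

def steps_to_program(steps):
    # Match each natural-language test step against the preset step
    # patterns and emit one line of test code per step; a step that
    # matches no writing rule is rejected.
    lines = []
    for step in steps:
        for pattern, emit in STEP_PATTERNS:
            match = pattern.match(step)
            if match:
                lines.append(emit(*match.groups()))
                break
        else:
            raise ValueError(f"step does not match any writing rule: {step}")
    return "\n".join(lines)
```

Keeping the pattern table as data means testers can extend the writing rules without touching the conversion logic, which matches the application's claim that the rules can be set flexibly.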
Then a preset timing test time and the current time are acquired, and whether the current time is the same as the timing test time is judged. If they are the same, the test program is run to generate a corresponding test result. The timing test time is not specifically limited and may be set according to actual requirements. By setting a timing test time, the automated test can be executed according to plan: the test based on the specified test case is carried out only after the current time reaches the timing test time, which effectively prevents the test flow corresponding to the specified test case from interfering with other ongoing test flows and improves the intelligence of the automated test. According to this embodiment, a test program that the automated testing apparatus can execute automatically is generated quickly and intelligently from a test case written by a tester in a specified language that does not include a programming language, so the tester can complete automated testing without mastering a complex programming language and can focus on the design of the test cases, which effectively improves testing efficiency and reduces testing cost. In addition, because the test case is written by the tester in a specified language familiar to the tester, the user's test intention is reflected intuitively and clearly, the generation efficiency of test cases is effectively improved, and the working efficiency of testers is improved.
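The timing comparison might be sketched as below; the tolerance window is an assumption added for illustration, since comparing two clock readings for exact equality would usually miss the scheduled second:

```python
import datetime

def should_run_now(scheduled, now=None, tolerance_seconds=60):
    # Judge whether the current time has reached the preset timing test
    # time, within a small tolerance window.
    now = now or datetime.datetime.now()
    return abs((now - scheduled).total_seconds()) <= tolerance_seconds
```

In production, a scheduler (cron, a task queue, or a sleeping loop) would poll this predicate rather than comparing timestamps once.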
Further, in an embodiment of the application, the step of determining whether the user has the test authority based on the user information in the step S2 includes:
S200: acquiring service operation information corresponding to the automatic test request;
S201: matching the service operation information based on a preset service operation security level table, and judging whether specified service operation information identical to the service operation information exists in the service operation security level table;
S202: if the specified service operation information exists in the service operation security level table, screening out a security level coefficient corresponding to the specified service operation information;
S203: judging whether the security level coefficient is greater than a preset security level threshold;
S204: if the security level coefficient is greater than the security level threshold, performing identity authentication on the user according to a preset rule to obtain a corresponding identity authentication result;
S205: if the identity authentication result is that the authentication is passed, acquiring a first service permission score corresponding to the user information from a preset user service permission score table based on the user information; and
S206: acquiring a second service permission score corresponding to the specified service operation information;
S207: judging whether the first service permission score is smaller than the second service permission score;
S208: if the first service permission score is not smaller than the second service permission score, judging that the user has the test permission;
S209: if the first service permission score is smaller than the second service permission score, judging that the user does not have the test permission.
As described in steps S200 to S209 above, after the automatic test request submitted by the user is received, test permission verification needs to be performed on the user, so as to ensure that the user has the permission to execute the automated test, to avoid the adverse consequences of responding to an automatic test request entered by an unauthorized user, and to ensure safety during the processing of the request. Specifically, the step of judging whether the user has the test permission based on the user information may include: first, the service operation information corresponding to the automatic test request is acquired. Then the service operation information is matched against a preset service operation security level table, and it is judged whether specified service operation information identical to the service operation information exists in that table. The service operation security level table pre-stores security level information in one-to-one correspondence with each service operation. If the table does not record specified service operation information identical to the service operation information, the service operation is one that does not need permission verification; if such specified service operation information exists, the service operation needs permission verification. In the latter case, the security level coefficient corresponding to the specified service operation information is screened out, and it is judged whether that coefficient is greater than a preset security level threshold. The security level coefficient may be divided into several levels, a higher level indicating a higher security level.
The security level threshold is the threshold value that determines whether identity authentication is required. If the security level coefficient of the specified service operation information is greater than the security level threshold, the specified service operation requires both identity authentication and permission verification, so further identity authentication of the user is needed; if the coefficient is not greater than the threshold, the specified service operation requires only permission verification and the user need not be subsequently authenticated. If the security level coefficient is greater than the security level threshold, the user is authenticated according to a preset rule to obtain a corresponding identity authentication result. The preset rule is not specifically limited: the user may be authenticated by judging whether the user information exists in a preset identity information database, or by face recognition, liveness detection, and the like. If the authentication passes, a first service permission score corresponding to the user information is acquired from a preset user service permission score table based on the user information, and a second service permission score corresponding to the specified service operation information is acquired. The user service permission score table records user role information and the service permission score corresponding one-to-one to each user role, the score corresponding to the service permissions the user role can operate.
In addition, the second service permission score can be obtained from a preset service operation permission score table, which records a service permission score in one-to-one correspondence with each service operation. It is then determined whether the first service permission score is smaller than the second service permission score. If the first service permission score is not smaller than the second, the user is determined to have the test permission; if it is smaller, the user is determined not to have the test permission, and an error message indicating insufficient test permission is returned. In this embodiment, the test permission verification is performed on the user by combining several verification mechanisms, namely the service operation security level table, the user service permission score table, and the service operation permission score table, which effectively ensures that only a user holding the test permission corresponding to the automatic test request is responded to, and improves data security during processing of the automatic test request.
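As an illustrative sketch of the permission flow above, the security level lookup and the two-score comparison might be expressed as follows. The table contents, role names, scores, and thresholds here are hypothetical examples chosen for the sketch, not values taken from this application:

```python
# Hypothetical sketch of the permission check in steps S200-S209.
# All table contents and thresholds below are illustrative assumptions.

SECURITY_LEVEL_TABLE = {"run_regression_suite": 3, "view_report": 1}
SECURITY_LEVEL_THRESHOLD = 2   # above this, identity verification is also required
USER_PERMISSION_SCORES = {"tester": 5, "guest": 1}          # per user role
OPERATION_PERMISSION_SCORES = {"run_regression_suite": 4, "view_report": 1}

def has_test_permission(role, operation, identity_verified):
    level = SECURITY_LEVEL_TABLE.get(operation)
    if level is None:          # operation not in the table: no verification needed
        return True
    if level > SECURITY_LEVEL_THRESHOLD and not identity_verified:
        return False           # identity verification required but not passed
    first_score = USER_PERMISSION_SCORES.get(role, 0)       # user's score
    second_score = OPERATION_PERMISSION_SCORES[operation]   # operation's score
    return first_score >= second_score   # "not smaller than" grants permission
```

For example, a "guest" role would be allowed the low-level "view_report" operation but refused "run_regression_suite", whose operation score exceeds the guest's role score.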
Further, in an embodiment of the application, the step S204 includes:
s2040: acquiring a plurality of pre-stored questioning data;
s2041: displaying the question data based on the use record of the question data, and generating selection prompt information so that a user can select specified question data from all the question data;
s2042: after the user finishes selecting, generating answer reminding information so that the user can input feedback answer voice data corresponding to the specified question data;
s2043: recognizing the feedback answer voice data to obtain corresponding text information, and judging whether the text information is the same as preset correct answer data or not;
s2044: if the text information is the same as the correct answer data, acquiring the reply response time length of the user;
s2045: judging whether the response time length is greater than a normal response time length threshold value corresponding to the question data;
s2046: if the response time length is greater than the normal response time length threshold, extracting a corresponding voiceprint feature vector to be verified from the feedback answer voice data; and the number of the first and second groups,
s2047: obtaining a pre-stored authorized voiceprint feature vector corresponding to the correct answer data;
s2048: calculating the voiceprint similarity between the voiceprint feature vector to be verified and the authorized voiceprint feature vector;
s2049: judging whether the voiceprint similarity is greater than a preset similarity threshold value or not;
s2050: and if the voiceprint similarity is greater than the similarity threshold, generating an identity authentication result which passes the authentication, otherwise, generating an identity authentication result which does not pass the authentication.
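The transcript comparison and response-duration check of steps S2043 to S2046 above might be sketched as follows; the function name, the two-second threshold, and the timestamps are illustrative assumptions, and a long response duration simply flags the reply for the additional voiceprint verification:

```python
# Hypothetical sketch of steps S2043-S2045: compare the recognized answer text
# with the correct answer, then decide from the response duration whether the
# voiceprint check of steps S2046-S2050 is needed. Threshold is illustrative.

NORMAL_RESPONSE_THRESHOLD = 2.0   # seconds; assumed normal response duration

def check_answer(recognized_text, correct_answer, shown_at, answered_at):
    """Return (text_ok, needs_voiceprint) for one challenge question."""
    text_ok = recognized_text.strip() == correct_answer.strip()
    response_duration = answered_at - shown_at   # second time minus first time
    needs_voiceprint = response_duration > NORMAL_RESPONSE_THRESHOLD
    return text_ok, needs_voiceprint
```

A quick answer with matching text would pass directly, while a correct but slow answer would proceed to the voiceprint comparison.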
As described in the foregoing steps S2040 to S2050, the step of performing identity verification on the user according to the preset rule to obtain a corresponding identity verification result may specifically include the following. First, a plurality of pre-stored question data are obtained. The question data may be a plurality of questions previously entered by the target user corresponding to the user information in order to support the identity verification procedure, with correct answer voice data spoken by the target user entered for each question at the same time. Then, the question data are displayed based on their use records, and selection prompt information is generated so that the user can select specified question data from among all the question data. The question data may be arranged in descending order of use count, so that the user sees the most frequently used questions first and can select a question that has often been answered before; such a question can be answered quickly and accurately, which improves the efficiency of the identity verification. After the user finishes selecting, answer reminder information is generated so that the user can input feedback answer voice data corresponding to the specified question data. After the feedback answer voice data is obtained, it is recognized to obtain corresponding text information, and it is determined whether the text information is the same as preset correct answer data; the text information corresponding to the feedback answer voice data can be recognized with existing speech recognition technology. If the text information is the same as the correct answer data, the response duration of the user's reply is obtained, and it is determined whether that duration is greater than the normal response duration threshold corresponding to the question data.
The step of calculating the user's response duration may include: obtaining a first time at which the specified question data is displayed on the current page; obtaining a second time at which the user inputs the feedback answer voice data; calculating the difference between the second time and the first time; and determining that difference as the response duration. In other words, the response duration is the time the user takes to answer the question after the content of the question data has been presented. The normal response duration threshold may be set based on empirical data; for example, feedback data for a number of normal response situations may be collected and their response durations aggregated to determine the threshold. If the response duration is greater than the normal response duration threshold, the corresponding voiceprint feature vector to be verified is extracted from the feedback answer voice data, and the pre-stored authorized voiceprint feature vector corresponding to the correct answer data is obtained. Both vectors can be extracted with an existing voiceprint extraction algorithm. The voiceprint similarity between the voiceprint feature vector to be verified and the authorized voiceprint feature vector is then calculated, and it is determined whether this similarity is greater than a preset similarity threshold. The method of calculating the voiceprint similarity is not particularly limited. Preferably, the similarity can be calculated with a distance-based formula such as
sim(a, b) = (a · b) / (‖a‖ ‖b‖)
where a is the voiceprint feature vector to be verified and b is the authorized voiceprint feature vector. The similarity threshold is likewise not particularly limited and may be set according to actual requirements, for example to 0.95. If the voiceprint similarity is greater than the similarity threshold, an identity verification result of verification passed is generated; otherwise, an identity verification result of verification failed is generated. By combining several verification modes, namely challenge questions, response-duration verification, and voiceprint verification, this embodiment achieves accurate identity verification of the user, ensures the adaptability, accuracy, and reliability of that verification, avoids the adverse consequences of responding to an automatic test request input by an unauthorized user, and ensures safety during processing of the automatic test request.
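Assuming the similarity measure is cosine similarity between the two feature vectors (the short example vectors below are placeholders; real voiceprint embeddings would come from a voiceprint extraction model), the comparison of steps S2048 to S2050 might be sketched as:

```python
import math

SIMILARITY_THRESHOLD = 0.95   # example value suggested in the text

def voiceprint_similarity(a, b):
    """Cosine similarity between voiceprint feature vectors a and b."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify_voiceprint(to_verify, authorized):
    """Verification passes only when similarity exceeds the threshold."""
    return voiceprint_similarity(to_verify, authorized) > SIMILARITY_THRESHOLD
```

Identical vectors score 1.0 and pass; orthogonal vectors score 0.0 and fail, mirroring the pass/fail branch of step S2050.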
Further, in an embodiment of the present application, the test case information includes a test case keyword, and the step of screening out the specified test case corresponding to the test case information from a pre-stored case database based on the test case information in the step S3 includes:
s300: based on SQL query statements, querying the case database according to the test case keywords to obtain case files corresponding to the test case keywords;
s301: copying the use case file to obtain a copied use case file copy;
s302: and taking the use case file copy as the specified test case.
As described in the foregoing steps S300 to S302, the test case information includes a test case keyword, and the step of screening out the specified test case corresponding to the test case information from the pre-stored case database based on the test case information may specifically include the following. First, based on an SQL query statement, the case database is queried according to the test case keyword to obtain the use case file corresponding to that keyword. A connection with the pre-stored case database may first be established, that is, a connection handle to the case database is obtained and the database is connected through that handle. Once the connection succeeds, a specific SQL query statement is constructed and the corresponding function is called to execute it. The SQL query statement is a select statement, for example of the form select ... from .... By querying the case database with this SQL statement according to the test case keyword, the use case file corresponding to the keyword can be retrieved from the case database, ensuring both the efficiency and the accuracy of the query. In addition, the test cases can be classified by keyword when the case database is built: after a mapping relation is established between each test case and its case keyword, the test cases are stored in a preset database, completing the creation of the case database. The use case keywords may include item attribute keywords, summary keywords, step keywords, and the like. The use case file is then copied to obtain a use case file copy.
After the use case file is obtained, a preset blank document may be obtained, and the full content of the queried use case file is copied into that blank document to generate the use case file copy. Finally, once the use case file copy is obtained, it is taken as the specified test case. In this embodiment, based on the test case keyword carried in the automatic test request, the SQL query statement is used to rapidly retrieve the specified test case required for the automatic test from the preset case database, so that the corresponding automatic test flow can be completed according to the specified test case, improving the efficiency of the automatic test.
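A minimal sketch of the keyword query in steps S300 to S302, using Python's built-in sqlite3 in place of whatever database the application actually targets; the table name, columns, and rows are hypothetical:

```python
import sqlite3

# Build an in-memory case database for the sketch; schema and rows are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (keyword TEXT, case_file TEXT)")
conn.execute("INSERT INTO cases VALUES ('login', 'login_case.yaml')")
conn.commit()

def fetch_case_file(connection, keyword):
    """Select the use case file matching the test case keyword."""
    cur = connection.execute(
        "SELECT case_file FROM cases WHERE keyword = ?", (keyword,))
    row = cur.fetchone()
    return row[0] if row else None   # None when no case matches the keyword
```

The parameterized `?` placeholder stands in for the select statement built in step S300; the returned file would then be copied to serve as the specified test case.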
Further, in an embodiment of the present application, the step S7 includes:
s700: running the test program, and obtaining the test duration of the test process of running the test program;
s701: judging whether the test duration is smaller than a preset duration threshold or not;
s702: if the test duration is less than the preset duration threshold, generating a corresponding test result;
s703: and if the test duration is not less than the preset duration threshold, generating test feedback information of test failure.
As described in the foregoing steps S700 to S703, the step of running the test program to generate a corresponding test result may specifically include the following. First, the test program is run, and the test duration of the test process is obtained. The test duration can be calculated from the test start time and the test end time, that is, test duration = test end time - test start time, and whether the current automatic test proceeded smoothly can then be judged from this duration. It is then determined whether the test duration is smaller than a preset duration threshold; the value of this threshold is not specifically limited and can be set according to actual requirements. If the test duration is smaller than the preset duration threshold, the test flow succeeded, and a corresponding test result is generated. If the test duration is not smaller than the threshold, the test flow failed, and test feedback information indicating test failure is generated. In this embodiment, by calculating the test duration of the running test program and comparing it with the preset duration threshold, either the corresponding test result or test feedback information indicating failure can be generated according to the comparison, so that the actual state of the automatic test can be known in time from the test result or the feedback information.
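The duration check of steps S700 to S703 can be sketched as follows; the five-second threshold, the return-value shape, and the callable test program are illustrative assumptions:

```python
import time

DURATION_THRESHOLD = 5.0   # seconds; illustrative value for the duration threshold

def run_with_timing(test_program):
    """Run the test program; report a result only if it finishes in time."""
    start = time.monotonic()                 # test start time
    result = test_program()
    duration = time.monotonic() - start      # end time minus start time
    if duration < DURATION_THRESHOLD:
        return {"status": "ok", "result": result, "duration": duration}
    return {"status": "test failed", "duration": duration}   # feedback info
```

`time.monotonic()` is used rather than wall-clock time so the difference cannot go backwards if the system clock is adjusted mid-test.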
Further, in an embodiment of the present application, after the step S7, the method includes:
s710: acquiring a preset expected test result;
s711: comparing the test result with the expected test result, and judging whether the test result meets the expected test result;
s712: if the test result meets the expected test result, generating a comparison test result which passes the test;
s713: and if the test result does not meet the expected test result, generating a comparison test result which fails the test.
As described in the foregoing steps S710 to S713, after the step of running the test program and generating the corresponding test result when the current time is the same as the timed test time, the method may further include the following. A preset expected test result is obtained; it can be derived from the specified test case and is the standard result the test program should achieve under normal operation. The test result is then compared with the expected test result to determine whether it satisfies the expectation: if the value of the test result falls within the range of the expected test result, the test result satisfies the expected result, and otherwise it does not. If the test result satisfies the expected test result, a comparison test result of test passed is generated; if it does not, a comparison test result of test failed is generated. In this embodiment, by comparing the result obtained from running the test program with the expected result, any deviation between the two can be found quickly and the corresponding comparison test result generated, so that the actual state of the automatic test can be accurately known from the comparison test result.
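Taking the expected test result as a numeric range, as the paragraph above describes, the comparison of steps S710 to S713 reduces to a small function; the range representation and return strings are illustrative assumptions:

```python
def compare_with_expected(test_result, expected_range):
    """Pass when the test result falls within the expected result range."""
    low, high = expected_range      # assumed (min, max) form of the expectation
    if low <= test_result <= high:
        return "test passed"        # comparison test result: within range
    return "test failed"            # comparison test result: outside range
```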
Further, in an embodiment of the present application, after the step S7, the method includes:
s720: acquiring an intermediate file generated in a test process of running the test program;
s721: generating a corresponding test report according to the intermediate file, the test result and the comparison test result;
s722: acquiring preset mail login information and acquiring a designated mail address corresponding to a designated user;
s723: logging in a mail server according to the mail login information;
s724: and sending the test report to the specified mail address through the mail server.
As described in the foregoing steps S720 to S724, after the step of running the test program and generating the corresponding test result when the current time is the same as the timed test time, the method may further include the following. The intermediate file generated while the test program runs is obtained, and a corresponding test report is generated from the intermediate file, the test result, and the comparison test result: filling these into the corresponding positions of a test report template produces the test report for the automatic test. Preset mail login information is then obtained, together with the designated mail address corresponding to the designated user. Once both are obtained, the mail server is logged in to according to the mail login information, and the test report is finally sent to the designated mail address through the mail server. In this embodiment, after the test program runs, a test report containing the intermediate file, the test result, and the comparison test result is generated. In addition, to allow the relevant users to analyze and act on the automatic test remotely, the system can automatically log in to the mail server and send the test report to the preset designated mail address, so that those users learn of the test situation in the automatic test process in time, discover test problems, and further improve the efficiency of the automatic test.
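The report-mailing flow of steps S722 to S724 might look like the following smtplib sketch. The server address, credentials, and mail addresses are placeholders, and the sending step is isolated in its own function so that nothing connects to a network unless it is called:

```python
import smtplib
from email.message import EmailMessage

def build_report_message(report_text, from_address, to_address):
    """Assemble the test report as an email message."""
    msg = EmailMessage()
    msg["Subject"] = "Automated test report"
    msg["From"] = from_address
    msg["To"] = to_address          # the designated mail address
    msg.set_content(report_text)
    return msg

def send_test_report(msg, smtp_host, smtp_port, login_user, login_password):
    """Log in to the mail server with the stored login information and send."""
    with smtplib.SMTP(smtp_host, smtp_port) as server:
        server.starttls()                         # encrypt before logging in
        server.login(login_user, login_password)  # preset mail login information
        server.send_message(msg)
```

Usage would be `send_test_report(build_report_message(report, "ci@example.com", "qa@example.com"), "smtp.example.com", 587, user, password)`, with every address above being a placeholder.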
The automated testing method in the embodiment of the application can also be applied to the field of block chains, for example, data such as the specified test case is stored on the block chain. By using the block chain to store and manage the specified test case, the safety and the non-tamper property of the specified test case can be effectively ensured.
The block chain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, a consensus mechanism and an encryption algorithm. A block chain (Blockchain), which is essentially a decentralized database, is a series of data blocks associated by using a cryptographic method, and each data block contains information of a batch of network transactions, so as to verify the validity (anti-counterfeiting) of the information and generate a next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
The blockchain underlying platform can comprise processing modules such as user management, basic service, intelligent contract, and operation monitoring. The user management module is responsible for identity information management of all blockchain participants, including maintenance of public/private key generation (account management), key management, and maintenance of the correspondence between a user's real identity and blockchain address (permission management); under authorization, it supervises and audits the transactions of certain real identities and provides rule configuration for risk control (risk audit). The basic service module is deployed on all blockchain node devices to verify the validity of service requests and, after consensus on a valid request is reached, record it to storage; for a new service request, the basic service first performs interface adaptation parsing and authentication (interface adaptation), encrypts the service information through a consensus algorithm (consensus management), transmits it completely and consistently to the shared ledger (network communication), and records it for storage. The intelligent contract module is responsible for contract registration and issuance, contract triggering, and contract execution; developers can define contract logic through a programming language, publish it to the blockchain (contract registration), and, triggered by keys or other events, execute the logic according to the contract clauses, with functions for upgrading and cancelling contracts also provided. The operation monitoring module is mainly responsible for deployment, configuration modification, contract setting, and cloud adaptation during product release, and for visual output of real-time states in product operation, such as alarms, monitoring of network conditions, and monitoring of node device health status.
Referring to fig. 2, an embodiment of the present application further provides an automatic testing apparatus, including:
a receiving module 1, configured to receive an automatic test request submitted by a user, where the automatic test request carries user information and test case information;
the first judgment module 2 is used for parsing the automatic test request, extracting the user information, and judging whether the user has the test authority or not based on the user information;
the screening module 3 is configured to extract the test case information from the automated test request if the user has a test permission, and screen out a specified test case corresponding to the test case information from a pre-stored case database based on the test case information, where the specified test case is generated by writing based on a specified language, and the specified language does not include a programming language;
a first obtaining module 4, configured to obtain the test steps included in the specified test case;
the first generating module 5 is used for generating a test program corresponding to the specified test case according to the test step;
the second obtaining module 6 is configured to obtain a preset timing test time, obtain a current time, and determine whether the current time is the same as the timing test time;
and the second generating module 7 is configured to run the test program to generate a corresponding test result if the current time is the same as the timing test time.
In this embodiment, the implementation processes of the functions and functions of the receiving module, the first determining module, the screening module, the first obtaining module, the first generating module, the second obtaining module, and the second generating module in the automatic testing apparatus are specifically detailed in the implementation processes corresponding to steps S1 to S7 in the automatic testing method, and are not described herein again.
Further, in an embodiment of the application, the first determining module includes:
the first acquisition unit is used for acquiring service operation information corresponding to the automatic test request;
a first judging unit, configured to perform matching processing on the service operation information based on a preset service operation security level table, and judge whether designated service operation information identical to the service operation information exists in the service operation security level table;
the screening unit is used for screening out a safety level coefficient corresponding to the specified business operation information if the specified business operation information exists in the business operation safety level table;
the second judgment unit is used for judging whether the safety level coefficient is larger than a preset safety level threshold value or not;
the verification unit is used for verifying the identity of the user according to a preset rule if the security level coefficient is greater than the security level threshold value to obtain a corresponding identity verification result;
a second obtaining unit, configured to obtain, based on the user information, a first service permission score corresponding to the user information from a preset user service permission score table if the identity authentication result is that the authentication passes; and the number of the first and second groups,
a third obtaining unit, configured to obtain a second service permission score corresponding to the specified service operation information;
a third judging unit, configured to judge whether the first service permission score is smaller than the second service permission score;
the first determining unit is used for determining that the user has the test authority if the first service permission score is not smaller than the second service permission score;
and the second determining unit is used for determining that the user does not have the test authority if the first service permission score is smaller than the second service permission score.
In this embodiment, the implementation processes of the functions and functions of the first obtaining unit, the first determining unit, the screening unit, the second determining unit, the verifying unit, the second obtaining unit, the third determining unit, the first determining unit and the second determining unit in the automatic testing apparatus are specifically described in the implementation processes corresponding to steps S200 to S209 in the automatic testing method, and are not described herein again.
Further, in an embodiment of the present application, the verification unit includes:
the first acquiring subunit is used for acquiring a plurality of pre-stored questioning data;
the display subunit is used for displaying the question data based on the use record of the question data and generating selection prompt information so that a user can select specified question data from all the question data;
the first generating subunit is used for generating answer reminding information after the user finishes selecting so that the user can input feedback answer voice data corresponding to the specified question data;
the first judgment subunit is used for identifying the feedback answer voice data to obtain corresponding text information and judging whether the text information is the same as preset correct answer data or not;
the second obtaining subunit is configured to obtain a response duration of the response of the user if the text information is the same as the correct answer data;
the second judgment subunit is configured to judge whether the response time length is greater than a normal response time length threshold corresponding to the question data;
the extraction subunit is configured to extract a corresponding voiceprint feature vector to be verified from the feedback answer voice data if the response time length is greater than the normal response time length threshold; and the number of the first and second groups,
a third obtaining subunit, configured to obtain a pre-stored authorized voiceprint feature vector corresponding to the correct answer data;
the calculation subunit is configured to calculate a voiceprint similarity between the to-be-verified voiceprint feature vector and the authorized voiceprint feature vector;
a third judging subunit, configured to judge whether the voiceprint similarity is greater than a preset similarity threshold;
and the second generation subunit is used for generating an identity verification result which passes the verification if the voiceprint similarity is greater than the similarity threshold, and otherwise, generating an identity verification result which does not pass the verification.
In this embodiment, the implementation processes of the functions and actions of the first obtaining subunit, the displaying subunit, the first generating subunit, the first judging subunit, the second obtaining subunit, the second judging subunit, the extracting subunit, the third obtaining subunit, the calculating subunit, the third judging subunit and the second generating subunit in the automatic testing apparatus are specifically detailed in the implementation processes corresponding to steps S2040 to S2050 in the automatic testing method, and are not described herein again.
Further, in an embodiment of the present application, the screening module includes:
the query unit is used for querying the case database according to the test case keywords based on SQL query statements to obtain case files corresponding to the test case keywords;
the copying unit is used for copying the use case file to obtain a copied use case file copy;
and the determining unit is used for taking the use case file copy as the specified test case.
In this embodiment, the implementation processes of the functions and functions of the query unit, the copy unit, and the determination unit in the automatic testing apparatus are specifically detailed in the implementation processes corresponding to steps S300 to S302 in the automatic testing method, and are not described herein again.
Further, in an embodiment of the application, the second generating module includes:
the running unit is used for running the test program and acquiring the test duration of the test process of running the test program;
the fourth judging unit is used for judging whether the test duration is smaller than a preset duration threshold or not;
a third determination unit, configured to generate a corresponding test result if the test duration is less than the preset duration threshold;
and the fourth judging unit is used for generating test feedback information of test failure if the test duration is not less than the preset duration threshold.
In this embodiment, the implementation processes of the functions and functions of the operation unit, the fourth determination unit, the third determination unit and the fourth determination unit in the automatic testing apparatus are specifically detailed in the implementation processes corresponding to steps S700 to S703 in the automatic testing method, and are not described herein again.
Further, in an embodiment of the present application, the automatic testing apparatus includes:
the third acquisition module is used for acquiring a preset expected test result;
the comparison module is used for comparing the test result with the expected test result and judging whether the test result meets the expected test result;
a third generating module, configured to generate a comparison test result that passes the test if the test result meets the expected test result;
and the fourth generation module is used for generating a comparison test result which fails the test if the test result does not meet the expected test result.
In this embodiment, the implementation processes of the functions and functions of the third obtaining module, the comparing module, the third generating module and the fourth generating module in the automatic testing apparatus are specifically detailed in the implementation processes corresponding to steps S710 to S713 in the automatic testing method, and are not described herein again.
Further, in an embodiment of the present application, the automatic testing apparatus includes:
the fourth acquisition module is used for acquiring an intermediate file generated in the test process of running the test program;
a fifth generating module, configured to generate a corresponding test report according to the intermediate file, the test result, and the comparison test result;
the fifth acquisition module is used for acquiring preset mail login information and acquiring a designated mail address corresponding to a designated user;
the login module is used for logging in the mail server according to the mail login information;
and the sending module is used for sending the test report to the specified mail address through the mail server.
In this embodiment, the implementation processes of the functions and functions of the fourth obtaining module, the fifth generating module, the fifth obtaining module, the login module and the sending module in the automatic testing apparatus are specifically detailed in the implementation processes corresponding to steps S720 to S724 in the automatic testing method, and are not described herein again.
Referring to fig. 3, in an embodiment of the present application, a computer device is also provided. The computer device may be a server or a terminal, and its internal structure may be as shown in fig. 3. The computer device comprises a processor, a memory, a network interface, a display screen, an input device and a database which are connected through a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device comprises a storage medium and an internal memory; the storage medium stores an operating system, a computer program and a database, and the internal memory provides an environment in which the operating system and the computer program in the storage medium run. The database of the computer device stores user information, test case information, specified test cases, test steps, test programs and test results. The network interface of the computer device communicates with an external terminal through a network connection. The display screen of the computer device is an image-text output device that converts digital signals into optical signals so that characters and figures are displayed on the screen. The input device of the computer device is the main means by which information is exchanged between the computer and the user or other equipment, and transmits data, instructions and mark information to the computer. The computer program, when executed by the processor, implements an automated testing method.
The processor executes the steps of the automated testing method:
receiving an automatic test request submitted by a user, wherein the automatic test request carries user information and test case information;
analyzing the automated test request, extracting the user information, and judging whether the user has the test authority or not based on the user information;
if the user has the test permission, extracting the test case information from the automatic test request, and screening out a specified test case corresponding to the test case information from a pre-stored case database based on the test case information, wherein the specified test case is generated by writing based on a specified language, and the specified language does not include a programming language;
obtaining the test steps contained in the specified test case;
generating a test program corresponding to the specified test case according to the test step;
acquiring preset timing test time, acquiring current time, and judging whether the current time is the same as the timing test time;
and if the current time is the same as the timing test time, running the test program to generate a corresponding test result.
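Taken together, the steps executed by the processor amount to the flow sketched below. Every name here (`CASE_DB`, `authorized_users`, the "step N" program format) is hypothetical, since the disclosure does not fix a concrete data model or API for the case database or the generated test program.

```python
# Illustrative flow: permission check -> screening the specified test
# case -> generating a test program from plain-language test steps ->
# timed execution. All identifiers are assumptions for illustration.

CASE_DB = {  # pre-stored case database keyed by test case information
    "login-case": ["open login page", "enter credentials", "submit form"],
}

def handle_test_request(user, case_key, authorized_users, timing, now):
    if user not in authorized_users:        # judge test authority
        return {"error": "no test permission"}
    steps = CASE_DB.get(case_key)           # screen the specified test case
    if steps is None:
        return {"error": "unknown test case"}
    # Generate a "test program" from the plain-language test steps.
    program = [f"step {i}: {s}" for i, s in enumerate(steps, 1)]
    if now == timing:                       # timed-test trigger
        return {"result": program}          # run and return the test result
    return {"scheduled": timing}
```
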
Those skilled in the art will appreciate that the structure shown in fig. 3 is only a block diagram of a part of the structure related to the present application, and does not constitute a limitation to the apparatus and the computer device to which the present application is applied.
An embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements an automated testing method, and specifically:
receiving an automatic test request submitted by a user, wherein the automatic test request carries user information and test case information;
analyzing the automated test request, extracting the user information, and judging whether the user has the test authority or not based on the user information;
if the user has the test permission, extracting the test case information from the automatic test request, and screening out a specified test case corresponding to the test case information from a pre-stored case database based on the test case information, wherein the specified test case is generated by writing based on a specified language, and the specified language does not include a programming language;
obtaining the test steps contained in the specified test case;
generating a test program corresponding to the specified test case according to the test step;
acquiring preset timing test time, acquiring current time, and judging whether the current time is the same as the timing test time;
and if the current time is the same as the timing test time, running the test program to generate a corresponding test result.
To sum up, according to the automated testing method, apparatus, computer device and storage medium provided in the embodiments of the present application, when an automated test request submitted by a user is received, whether the user has the test authority is verified based on the user information carried in the request. If the user has the test authority, a specified test case, written by a tester in a specified language that excludes programming languages, is quickly extracted from the preset case database based on the test case information carried in the request, and a test program that the automated testing apparatus can execute is then generated quickly and intelligently from the specified test case. A tester can therefore complete automated testing without mastering a complex programming language and can focus on the design of test cases, which effectively improves the efficiency of automated testing and reduces its cost. In addition, the test case data contained in the case database is written by testers in a specified language familiar to them, so the test intention of the user is embodied intuitively and clearly, which effectively improves the generation efficiency of test cases and the working efficiency of testers.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium provided herein and used in the examples may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), and direct Rambus dynamic RAM (DRDRAM).
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, apparatus, article, or method that includes the element.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. An automated testing method, comprising:
receiving an automatic test request submitted by a user, wherein the automatic test request carries user information and test case information;
analyzing the automated test request, extracting the user information, and judging whether the user has the test authority or not based on the user information;
if the user has the test permission, extracting the test case information from the automatic test request, and screening out a specified test case corresponding to the test case information from a pre-stored case database based on the test case information, wherein the specified test case is generated by writing based on a specified language, and the specified language does not include a programming language;
obtaining the test steps contained in the specified test case;
generating a test program corresponding to the specified test case according to the test step;
acquiring preset timing test time, acquiring current time, and judging whether the current time is the same as the timing test time;
and if the current time is the same as the timing test time, running the test program to generate a corresponding test result.
2. The automated testing method of claim 1, wherein the step of determining whether the user has the testing authority based on the user information comprises:
acquiring service operation information corresponding to the automatic test request;
matching the service operation information against a preset service operation safety level table, and judging whether the service operation safety level table contains specified service operation information that is the same as the service operation information;
if the designated service operation information exists in the service operation safety level table, screening out a safety level coefficient corresponding to the designated service operation information;
judging whether the safety level coefficient is larger than a preset safety level threshold value or not;
if the security level coefficient is larger than the security level threshold, performing identity authentication on the user according to a preset rule to obtain a corresponding identity authentication result;
if the identity authentication result is that the authentication is passed, acquiring a first service permission score corresponding to the user information from a preset user service permission score table based on the user information; and
acquiring a second service permission score corresponding to the specified service operation information;
judging whether the first service permission score is smaller than the second service permission score or not;
if the first service permission score is not smaller than the second service permission score, judging that the user has a test permission;
and if the first service permission score is smaller than the second service permission score, judging that the user does not have the test permission.
3. The automated testing method of claim 2, wherein the step of authenticating the user according to the preset rule to obtain the corresponding authentication result comprises:
acquiring a plurality of pre-stored questioning data;
displaying the question data based on the use record of the question data, and generating selection prompt information so that a user can select specified question data from all the question data;
after the user finishes selecting, generating answer reminding information so that the user can input feedback answer voice data corresponding to the specified question data;
recognizing the feedback answer voice data to obtain corresponding text information, and judging whether the text information is the same as preset correct answer data or not;
if the text information is the same as the correct answer data, acquiring the reply response time length of the user;
judging whether the response time length is greater than a normal response time length threshold value corresponding to the question data;
if the response time length is greater than the normal response time length threshold, extracting a corresponding voiceprint feature vector to be verified from the feedback answer voice data; and
obtaining a pre-stored authorized voiceprint feature vector corresponding to the correct answer data;
calculating the voiceprint similarity between the voiceprint feature vector to be verified and the authorized voiceprint feature vector;
judging whether the voiceprint similarity is greater than a preset similarity threshold value or not;
and if the voiceprint similarity is greater than the similarity threshold, generating an identity authentication result which passes the authentication, otherwise, generating an identity authentication result which does not pass the authentication.
4. The automated testing method of claim 1, wherein the test case information includes test case keywords, and the step of screening out the specified test case corresponding to the test case information from a pre-stored case database based on the test case information includes:
based on SQL query statements, querying the case database according to the test case keywords to obtain case files corresponding to the test case keywords;
copying the use case file to obtain a copied use case file copy;
and taking the use case file copy as the specified test case.
5. The automated testing method of claim 1, wherein the step of running the test program to generate the corresponding test results comprises:
running the test program, and obtaining the test duration of the test process of running the test program;
judging whether the test duration is smaller than a preset duration threshold or not;
if the test duration is less than the preset duration threshold, generating a corresponding test result;
and if the test duration is not less than the preset duration threshold, generating test feedback information of test failure.
6. The automated testing method of claim 1, wherein the step of running the test program to generate the corresponding test result if the current time is the same as the timing test time comprises:
acquiring a preset expected test result;
comparing the test result with the expected test result, and judging whether the test result meets the expected test result;
if the test result meets the expected test result, generating a comparison test result which passes the test;
and if the test result does not meet the expected test result, generating a comparison test result which fails the test.
7. The automated testing method of claim 6, wherein the step of running the test program to generate the corresponding test result if the current time is the same as the timing test time comprises:
acquiring an intermediate file generated in a test process of running the test program;
generating a corresponding test report according to the intermediate file, the test result and the comparison test result;
acquiring preset mail login information and acquiring a designated mail address corresponding to a designated user;
logging in a mail server according to the mail login information;
and sending the test report to the specified mail address through the mail server.
8. An automated testing apparatus, comprising:
the system comprises a receiving module, a test case processing module and a test execution module, wherein the receiving module is used for receiving an automatic test request submitted by a user, and the automatic test request carries user information and test case information;
the first judgment module is used for analyzing the automated test request, extracting the user information and judging whether the user has the test authority or not based on the user information;
the screening module is used for extracting the test case information from the automatic test request if the user has the test permission, and screening out a specified test case corresponding to the test case information from a pre-stored case database based on the test case information, wherein the specified test case is generated by compiling based on a specified language, and the specified language does not include a programming language;
the first acquisition module is used for acquiring the test steps contained in the specified test case;
the first generation module is used for generating a test program corresponding to the specified test case according to the test step;
the second acquisition module is used for acquiring preset timing test time, acquiring current time and judging whether the current time is the same as the timing test time or not;
and the second generation module is used for running the test program to generate a corresponding test result if the current time is the same as the timing test time.
9. A computer device comprising a memory and a processor, the memory having stored therein a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202011298508.6A 2020-11-18 2020-11-18 Automatic testing method and device, computer equipment and storage medium Pending CN112463599A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011298508.6A CN112463599A (en) 2020-11-18 2020-11-18 Automatic testing method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011298508.6A CN112463599A (en) 2020-11-18 2020-11-18 Automatic testing method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112463599A true CN112463599A (en) 2021-03-09

Family

ID=74836831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011298508.6A Pending CN112463599A (en) 2020-11-18 2020-11-18 Automatic testing method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112463599A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113076258A (en) * 2021-04-21 2021-07-06 中国移动通信集团陕西有限公司 Permission application method, device, equipment and readable storage medium
CN113076258B (en) * 2021-04-21 2023-09-19 中国移动通信集团陕西有限公司 Method, device and equipment for applying permission and readable storage medium
CN113392023A (en) * 2021-06-30 2021-09-14 展讯半导体(成都)有限公司 Automatic testing method and device, computer equipment, chip and module equipment
CN116501610A (en) * 2023-03-13 2023-07-28 深圳华锐分布式技术股份有限公司 Method, device, equipment and medium for testing market transaction system
CN116501610B (en) * 2023-03-13 2024-03-01 深圳华锐分布式技术股份有限公司 Method, device, equipment and medium for testing market transaction system

Similar Documents

Publication Publication Date Title
CN112463599A (en) Automatic testing method and device, computer equipment and storage medium
CN112527630B (en) Test case generation method, device, computer equipment and storage medium
CN112347310A (en) Event processing information query method and device, computer equipment and storage medium
WO2020082673A1 (en) Invoice inspection method and apparatus, computing device and storage medium
CN112464117A (en) Request processing method and device, computer equipment and storage medium
CN109359113B (en) Tax payment report checking method and device, storage medium and server
CN113742776A (en) Data verification method and device based on biological recognition technology and computer equipment
CN111797605A (en) Report generation method and device based on report template and computer equipment
CN113241138B (en) Medical event information extraction method and device, computer equipment and storage medium
CN113642039A (en) Configuration method and device of document template, computer equipment and storage medium
CN114817055A (en) Regression testing method and device based on interface, computer equipment and storage medium
CN112836061A (en) Intelligent recommendation method and device and computer equipment
CN114840387A (en) Micro-service monitoring method and device, computer equipment and storage medium
CN112036172A (en) Entity identification method and device based on abbreviated data of model and computer equipment
CN113986581A (en) Data aggregation processing method and device, computer equipment and storage medium
CN112637282B (en) Information pushing method and device, computer equipment and storage medium
JP6773678B2 (en) How to identify a user's interaction signature
CN114978968A (en) Micro-service anomaly detection method and device, computer equipment and storage medium
CN114090408A (en) Data monitoring and analyzing method and device, computer equipment and storage medium
CN113672654A (en) Data query method and device, computer equipment and storage medium
CN113282514A (en) Problem data processing method and device, computer equipment and storage medium
CN113327037A (en) Model-based risk identification method and device, computer equipment and storage medium
CN113051372A (en) Material data processing method and device, computer equipment and storage medium
CN113435990B (en) Certificate generation method and device based on rule engine and computer equipment
CN114036117A (en) Log viewing method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination