CN111813694B - Test method, test device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN111813694B
Authority
CN
China
Prior art keywords
performance test
input
requirement
target performance
information
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010804100.5A
Other languages
Chinese (zh)
Other versions
CN111813694A (en)
Inventor
钟瑞
郑重
刘明磊
陈壮壮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Application filed by Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202010804100.5A
Publication of CN111813694A
Application granted
Publication of CN111813694B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3664 Environments for testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F11/3692 Test management for test results analysis

Abstract

The present disclosure provides a test method, a test apparatus, an electronic device, and a readable storage medium. The test method comprises the following steps: displaying a requirement collection view, wherein the requirement collection view comprises options for different types of performance test requirements; determining a feature value to be input and an element scalar to be input of a target performance test requirement according to the type of the target performance test requirement selected by the user in the requirement collection view; generating a performance test case according to the feature value information and element scalar information input by the user for the feature value to be input and the element scalar to be input; and executing the performance test case to achieve the target performance test requirement.

Description

Test method, test device, electronic equipment and readable storage medium
Technical Field
The present disclosure relates to the field of software performance testing, and more particularly, to a performance testing method, a performance testing apparatus, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of internet technology and electronic commerce, clients' expectations for software performance keep rising, and so do the performance test requirements placed on software systems in the financial industry. In the testing of financial industry software systems, rapid-delivery practices such as agile and iterative development, together with ever-growing system scale, make full coverage of all test requirements impossible, so performance test requirements must be screened and prioritized.
In the process of implementing the disclosed concept, the inventors found at least the following problem in the related art: testing system performance with the related art is inefficient and wastes both time and labor.
Disclosure of Invention
In view of this, the present disclosure provides a performance testing method, a performance testing apparatus, an electronic device, and a computer-readable storage medium.
One aspect of the present disclosure provides a test method, which may include: displaying a requirement collection view, wherein the requirement collection view comprises options for different types of performance test requirements; determining a feature value to be input and an element scalar to be input of the target performance test requirement according to the type of the target performance test requirement selected by the user in the requirement collection view; generating a performance test case according to the feature value information and element scalar information input by the user for the feature value to be input and the element scalar to be input; and executing the performance test case to achieve the target performance test requirement.
According to an embodiment of the present disclosure, determining the feature value to be input and the element scalar to be input of the target performance test requirement according to the type of the target performance test requirement selected by the user in the requirement collection view includes: extracting, from an element repository according to the type of the target performance test requirement, a feature value to be input and an element scalar to be input that are associated with that type, wherein the element repository comprises test elements respectively corresponding to the different types of performance test requirements, and the test elements comprise feature value elements and element scalar elements; and generating a page input item comprising the feature value to be input and the element scalar to be input.
According to an embodiment of the present disclosure, the performance test method further includes: before the performance test case is generated according to the feature value information and element scalar information input by the user for the feature value to be input and the element scalar to be input, performing qualitative analysis on the target performance test requirement according to the feature value information to obtain a qualitative analysis result; and generating the performance test case in the case that the qualitative analysis result indicates that the target performance test requirement is allowed to be executed.
According to an embodiment of the present disclosure, the performance test method further includes: under the condition that qualitative analysis results indicate that the execution of the target performance test requirements is allowed, quantitatively analyzing the target performance test requirements according to the element scalar information to obtain quantitative analysis results; calculating a risk score of the target performance test requirement according to the quantitative analysis result; and determining the execution sequence of the performance test cases of the target performance test requirements according to the risk scores.
According to an embodiment of the present disclosure, performing a quantitative analysis on a target performance test requirement according to element scalar information, and obtaining a quantitative analysis result includes: and comparing and analyzing the element scalar information with actual production operation data in the production operation database to determine whether the element scalar information is reasonable.
According to an embodiment of the present disclosure, the performance test method further includes: after the performance test cases are executed, the execution results are fed back to the user.
According to an embodiment of the present disclosure, wherein presenting the demand collection view includes: acquiring type information of a login user; and in the case that the login user is determined to be the performance test demander according to the type information of the login user, displaying the demand collection view.
Another aspect of the present disclosure provides a test apparatus, which may include: a display module for displaying a requirement collection view, wherein the requirement collection view comprises options for different types of performance test requirements; a first determining module for determining a feature value to be input and an element scalar to be input of the target performance test requirement according to the type of the target performance test requirement selected by a user in the requirement collection view; a generating module for generating a performance test case according to the feature value information and element scalar information input by the user for the feature value to be input and the element scalar to be input; and an execution module for executing the performance test case to achieve the target performance test requirement.
According to an embodiment of the present disclosure, the first determining module includes: an extraction unit for extracting, from an element repository according to the type of the target performance test requirement, a feature value to be input and an element scalar to be input that are associated with that type, wherein the element repository comprises test elements respectively corresponding to the different types of performance test requirements, and the test elements comprise feature value elements and element scalar elements; and a generation unit for generating a page input item including the feature value to be input and the element scalar to be input.
According to an embodiment of the present disclosure, the performance test apparatus further includes: a qualitative analysis module for performing qualitative analysis on the target performance test requirement according to the feature value information, before the performance test case is generated from the feature value information and element scalar information input by the user for the feature value to be input and the element scalar to be input, so as to obtain a qualitative analysis result; and the generating module is configured to generate the performance test case in the case that the qualitative analysis result indicates that the target performance test requirement is allowed to be executed.
According to an embodiment of the present disclosure, the performance test apparatus further includes: the quantitative analysis module is used for quantitatively analyzing the target performance test requirement according to the element scalar information under the condition that the qualitative analysis result shows that the target performance test requirement is allowed to be executed, so as to obtain a quantitative analysis result; the calculation module is used for calculating a risk score of the target performance test requirement according to the quantitative analysis result; and the second determining module is used for determining the execution sequence of the performance test cases of the target performance test requirements according to the risk scores.
According to an embodiment of the present disclosure, the quantitative analysis module is configured to: and comparing and analyzing the element scalar information with actual production operation data in the production operation database to determine whether the element scalar information is reasonable.
According to an embodiment of the present disclosure, the performance test apparatus further includes: and the feedback module is used for feeding back the execution result to the user after the performance test case is executed.
According to an embodiment of the present disclosure, the display module includes: the acquisition unit is used for acquiring type information of the login user; and the display unit is used for displaying the requirement collection view under the condition that the login user is determined to be a performance test requirement party according to the type information of the login user.
Another aspect of the present disclosure provides an electronic device, comprising: one or more processors; and a memory for storing one or more instructions, wherein the one or more instructions, when executed by the one or more processors, cause the one or more processors to implement the method as described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions that, when executed, are configured to implement a method as above.
Another aspect of the present disclosure provides a computer program comprising computer executable instructions which when executed are for implementing a method as above.
According to the embodiments of the present disclosure, the feature value to be input and the element scalar to be input of the target performance test requirement are determined according to the type of the target performance test requirement selected by the user in the requirement collection view; a performance test case is generated according to the feature value information and element scalar information input by the user for the feature value to be input and the element scalar to be input; and the performance test case is executed to achieve the target performance test requirement. This unifies the channel and method for collecting performance test requirements, saves the communication cost of collecting requirements manually, and saves manpower; it avoids requirements being missed or wrongly included because different personnel apply different standards; the reasons for admitting or returning a requirement are clearer and can be explained to the requirement proposer, which keeps all parties informed and avoids misunderstanding; and the test cases are tightly associated with the test requirements, providing unified progress management, problem management and risk management and giving managers a concrete handle on the process.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following description of embodiments thereof with reference to the accompanying drawings in which:
FIG. 1 schematically illustrates an exemplary system architecture to which the performance testing method, testing apparatus, electronic device, and readable storage medium of the present disclosure may be applied;
FIG. 2 schematically illustrates a flow chart of a test method according to an embodiment of the disclosure;
FIG. 3 schematically illustrates a flow chart of determining feature values to be input and element scalars to be input for a target performance test requirement according to a type of target performance test requirement selected by a user in a requirement collection view, according to an embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow chart of a test method according to another embodiment of the present disclosure;
FIG. 5 schematically illustrates a flow chart in the case where qualitative analysis results indicate that target performance test requirements are allowed to be performed, in accordance with an embodiment of the present disclosure;
FIG. 6 schematically illustrates a flow chart showing a demand collection view, according to an embodiment of the present disclosure;
FIG. 7 schematically illustrates a flow chart of a performance testing method according to another embodiment of the present disclosure;
FIG. 8 schematically illustrates a block diagram of a test apparatus according to an embodiment of the disclosure;
FIG. 9 schematically illustrates a block diagram of a testing device according to another embodiment of the present disclosure; and
fig. 10 schematically illustrates a block diagram of a computer system suitable for implementing the above-described method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is only exemplary and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. In addition, in the following description, descriptions of well-known structures and techniques are omitted so as not to unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and/or the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It should be noted that the terms used herein should be construed to have meanings consistent with the context of the present specification and should not be construed in an idealized or overly formal manner.
Where an expression such as "at least one of A, B and C" is used, it should generally be interpreted in accordance with the meaning commonly understood by those skilled in the art (e.g., "a system having at least one of A, B and C" shall include, but not be limited to, a system having A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together). Where an expression such as "at least one of A, B or C" is used, it should likewise be interpreted in accordance with that ordinary understanding (e.g., "a system having at least one of A, B or C" would include, but not be limited to, a system having A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together).
In the course of realizing the concept of the present disclosure, the inventors found that performance test requirements come from many sources and that their feature values and elements are not uniform, so the requirement provider and the testing party often need multiple rounds of communication and confirmation, which is inefficient, wastes a large amount of manpower, and easily leads to information asymmetry. Performance test requirements are largely the result of human discussion and lack a quantifiable, interpretable decision process, so the admission result of a performance test requirement is not convincing to the requirement proposer and gives the test manager no quality-control handle. In addition, requirements and test cases are independent of each other with no association established, so submitted requirements cannot be tracked through the execution stages, and information remains opaque to the requirement proposer and the test manager.
The present disclosure provides a test method, a test apparatus, an electronic device, and a readable storage medium. The test method comprises the following steps: displaying a requirement collection view, wherein the requirement collection view comprises options for different types of performance test requirements; determining a feature value to be input and an element scalar to be input of the target performance test requirement according to the type of the target performance test requirement selected by the user in the requirement collection view; generating a performance test case according to the feature value information and element scalar information input by the user for the feature value to be input and the element scalar to be input; and executing the performance test case to achieve the target performance test requirement.
Fig. 1 schematically illustrates an exemplary system architecture 100 to which the performance testing method, testing apparatus, electronic device, and readable storage medium of the present disclosure may be applied. It should be noted that fig. 1 is only an example of a system architecture to which embodiments of the present disclosure may be applied to assist those skilled in the art in understanding the technical content of the present disclosure, but does not mean that embodiments of the present disclosure may not be used in other devices, systems, environments, or scenarios.
As shown in fig. 1, a system architecture 100 according to this embodiment may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired and/or wireless communication links, and the like.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications may be installed on the terminal devices 101, 102, 103, such as shopping class applications, web browser applications, search class applications, instant messaging tools, mailbox clients and/or social platform software, to name a few.
The terminal devices 101, 102, 103 may be a variety of electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablets, laptop and desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (by way of example only) providing support for websites browsed by users using the terminal devices 101, 102, 103. The background management server may analyze and process the received data such as the user request, and feed back the processing result (e.g., the web page, information, or data obtained or generated according to the user request) to the terminal device.
It should be noted that the performance test method provided by the embodiments of the present disclosure may be generally performed by the server 105. Accordingly, the performance testing apparatus provided by the embodiments of the present disclosure may be generally disposed in the server 105. The performance testing method provided by the embodiments of the present disclosure may also be performed by a server or a server cluster that is different from the server 105 and is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105. Accordingly, the performance testing apparatus provided by the embodiments of the present disclosure may also be provided in a server or a server cluster that is different from the server 105 and is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105. Alternatively, the performance test method provided by the embodiments of the present disclosure may be performed by the terminal device 101, 102, or 103, or by another terminal device other than the terminal devices 101, 102, and 103. Accordingly, the performance testing apparatus provided by the embodiments of the present disclosure may also be provided in the terminal device 101, 102, or 103, or in another terminal device different from the terminal devices 101, 102, and 103.
For example, a user enters a performance test requirement instance on any one of the terminal devices 101, 102, or 103 (for example, but not limited to, terminal device 101) through page entry, file import, or the like; the device displays the requirement collection view, determines the feature value to be input and the element scalar to be input of the target performance test requirement according to the type of the target performance test requirement selected by the user in the requirement collection view, generates a performance test case from the feature value information and element scalar information input by the user, and executes the performance test case to achieve the target performance test requirement.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Fig. 2 schematically illustrates a flow chart of a test method according to an embodiment of the disclosure.
It should be noted that, unless an execution order between different operations is explicitly stated or is required by the technical implementation, the operations in the embodiments of the present disclosure may be executed in a different order, and multiple operations may also be executed simultaneously.
As shown in fig. 2, the method includes operations S201 to S204.
In operation S201, a requirement collection view is displayed, wherein the requirement collection view includes options of different types of performance test requirements.
According to embodiments of the present disclosure, the requirement collection view may be an interface that the user sees after logging into the system, from which the user selects among different requirement types.
According to embodiments of the present disclosure, the options for performance test requirements may be different requirement types set for the test requirements of different users, for example new functions, flash-sale (seckill) activities, system upgrades, and so on.
In operation S202, a feature value to be input and an element scalar to be input of the target performance test requirement are determined according to the type of the target performance test requirement selected by the user in the requirement collection view.
According to embodiments of the present disclosure, the user may be, for example, a performance test demander.
According to embodiments of the present disclosure, the system may provide a relevant view of the requirement collection if the user is the performance test demander.
According to embodiments of the present disclosure, the target performance test requirement is one or more test requirement types to be tested that are provided by the user, and the inputs of the requirement elements are checked for validity.
According to an embodiment of the present disclosure, the feature value is an objective, categorical attribute in the performance test requirement classification, for example whether the system is newly launched, whether it is a customer-facing system, the transaction type, and so on.
According to an embodiment of the present disclosure, the element scalar is a graded, quantitative attribute used to rank the urgency of a requirement in the performance test requirement classification, for example the system level, the daily transaction volume, the number of users, and so on.
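The distinction between feature values (categorical attributes of a requirement) and element scalars (quantitative attributes) can be illustrated with a minimal data-structure sketch. The sketch below is written in Python purely for illustration; the field names and example values are assumptions and are not prescribed by the disclosure.

```python
# Minimal sketch of a performance test requirement instance.
# Field names and example values are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class PerformanceTestRequirement:
    requirement_type: str                                              # e.g. "flash-sale activity"
    feature_values: Dict[str, str] = field(default_factory=dict)       # categorical attributes
    element_scalars: Dict[str, float] = field(default_factory=dict)    # quantitative attributes

requirement = PerformanceTestRequirement(
    requirement_type="flash-sale activity",
    feature_values={
        "new_online_system": "no",     # whether the system is newly launched
        "customer_facing": "yes",      # whether it is a customer-facing system
        "transaction_type": "seckill",
    },
    element_scalars={
        "system_level": 1,             # e.g. a Class A system recorded as level 1
        "daily_transaction_volume": 2_000_000,
        "user_count": 100_000,
    },
)
print(requirement.requirement_type, requirement.element_scalars["user_count"])
```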
In operation S203, a performance test case is generated according to feature value information and element scalar information input by the user for the feature value to be input and the element scalar to be input.
According to an embodiment of the present disclosure, the performance test case is formed as follows: the user enters the feature value information and element scalar information into the system, for example through page entry or file import, to generate a performance test requirement instance; the performance test requirement elements are then converted into performance test case elements, which are combined into the performance test case.
In operation S204, performance test cases are executed to achieve the target performance test requirements.
According to embodiments of the present disclosure, executing a performance test case may associate the performance test requirement with the performance test case through performance test transformation parameters, where the transformation parameters computationally convert the requirement elements that are allowed to execute into performance test case elements, for example the test time, the response time, the number of concurrent users, and so on.
According to the embodiments of the present disclosure, the feature value to be input and the element scalar to be input of the target performance test requirement are determined according to the type of the target performance test requirement selected by the user in the requirement collection view; a performance test case is generated according to the feature value information and element scalar information input by the user; and the performance test case is executed to achieve the target performance test requirement. This unifies the channel and method for collecting performance test requirements, saves the communication cost of collecting requirements manually, and saves manpower; it avoids requirements being missed or wrongly included because different personnel apply different standards; the reasons for admitting or returning a requirement are clearer and can be explained to the requirement proposer, which keeps all parties informed and avoids misunderstanding; and the test scenarios are tightly associated with the test requirements, providing unified progress management, problem management and risk management and giving managers a concrete handle on the process.
The method illustrated in fig. 2 is further described below with reference to fig. 3 in conjunction with an exemplary embodiment.
FIG. 3 schematically illustrates a flow chart of determining feature values to be input and element scalars to be input for a target performance test requirement according to a type of target performance test requirement selected by a user in the requirement collection view, according to an embodiment of the present disclosure.
As shown in fig. 3, the method includes operations S301 to S302.
In operation S301, according to the type of the target performance test requirement, a feature value to be input and an element scalar to be input associated with the type of the target performance test requirement are extracted from an element repository, wherein the element repository includes test elements respectively corresponding to different types of performance test requirements, and the test elements include feature value elements and element scalar elements.
According to embodiments of the present disclosure, the element repository may include different test elements corresponding to all of the different demand types in the performance test demand.
In operation S302, a page input item including the feature value to be input and the element scalar to be input is generated.
According to embodiments of the present disclosure, the page input item may be a functional view of the different requirements available for selection by the performance test requester. For example, the feature value information may describe the performance test requirement of a new function or of a flash-sale (seckill) activity, and the element scalar information may include the number of users, the daily transaction volume, and the delivery time.
According to the embodiments of the present disclosure, by entering the target test requirement type, converting it into an element combination extracted from the element repository, and generating the elements of each requirement instance, the types and elements of performance test requirements are standardized during collection. This provides a standard vocabulary that all participants can understand and addresses the existing lack of control over requirement quality.
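As an illustration of the element-repository lookup described above, the sketch below assumes the repository is keyed by requirement type; the repository contents and the structure of the page input items are assumptions, not part of the disclosure.

```python
# Sketch: extract the feature values and element scalars associated with a requirement
# type from an element repository and turn them into page input items.
# The repository contents below are illustrative assumptions.
ELEMENT_REPOSITORY = {
    "flash-sale activity": {
        "feature_values": ["application_level", "activity_type", "customer_facing"],
        "element_scalars": ["user_count", "activity_transaction_volume", "planned_activity_time"],
    },
    "new function": {
        "feature_values": ["new_online_system", "transaction_type"],
        "element_scalars": ["daily_transaction_volume", "delivery_date"],
    },
}

def build_page_input_items(requirement_type: str) -> list:
    """Return one page input item per test element for the selected requirement type."""
    elements = ELEMENT_REPOSITORY[requirement_type]
    items = [{"name": name, "kind": "feature_value"} for name in elements["feature_values"]]
    items += [{"name": name, "kind": "element_scalar"} for name in elements["element_scalars"]]
    return items

print(build_page_input_items("flash-sale activity"))
```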
Fig. 4 schematically illustrates a flow chart of a test method according to another embodiment of the present disclosure.
As shown in fig. 4, the method includes operations S401 to S402.
In operation S401, before generating a performance test case according to the feature value information and the element scalar information input by the user for the feature value to be input and the element scalar to be input, qualitative analysis is performed on the target performance test requirement according to the feature value information, so as to obtain a qualitative analysis result.
According to the embodiments of the present disclosure, the qualitative analysis mainly reaches a qualitative conclusion on whether the performance test requirement is allowed to be executed, using a decision-tree method applied to the feature value information. For example, the feature value information may indicate whether the system is newly launched, whether it is a customer-facing system, whether processing is synchronous or asynchronous, and so on.
According to an embodiment of the present disclosure, suppose for example that the performance test is planned for the coming weekend; the requirement would normally enter the pending branch because the test time is insufficient, but because the application is a Class A system (high customer-facing risk) and the transaction is a flash-sale (seckill) transaction, the requirement passes the qualitative analysis module, enters the allowed-execution phase, and proceeds to subsequent testing.
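A decision-tree style qualitative analysis of the kind described above could look roughly like the sketch below; the specific branching rules and the seven-day threshold are assumptions used only to illustrate the weekend flash-sale example, not the decision tree actually used.

```python
# Illustrative decision-tree sketch for the qualitative analysis step.
# The rules and thresholds below are assumptions; the disclosure does not fix a specific tree.
def qualitative_analysis(feature_values: dict, days_until_planned_test: int) -> str:
    """Return 'allowed' or 'pending' for a requirement instance."""
    # A Class A (high customer-facing risk) system with flash-sale transactions is always tested.
    if (feature_values.get("application_level") == "A"
            and feature_values.get("transaction_type") == "seckill"):
        return "allowed"
    # Otherwise, insufficient preparation time sends the requirement to the pending branch.
    if days_until_planned_test < 7:
        return "pending"
    return "allowed"

print(qualitative_analysis({"application_level": "A", "transaction_type": "seckill"},
                           days_until_planned_test=2))   # -> 'allowed'
```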
In operation S402, in the case that the qualitative analysis result indicates that the target performance test requirement is allowed to be performed, a performance test case is generated.
According to embodiments of the present disclosure, the qualitative analysis result may indicate either that the target performance test requirement is allowed to be executed or that it is returned. For example, if the requirement is returned, it is sent back to the performance test requester together with the reason for the return.
According to the embodiments of the present disclosure, the qualitative analysis allows the performance test team to perform a preliminary screening of performance test requirements at the qualitative analysis stage, which reduces the review effort of the performance test team and saves labor cost.
The method illustrated in fig. 4 is further described below with reference to fig. 5 in conjunction with an exemplary embodiment.
Fig. 5 schematically illustrates a flow chart in the case where qualitative analysis results indicate that the target performance test requirements are allowed to be performed, according to an embodiment of the present disclosure.
As shown in fig. 5, the method includes operations S501 to S503.
In operation S501, in the case where the qualitative analysis result indicates that the execution of the target performance test requirement is permitted, the target performance test requirement is quantitatively analyzed according to the element scalar information, and a quantitative analysis result is obtained.
According to an embodiment of the present disclosure, the quantitative analysis determines whether the performance test requirement is allowed to be executed by quantitatively comparing the relevant element scalars in the requirement elements with the actual production operation data in the production operation database, so as to evaluate the reasonableness and importance of the requirement before it enters subsequent testing. The element scalars may be the number of users, the daily transaction volume, the data volume, the delivery date, and so on.
In operation S502, a risk score of the target performance test requirement is calculated according to the quantitative analysis result.
According to an embodiment of the present disclosure, the quantitative analysis result may likewise indicate either that the target performance test requirement is allowed to be executed or that it is returned. For example, if the requirement is returned, it is sent back to the performance test requester together with the reason for the return.
According to an embodiment of the present disclosure, calculating the risk score of the target performance test requirement may mean performing a risk assessment on the inputs of the entered requirement instance. In particular, the calculation may assign a weight to each element, with weights ranging from -10 to 10; the risk score of each requirement equals the sum of element scalar × weight over its elements. The elements may include the application risk level, the delivery date, the number of users, the daily transaction volume, and so on.
According to an embodiment of the present disclosure, for example, the risk score of the requirement for a Class A application system (high customer-facing risk) is calculated as 100 - 1000/(100 × 10 + 100000 × 0.1 + 2000000 × 0.01) = 99.7; the risk score of the requirement is recorded as 99.7 (on a percentile scale), and subsequent testing is performed.
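The weighted-sum rule stated above (risk score equals the sum of element scalar × weight, with weights in the range -10 to 10) could be sketched as follows. The particular weights and the final mapping onto a 0-100 scale are assumptions chosen to echo the worked example; they are not fixed by the disclosure.

```python
# Sketch of the risk-score rule: sum of element scalar x weight over the elements.
# The weights and the 0-100 normalization are illustrative assumptions.
WEIGHTS = {                          # allowed weight range: -10 to 10
    "application_risk_level": 10,
    "user_count": 0.1,
    "daily_transaction_volume": 0.01,
    "days_until_delivery": -1,       # more lead time lowers the score
}

def risk_score(element_scalars: dict) -> float:
    weighted_sum = sum(element_scalars.get(name, 0) * w for name, w in WEIGHTS.items())
    # Assumed normalization onto a percentile-style 0-100 scale, echoing the example above.
    return round(100 - 1000 / weighted_sum, 2) if weighted_sum > 0 else 0.0

print(risk_score({
    "application_risk_level": 100,
    "user_count": 100_000,
    "daily_transaction_volume": 2_000_000,
}))
```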
In operation S503, the execution order of the performance test cases of the target performance test requirements is determined according to the risk scores.
According to the embodiments of the present disclosure, the execution order is ranked by risk score based on the risk analysis results; according to the resources the performance test team can commit during evaluation, the performance test requirements with the highest risk scores are allowed to execute, and the rest are returned to the performance test requester together with the reasons for refusal.
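The prioritization step can then be sketched as sorting by risk score and cutting off at the capacity the team can commit; the capacity value below is an assumption.

```python
# Sketch: rank requirements by risk score and keep as many as the team can execute.
# 'capacity' is an illustrative assumption about available resources.
def prioritize(requirements_with_scores: list, capacity: int):
    """requirements_with_scores: list of (requirement_id, risk_score) tuples."""
    ranked = sorted(requirements_with_scores, key=lambda item: item[1], reverse=True)
    allowed = ranked[:capacity]
    returned = ranked[capacity:]      # returned to the requester with the refusal reason
    return allowed, returned

allowed, returned = prioritize([("REQ-1", 99.7), ("REQ-2", 63.0), ("REQ-3", 81.5)], capacity=2)
print(allowed)    # [('REQ-1', 99.7), ('REQ-3', 81.5)]
print(returned)   # [('REQ-2', 63.0)]
```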
According to the embodiments of the present disclosure, when the qualitative analysis result indicates that the target performance test requirement is allowed to be executed, the requirement is further analyzed automatically through quantitative analysis, using integrated expert experience and statistical analysis of data, as the basis for admitting the test requirement. This avoids requirements being missed or wrongly included because of inconsistent standards among different personnel. At the same time, the reasons for admitting or returning a requirement can be synchronized to all parties and explained conveniently.
According to an embodiment of the present disclosure, performing a quantitative analysis on a target performance test requirement according to element scalar information, and obtaining a quantitative analysis result includes: and comparing and analyzing the element scalar information with actual production operation data in the production operation database to determine whether the element scalar information is reasonable.
According to embodiments of the present disclosure, the production operation database may include operation data of the application systems synchronized from the production environment every half month.
According to the embodiments of the present disclosure, the actual production operation data in the production operation database may include indicators such as the daily average transaction volume, data volume, number of users, and delivery date; these are used on the one hand to check how far the key elements entered by users deviate from reality, and on the other hand to continuously adjust the comparative analysis model according to production operation conditions.
According to the embodiments of the present disclosure, for example, based on the statistics in the production operation database, the actual transaction volume and user count of the customer-facing high-risk system are compared with the number of users expected to join the activity and the activity transaction volume given in the performance test requirement instance. The estimated figures in the requirement instance are 5 times the current production figures, which is a fairly large deviation; however, because the transactions are flash-sale (seckill) transactions, an increase below 10 times is still within a reasonable range, so the requirement is not rejected outright. The analysis result is confirmed with the performance test requester, who confirms the change in transaction volume. The requirement then passes the quantitative comparison analysis module, is allowed to proceed to performance testing, and enters the risk assessment stage.
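The quantitative comparison against production data could be sketched as below. The 10x tolerance for flash-sale transactions follows the example above, while the production figures and the default 3x tolerance are assumptions.

```python
# Sketch: compare the projected element scalars of a requirement with actual production
# figures. Production figures and the default tolerance are illustrative assumptions;
# the 10x flash-sale tolerance follows the example in the text.
PRODUCTION_STATS = {"daily_transaction_volume": 400_000, "user_count": 20_000}  # assumed values

def quantitative_check(element_scalars: dict, transaction_type: str) -> bool:
    """Return True if the projected figures stay within a reasonable multiple of production data."""
    tolerance = 10 if transaction_type == "seckill" else 3
    for name, projected in element_scalars.items():
        actual = PRODUCTION_STATS.get(name)
        if actual and projected / actual >= tolerance:
            return False              # deviation too large: confirm with the requester or return
    return True

# A 5x increase on a flash-sale requirement stays within the 10x tolerance, so it passes.
print(quantitative_check({"daily_transaction_volume": 2_000_000, "user_count": 100_000}, "seckill"))
```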
According to an embodiment of the present disclosure, the above method further includes: after the performance test cases are executed, the execution results are fed back to the user.
According to the embodiments of the present disclosure, the performance test cases are obtained by converting the performance test requirements that are allowed to be executed into performance test case elements according to their element scalars, generating the corresponding requirement performance test cases after conversion, and outputting them to the performance test implementation stage. The elements of a performance test case may include online users, transactions, branches, throughput, response time, and so on.
According to an embodiment of the present disclosure, for example, if the daily average transaction volume of a transaction is 2,000,000 and the trading time is 9:00-17:00 on weekdays, the peak throughput can be estimated by assuming that 80% of the transaction volume is concentrated in 20% of the trading time, so the throughput the transaction must support is (2,000,000 × 80%) / (8 × 60 × 60 × 20%) ≈ 278 transactions per second.
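This peak-throughput estimate (80% of the volume in 20% of the trading time) can be written as a small helper; the 8-hour trading window is taken from the example, and the rest is a direct restatement of the formula above.

```python
# Sketch: convert a daily transaction volume into the peak throughput a test case must support,
# using the 80/20 estimate from the example above.
def peak_throughput(daily_volume: int, trading_hours: float,
                    peak_volume_share: float = 0.8, peak_time_share: float = 0.2) -> float:
    """Transactions per second if peak_volume_share of the volume falls in peak_time_share of the time."""
    peak_seconds = trading_hours * 3600 * peak_time_share
    return daily_volume * peak_volume_share / peak_seconds

print(round(peak_throughput(2_000_000, trading_hours=8)))   # -> 278
```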
According to the embodiments of the present disclosure, when the performance test case is executed, the performance test requirement has already been converted into, and associated with, the performance test case, so the progress, risks, and results during implementation can be fed back to the user.
According to the embodiments of the present disclosure, combining qualitative analysis with quantitative analysis reduces the effort the performance test team spends reviewing requirements and saves labor. Evaluating performance test requirements with codified expert rules avoids requirements being missed or wrongly included because of differing personal standards. In addition, the reasons for admission or return are clearer to the requirement proposer and can be explained, which keeps all parties informed and avoids misunderstanding.
The method illustrated in fig. 2 is further described below with reference to fig. 6 in conjunction with an exemplary embodiment.
Fig. 6 schematically illustrates a flow chart showing a demand collection view according to an embodiment of the present disclosure.
As shown in fig. 6, the method includes operations S601 to S602.
In operation S601, type information of a login user is acquired.
According to embodiments of the present disclosure, the user may be a performance test demander or, alternatively, a performance test team.
According to the embodiments of the present disclosure, the type information of the logged-in user can be obtained from user account information preconfigured in the system; when a user logs in with their own system account, the system determines whether the logged-in user is a performance test requester or a member of the performance test team.
According to embodiments of the present disclosure, if the user is a performance test requester, the system may show the requirement collection view; if the user belongs to the performance test team, the system may show a view indicating whether the requirements are allowed to execute, together with the replies to the requirements.
In operation S602, in a case where it is determined that the login user is a performance test demander according to the type information of the login user, a demand collection view is displayed.
According to the embodiments of the present disclosure, if the logged-in user is a performance test requester, the system provides a functional view of the business promotion class for the performance test requester to select from. The business promotion class may include new functions, flash-sale (seckill) activities, system upgrades, and so on.
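View routing by user type could be as simple as the sketch below; the user-type labels and view names are illustrative assumptions.

```python
# Sketch: choose the view to display based on the type of the logged-in user.
def select_view(user_type: str) -> str:
    if user_type == "performance_test_requester":
        return "requirement_collection_view"   # business-promotion requirement options
    if user_type == "performance_test_team":
        return "requirement_review_view"       # admission decisions and replies to requirements
    raise ValueError(f"unknown user type: {user_type}")

print(select_view("performance_test_requester"))
```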
Fig. 7 schematically illustrates a flow chart of a performance testing method according to another embodiment of the present disclosure.
As shown in fig. 7, the method includes operations S701 to S708.
In operation S701, a performance test requester logs in to the system and selects a performance test requirement of the business promotion class in the displayed requirement collection view.
In operation S702, the performance test requirement is submitted according to the input items provided for the requirement type, including the application level, the activity type, the planned activity time, the number of users participating in the activity, and the activity transaction volume.
In operation S703, after the requirement is submitted, qualitative analysis is performed by means of a decision tree. If the analysis does not pass, the requirement is returned to the performance test requester with the reason for refusal; if it passes, the requirement instance is output to the next operation.
In operation S704, the submitted requirement enters the comparative analysis model of the quantitative analysis, and the relevant element scalars are compared with the actual production operation data according to the statistics in the production operation database, so as to evaluate the reasonableness and importance of the requirement. If the check does not pass, the requirement is returned to the performance test requester with the reason; if it passes, the requirement is output to the next operation.
In operation S705, the requirement enters the quantitative risk assessment, and a risk score for the requirement is calculated from the element scalars in the requirement instance.
In operation S706, the risk scores calculated in the risk analysis are ranked, and the performance test team decides, according to the resources it can invest, whether each requirement is allowed to execute based on this quantitative priority indicator. High-priority requirements are allowed to execute; the remaining requirements are returned to the performance test requester with the reason.
In operation S707, the requirements allowed to execute are converted into performance test case elements according to their element scalars. The elements may include transactions, branches, throughput, number of users, response time, and so on.
In operation S708, the converted requirements are composed into performance test cases, which are output to the performance test implementation stage.
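Putting operations S701 to S708 together, the workflow could be wired up roughly as in the sketch below. It reuses the helper functions sketched earlier (qualitative_analysis, quantitative_check, risk_score, prioritize, peak_throughput) and is only an assumed orchestration, not an implementation of the disclosed system.

```python
# End-to-end sketch of operations S701-S708, assuming the helpers sketched earlier are in scope.
def process_requirements(requirement_instances: list, team_capacity: int) -> list:
    scored = []
    for req in requirement_instances:                                     # S701-S702: collected via the view
        if qualitative_analysis(req["feature_values"],
                                req["days_until_planned_test"]) != "allowed":
            continue                                                      # S703: returned with the refusal reason
        if not quantitative_check(req["element_scalars"],
                                  req["feature_values"].get("transaction_type", "")):
            continue                                                      # S704: unreasonable projections returned
        scored.append((req["id"], risk_score(req["element_scalars"])))    # S705: quantitative risk assessment
    allowed, _returned = prioritize(scored, team_capacity)                # S706: ranked admission
    test_cases = []
    for req_id, score in allowed:                                         # S707-S708: convert and output cases
        req = next(r for r in requirement_instances if r["id"] == req_id)
        test_cases.append({
            "requirement_id": req_id,
            "risk_score": score,
            "throughput": peak_throughput(req["element_scalars"]["daily_transaction_volume"],
                                          trading_hours=8),               # assumed trading window
        })
    return test_cases
```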
According to the embodiments of the present disclosure, the feature value to be input and the element scalar to be input of the target performance test requirement are determined according to the type of the target performance test requirement selected by the user in the requirement collection view; a performance test case is generated according to the feature value information and element scalar information input by the user; and the performance test case is executed to achieve the target performance test requirement. This unifies the channel and method for collecting performance test requirements, saves the communication cost of collecting requirements manually, and saves manpower; it avoids requirements being missed or wrongly included because different personnel apply different standards; the reasons for admitting or returning a requirement are clearer and can be explained to the requirement proposer, which keeps all parties informed and avoids misunderstanding; and the test scenarios are tightly associated with the test requirements, providing unified progress management, problem management and risk management and giving managers a concrete handle on the process.
Fig. 8 schematically illustrates a block diagram of a test apparatus according to an embodiment of the present disclosure.
As shown in fig. 8, the test apparatus 800 includes a presentation module 810, a first determination module 820, a generation module 830, and an execution module 840.
The display module 810 is configured to display a requirement collection view, where the requirement collection view includes options for different types of performance test requirements.
The first determining module 820 is configured to determine a feature value to be input and a scalar of an element to be input of the target performance test requirement according to a type of the target performance test requirement selected by a user in the requirement collection view.
The generating module 830 is configured to generate a performance test case according to feature value information and element scalar information input by a user for the feature value to be input and the element scalar to be input.
The execution module 840 is configured to execute the performance test cases to achieve the target performance test requirements.
According to the embodiments of the present disclosure, the feature value to be input and the element scalar to be input of the target performance test requirement are determined according to the type of the target performance test requirement selected by the user in the requirement collection view; a performance test case is generated according to the feature value information and element scalar information input by the user; and the performance test case is executed to achieve the target performance test requirement. This unifies the channel and method for collecting performance test requirements, saves the communication cost of collecting requirements manually, and saves manpower; it avoids requirements being missed or wrongly included because different personnel apply different standards; the reasons for admitting or returning a requirement are clearer and can be explained to the requirement proposer, which keeps all parties informed and avoids misunderstanding; and the test scenarios are tightly associated with the test requirements, providing unified progress management, problem management and risk management and giving managers a concrete handle on the process.
According to an embodiment of the present disclosure, the first determining module 820 includes: extraction unit, generating unit.
The extraction unit is used for extracting, from the element repository according to the type of the target performance test requirement, a feature value to be input and an element scalar to be input that are associated with that type, wherein the element repository comprises test elements respectively corresponding to the different types of performance test requirements, and the test elements comprise feature value elements and element scalar elements.
The generation unit is used for generating a page input item comprising the feature value to be input and the element scalar to be input.
Fig. 9 schematically illustrates a block diagram of a testing device according to another embodiment of the present disclosure.
As shown in fig. 9, the test apparatus 800' may further include a qualitative analysis module 850, a quantitative analysis module 860, a calculation module 870, a second determination module 880, and a feedback module 890 in addition to the presentation module 810, the first determination module 820, the generation module 830, and the execution module 840.
According to an embodiment of the present disclosure, the qualitative analysis module 850 is configured to perform qualitative analysis on the target performance test requirement according to the feature value information before generating the performance test case according to the feature value information and the element scalar information input by the user for the feature value to be input and the element scalar to be input, so as to obtain a qualitative analysis result; and a generation module 830 for generating a performance test case if the qualitative analysis results indicate that the target performance test requirements are allowed to be performed.
According to an embodiment of the present disclosure, the quantitative analysis module 860 is configured to quantitatively analyze the target performance test requirement according to the element scalar information to obtain a quantitative analysis result, where the qualitative analysis result indicates that the target performance test requirement is allowed to be executed.
According to an embodiment of the present disclosure, the calculation module 870 is configured to calculate a risk score for the target performance test requirement based on the quantitative analysis results.
According to an embodiment of the present disclosure, the second determining module 880 is configured to determine an execution order of the performance test cases of the target performance test requirement according to the risk score.
According to an embodiment of the present disclosure, the quantitative analysis module 860 is configured to compare the element scalar information with actual production operation data in the production operation database to determine whether the element scalar information is reasonable.
According to embodiments of the present disclosure, the feedback module 890 is configured to feedback the execution results to the user after the performance test cases are executed.
According to an embodiment of the present disclosure, wherein the presentation module 810 includes: the device comprises an acquisition unit and a display unit.
The acquisition unit is used for acquiring the type information of the login user.
The display unit is used for displaying the requirement collection view under the condition that the login user is determined to be a performance test requirement party according to the type information of the login user.
Any number of the modules, sub-modules, units, and sub-units according to embodiments of the present disclosure, or at least part of the functionality of any number of them, may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to embodiments of the present disclosure may be split into multiple modules for implementation. Any one or more of them may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system-on-chip, a system-on-substrate, a system-on-package, or an Application Specific Integrated Circuit (ASIC), or in any other reasonable manner of hardware or firmware that integrates or packages a circuit, or in any one of, or a suitable combination of, software, hardware, and firmware. Alternatively, one or more of the modules, sub-modules, units, and sub-units according to embodiments of the present disclosure may be at least partially implemented as computer program modules which, when executed, may perform the corresponding functions.
For example, any number of presentation module 810, first determination module 820, generation module 830, execution module 840, qualitative analysis module 850, quantitative analysis module 860, calculation module 870, second determination module 880, feedback module 890 may be combined in one module/unit/sub-unit to be implemented, or any one of the modules/units/sub-units may be split into multiple modules/units/sub-units. Alternatively, at least some of the functionality of one or more of these modules/units/sub-units may be combined with at least some of the functionality of other modules/units/sub-units and implemented in one module/unit/sub-unit. According to embodiments of the present disclosure, at least one of the presentation module 810, the first determination module 820, the generation module 830, the execution module 840, the qualitative analysis module 850, the quantitative analysis module 860, the calculation module 870, the second determination module 880, the feedback module 890 may be implemented at least in part as hardware circuitry, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging circuitry, or in any one of or a suitable combination of three of software, hardware, and firmware. Alternatively, at least one of the presentation module 810, the first determination module 820, the generation module 830, the execution module 840, the qualitative analysis module 850, the quantitative analysis module 860, the calculation module 870, the second determination module 880, the feedback module 890 may be at least partially implemented as a computer program module, which, when executed, may perform the corresponding functions.
It should be noted that, in the embodiments of the present disclosure, the performance testing apparatus corresponds to the performance testing method; for details of the apparatus, reference may be made to the description of the method, which is not repeated here.
Another aspect of the present disclosure provides an electronic device, comprising: one or more processors; and a memory for storing one or more instructions, wherein the one or more instructions, when executed by the one or more processors, cause the one or more processors to implement the method as described above.
The following describes an electronic device as an example of a computer system.
Fig. 10 schematically illustrates a block diagram of a computer system suitable for implementing the above-described method according to an embodiment of the present disclosure. The computer system illustrated in fig. 10 is merely an example and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 10, a computer system 1000 according to an embodiment of the present disclosure includes a processor 1001 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1002 or a program loaded from a storage section 1008 into a Random Access Memory (RAM) 1003. The processor 1001 may include, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or an associated chipset and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), or the like. The processor 1001 may also include on-board memory for caching purposes. The processor 1001 may include a single processing unit or multiple processing units for performing different actions of the method flows according to embodiments of the present disclosure.
In the RAM 1003, various programs and data required for the operation of the system 1000 are stored. The processor 1001, the ROM 1002, and the RAM 1003 are connected to each other by a bus 1004. The processor 1001 performs various operations of the method flow according to the embodiment of the present disclosure by executing programs in the ROM 1002 and/or the RAM 1003. Note that the program may be stored in one or more memories other than the ROM 1002 and the RAM 1003. The processor 1001 may also perform various operations of the method flow according to the embodiments of the present disclosure by executing programs stored in the one or more memories.
According to embodiments of the present disclosure, the system 1000 may also include an input/output (I/O) interface 1005, which is likewise connected to the bus 1004. The system 1000 may also include one or more of the following components connected to the I/O interface 1005: an input section 1006 including a keyboard, a mouse, and the like; an output section 1007 including a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD) display, a speaker, and the like; a storage section 1008 including a hard disk or the like; and a communication section 1009 including a network interface card such as a LAN card, a modem, or the like. The communication section 1009 performs communication processing via a network such as the Internet. A drive 1010 is also connected to the I/O interface 1005 as needed. A removable medium 1011, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 1010 as needed, so that a computer program read therefrom can be installed into the storage section 1008 as needed.
According to embodiments of the present disclosure, the method flows described above may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable storage medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section 1009, and/or installed from the removable medium 1011. When the computer program is executed by the processor 1001, the above-described functions defined in the system of the embodiments of the present disclosure are performed. According to embodiments of the present disclosure, the systems, devices, apparatuses, modules, units, etc. described above may be implemented by computer program modules.
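As a rough, non-authoritative sketch of how such a computer program module might implement the overall flow described in this disclosure — determining the inputs for a selected requirement type from an element warehouse, performing qualitative analysis on the feature value information, generating the performance test case, and executing it — consider the following Python fragment. The warehouse contents, the qualitative rule, and every identifier are assumptions made for illustration only.

```python
# Illustrative element warehouse: requirement type -> feature value elements and element scalar elements.
# The concrete element names are assumptions, not taken from the disclosure.
ELEMENT_WAREHOUSE = {
    "online_transaction": {
        "feature_values": ["transaction_type", "peak_period"],
        "element_scalars": ["target_tps", "concurrent_users"],
    },
    "batch_job": {
        "feature_values": ["job_name", "data_volume_level"],
        "element_scalars": ["record_count", "time_window_minutes"],
    },
}


def determine_inputs(requirement_type: str) -> dict:
    """Look up the feature values and element scalars to be input for the selected type."""
    return ELEMENT_WAREHOUSE[requirement_type]


def qualitative_analysis(feature_values: dict) -> bool:
    """Toy qualitative check: the requirement is allowed to be executed only if
    every feature value has been filled in. The real rule set is not specified here."""
    return all(v not in (None, "") for v in feature_values.values())


def generate_test_case(requirement_type: str, feature_values: dict, element_scalars: dict) -> dict:
    """Assemble a performance test case description from the collected inputs."""
    return {
        "type": requirement_type,
        "feature_values": feature_values,
        "element_scalars": element_scalars,
    }


def run_flow(requirement_type: str, feature_values: dict, element_scalars: dict, executor) -> None:
    """End-to-end sketch: qualitative analysis gates the generation and execution steps."""
    if not qualitative_analysis(feature_values):
        raise ValueError("qualitative analysis rejected the target performance test requirement")
    test_case = generate_test_case(requirement_type, feature_values, element_scalars)
    executor.execute(test_case)  # executor is an assumed collaborator that runs the case
```

Under these assumptions, a caller would invoke run_flow with the values collected from the page input items after the user has filled them in; the executor object stands in for whatever test-execution backend is used.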
The present disclosure also provides a computer-readable storage medium that may be embodied in the apparatus/device/system described in the above embodiments; or may exist alone without being assembled into the apparatus/device/system. The computer-readable storage medium carries one or more programs which, when executed, implement methods in accordance with embodiments of the present disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium. Examples may include, but are not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
For example, according to embodiments of the present disclosure, the computer-readable storage medium may include ROM 1002 and/or RAM 1003 and/or one or more memories other than ROM 1002 and RAM 1003 described above.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowcharts, and combinations of blocks in the block diagrams or flowcharts, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.

Those skilled in the art will appreciate that the features recited in the various embodiments of the present disclosure and/or in the claims may be combined in various ways, even if such combinations are not explicitly recited in the present disclosure. In particular, the features recited in the various embodiments of the present disclosure and/or in the claims may be combined without departing from the spirit and teachings of the present disclosure. All such combinations fall within the scope of the present disclosure.
The embodiments of the present disclosure are described above. However, these embodiments are provided for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that the measures in the different embodiments cannot be used advantageously in combination. The scope of the disclosure is defined by the appended claims and their equivalents. Various alternatives and modifications can be made by those skilled in the art without departing from the scope of the disclosure, and all such alternatives and modifications are intended to fall within the scope of the disclosure.

Claims (10)

1. A method of testing, comprising:
displaying a requirement collection view, wherein the requirement collection view comprises options of different types of performance test requirements;
determining a feature value to be input and an element scalar to be input of the target performance test requirement according to the type of the target performance test requirement selected by a user in the requirement collection view;
generating a performance test case according to feature value information and element scalar information which are input by the user for the feature value to be input and the element scalar to be input; and
executing the performance test case to achieve the target performance test requirement;
wherein the method further comprises:
before the performance test case is generated according to the feature value information and the element scalar information input by the user for the feature value to be input and the element scalar to be input, performing qualitative analysis on the target performance test requirement according to the feature value information to obtain a qualitative analysis result; and
generating the performance test case in the case that the qualitative analysis result indicates that the target performance test requirement is allowed to be executed.
2. The method of claim 1, wherein the determining the feature value to be input and the element scalar to be input for the target performance test requirement according to the type of target performance test requirement selected by the user in the requirement collection view comprises:
extracting a feature value to be input and an element scalar to be input which are associated with the type of the target performance test requirement from an element warehouse according to the type of the target performance test requirement, wherein the element warehouse comprises test elements respectively corresponding to different types of performance test requirements, and the test elements comprise feature value elements and element scalar elements; and
generating a page input item comprising the feature value to be input and the element scalar to be input.
3. The method of claim 1, further comprising:
under the condition that the qualitative analysis result shows that the target performance test requirement is allowed to be executed, quantitatively analyzing the target performance test requirement according to the element scalar information to obtain a quantitative analysis result; and
calculating a risk score of the target performance test requirement according to the quantitative analysis result; and
determining an execution sequence of the performance test cases of the target performance test requirement according to the risk score.
4. The method of claim 3, wherein the quantitatively analyzing the target performance test requirement according to the element scalar information to obtain a quantitative analysis result comprises:
comparing the element scalar information with actual production operation data in a production operation database to determine whether the element scalar information is reasonable.
5. The method of claim 1 or 2, further comprising:
feeding back an execution result to the user after the performance test case is executed.
6. The method of claim 1 or 2, wherein the displaying a requirement collection view comprises:
acquiring type information of a login user; and
displaying the requirement collection view under the condition that the login user is determined to be a performance test demander according to the type information of the login user.
7. A test apparatus comprising:
a display module for displaying a requirement collection view, wherein the requirement collection view comprises options of different types of performance test requirements;
a first determining module for determining a feature value to be input and an element scalar to be input of the target performance test requirement according to the type of the target performance test requirement selected by a user in the requirement collection view;
a generating module for generating a performance test case according to feature value information and element scalar information which are input by the user for the feature value to be input and the element scalar to be input; and
an execution module for executing the performance test case to achieve the target performance test requirement;
wherein the apparatus further comprises:
a qualitative analysis module for performing qualitative analysis on the target performance test requirement according to the feature value information, before the performance test case is generated according to the feature value information and the element scalar information input by the user for the feature value to be input and the element scalar to be input, to obtain a qualitative analysis result;
wherein the generating module is configured to generate the performance test case under the condition that the qualitative analysis result indicates that the target performance test requirement is allowed to be executed.
8. The apparatus of claim 7, further comprising:
a quantitative analysis module for quantitatively analyzing the target performance test requirement according to the element scalar information, under the condition that the qualitative analysis result indicates that the target performance test requirement is allowed to be executed, to obtain a quantitative analysis result;
a calculation module for calculating a risk score of the target performance test requirement according to the quantitative analysis result; and
a second determining module for determining an execution sequence of the performance test cases of the target performance test requirement according to the risk score.
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more instructions,
wherein the one or more instructions, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1 to 6.
10. A computer readable storage medium having stored thereon executable instructions which when executed by a processor cause the processor to implement the method of any of claims 1 to 6.
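Purely as an illustrative sketch of one way the quantitative analysis, risk scoring, and execution ordering recited in claims 3, 4, and 8 could be realized in Python (the comparison threshold, the scoring rule, and all identifiers below are assumptions introduced for this sketch and are not limitations of the claims):

```python
from dataclasses import dataclass


@dataclass
class PerformanceTestCase:
    name: str
    element_scalars: dict
    risk_score: float = 0.0


def quantitative_analysis(element_scalars: dict, production_stats: dict) -> dict:
    """Compare each element scalar with actual production operation data;
    'reasonable' is judged against an assumed 50% deviation threshold."""
    result = {}
    for key, value in element_scalars.items():
        baseline = production_stats.get(key)
        if not baseline:
            result[key] = {"deviation": None, "reasonable": False}
            continue
        deviation = abs(value - baseline) / baseline
        result[key] = {"deviation": deviation, "reasonable": deviation <= 0.5}
    return result


def risk_score(analysis: dict) -> float:
    """Toy scoring rule: mean deviation plus a penalty per unreasonable scalar."""
    deviations = [v["deviation"] for v in analysis.values() if v["deviation"] is not None]
    penalty = sum(1.0 for v in analysis.values() if not v["reasonable"])
    base = sum(deviations) / len(deviations) if deviations else 1.0
    return base + penalty


def order_by_risk(cases: list, production_stats: dict) -> list:
    """Determine an execution sequence; here higher-risk cases run first,
    though the disclosure does not fix the ordering direction."""
    for case in cases:
        case.risk_score = risk_score(
            quantitative_analysis(case.element_scalars, production_stats)
        )
    return sorted(cases, key=lambda c: c.risk_score, reverse=True)
```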
CN202010804100.5A 2020-08-11 2020-08-11 Test method, test device, electronic equipment and readable storage medium Active CN111813694B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010804100.5A CN111813694B (en) 2020-08-11 2020-08-11 Test method, test device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN111813694A CN111813694A (en) 2020-10-23
CN111813694B true CN111813694B (en) 2024-03-15

Family

ID=72858969

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010804100.5A Active CN111813694B (en) 2020-08-11 2020-08-11 Test method, test device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111813694B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493793A (en) * 2009-02-19 2009-07-29 深圳市紫金支点技术股份有限公司 ATM test method and device
CN102063376A (en) * 2011-02-16 2011-05-18 哈尔滨工程大学 Test case selection method
CN102591779A (en) * 2012-01-12 2012-07-18 王轶辰 Establishing method for workflow-based universal software testing process model
CN108446231A (en) * 2018-03-19 2018-08-24 重庆邮电大学 A kind of testing protocol consistency use-case priority ordering method based on risk analysis
CN108845933A (en) * 2018-05-24 2018-11-20 广东睿江云计算股份有限公司 The method and device that software test case is write and evaluated
CN110321284A (en) * 2019-06-03 2019-10-11 平安科技(深圳)有限公司 Test data input method, device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10740382B2 (en) * 2017-06-20 2020-08-11 International Business Machines Corporation Sentiment analysis as a quality indicator in usability testing


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant