CN116185815B - Software performance test simulation method and system - Google Patents

Software performance test simulation method and system

Info

Publication number
CN116185815B
Authority
CN
China
Prior art keywords
software
performance test
evaluation
credibility
acquiring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211438673.6A
Other languages
Chinese (zh)
Other versions
CN116185815A (en)
Inventor
谢耘
张运春
温胤鑫
李京华
张春林
刘玉连
章鹏
陈心航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Tongtech Co Ltd
Original Assignee
Beijing Tongtech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Tongtech Co Ltd filed Critical Beijing Tongtech Co Ltd
Priority to CN202211438673.6A priority Critical patent/CN116185815B/en
Publication of CN116185815A publication Critical patent/CN116185815A/en
Application granted granted Critical
Publication of CN116185815B publication Critical patent/CN116185815B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3664Environments for testing or debugging software
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application provides a software performance test simulation method and system, wherein the method comprises the following steps: step 1: predicting the software use extreme situation of the target software after it goes online; step 2: simulating and configuring a software performance test environment based on the software use extreme situation; step 3: comparing against a preset software performance test template, performing a software performance test on the target software in the software performance test environment to obtain a software performance test result; step 4: outputting the software performance test result. By predicting the software use extreme situation after the software goes online, configuring the software performance test environment by simulation based on it, and performing the software performance test in that environment, the method tests whether the software can meet the use requirement under the extreme situation it may actually encounter after going online, improving the pertinence and rationality of the test environment configuration and preventing the software performance test from being insufficient or excessive.

Description

Software performance test simulation method and system
Technical Field
The application relates to the technical field of software performance test, in particular to a software performance test simulation method and system.
Background
Currently, software performance tests [ for example: software stress tests, load tests, and the like ] are performed before software is brought online [ put on the shelf of an application store ]; the software is then debugged and optimized based on the test results to ensure its stable use after going online.
However, when performing a software performance test, a tester typically configures the software performance test environment according to his own experience [ for example: configuring it similarly to the test environments of comparable conventional software ], or directly configures it as an extreme environment [ for example: continuously applying the greatest pressure and the greatest load to the software ]. This lack of pertinence and rationality in the environment configuration may make the software performance test insufficient or excessive: an insufficient test may impair the software's stability after going online, while an excessive test wastes test resources as well as the resources spent debugging and optimizing the software after the test.
Thus, a solution is needed.
Disclosure of Invention
The application provides a software performance test simulation method and system for predicting the software use extreme situation after software goes online, configuring a software performance test environment by simulation based on that situation, and performing the software performance test in that environment. Whether the software can meet the use requirement is tested against the extreme situation it may encounter after going online, which improves the pertinence and rationality of the test environment configuration and prevents the software performance test from being insufficient or excessive.
The application provides a software performance test simulation method, which comprises the following steps:
step 1: predicting the software use extreme situation of the target software after being online;
step 2: simulating and configuring a software performance test environment based on the software use extreme conditions;
step 3: comparing with a preset software performance test template, and performing software performance test on the target software in the software performance test environment to obtain a software performance test result;
step 4: and outputting the software performance test result.
Preferably, the step 1: predicting software use extremes after target software is online, including:
acquiring software information of the target software;
generating a template based on a preset software requirement, and generating a software requirement according to the software information;
acquiring historical software use extreme conditions of other software which meets the software requirement after being historically online from a big data platform;
classifying the historical software use extreme cases to obtain case sets of a plurality of case types;
acquiring a preset condition screening rule corresponding to each condition type;
screening out target conditions from the condition sets corresponding to the condition types based on the condition screening rules;
and integrating each target situation to obtain the software use extreme situation after the target software is online.
Preferably, the step 2: based on the software use extreme, simulating a configuration software performance test environment, including:
acquiring a preset test environment simulation configuration file generation template corresponding to each situation type;
generating a template based on the test environment simulation configuration file, and generating a test environment simulation configuration file according to the target condition corresponding to the condition type;
acquiring a preset initial test environment;
based on each test environment simulation configuration file, carrying out one-to-one test environment simulation configuration on the initial test environment;
and taking the initial test environment after the test environment simulation configuration is completed as a software performance test environment.
Preferably, the step 3: comparing with a preset software performance test template, performing software performance test on the target software in the software performance test environment to obtain a software performance test result, wherein the software performance test result comprises:
parsing the software performance test template into a plurality of mutually corresponding groups of software performance test evaluation basis acquisition sub-templates and software performance test evaluation sub-templates;
comparing against each software performance test evaluation basis acquisition sub-template, acquiring the software performance test evaluation basis of the target software in the software performance test environment;
comparing the corresponding software performance test evaluation sub-templates, and performing software performance test evaluation according to the software performance test evaluation basis to obtain a software performance test evaluation result;
and integrating each software performance test evaluation result to obtain a software performance test result.
Preferably, the step 4: outputting the software performance test result, including:
acquiring a preset terminal node corresponding to the development group of the target software;
and delivering the software performance test result to the terminal node.
Preferably, the historical software use extreme case after the other software meeting the software requirement is historically online is obtained from the big data platform, which comprises the following steps:
acquiring information sources and receiving time stamps of the use extreme cases of the historical software from a big data platform;
acquiring historical first credibility of the information source corresponding to the receiving time stamp;
if the historical first credibility is greater than or equal to a preset historical credibility threshold, acquiring a source category of the information source corresponding to the big data platform, wherein the source category comprises: active and passive sources;
when the source category of the information source is an active source, acquiring the use extreme condition of the historical software;
and when the source type of the information source is a passive source, acquiring a second credibility of the information source corresponding to the passive intention of the history software use extreme case, and acquiring the history software use extreme case if the second credibility is greater than or equal to a preset credibility threshold value.
Preferably, obtaining the historical first confidence that the information source corresponds to the receiving timestamp includes:
acquiring a plurality of evaluation records of the big data platform for performing credible evaluation on the information source in preset time before and after the receiving time stamp;
inquiring a preset time difference weight table, and determining a time difference weight corresponding to the time difference between the record time stamp of each evaluation record and the receiving time stamp;
comparing a preset evaluation record evaluation determination template, performing evaluation determination on the evaluation record to obtain an evaluation value, and giving the time difference weight corresponding to the evaluation value to obtain a first target value;
and accumulating and calculating each first target value to obtain historical first credibility of the information source corresponding to the receiving time stamp.
Preferably, obtaining a second confidence level that the information source corresponds to a passive intention of the historical software usage extremity comprises:
resolving intent weights of a plurality of sub-intents in the passive intent;
querying a preset intention credibility determining experience library, and determining credibility determining experiences corresponding to each sub intention;
generating a template based on a preset credibility determination rule, and generating a credibility determination rule according to the credibility determination experience;
based on the credibility determining rule, carrying out credibility determination on the corresponding sub intention to obtain a determined value, and giving intention weight corresponding to the determined value to obtain a second target value;
and accumulating and calculating each second target value to obtain second credibility of the passive intention.
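The second-credibility steps above amount to a weighted accumulation over sub-intents: each sub-intent receives a determined value, which is weighted by its intent weight, and the weighted values are summed. A minimal sketch (the data shape and all concrete numbers are illustrative assumptions, not specified by the patent):

```python
# Sketch of the second-credibility computation described above.
# Each sub-intent of the passive intention contributes
# (intent weight) x (determined value); the products are accumulated.

def second_credibility(sub_intents):
    """sub_intents: list of (intent_weight, determined_value) pairs,
    one per sub-intent of the passive intention."""
    return sum(weight * value for weight, value in sub_intents)

# e.g. two hypothetical sub-intents, each weighted 0.5
print(second_credibility([(0.5, 95), (0.5, 80)]))  # 87.5
```

The acquired case would then be accepted when this value meets or exceeds the preset credibility threshold.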
The application provides a software performance test simulation system, which comprises:
the prediction module is used for predicting the software use extreme situation of the target software after being online;
the configuration module is used for simulating and configuring a software performance test environment based on the software use extreme conditions;
the test module is used for comparing with a preset software performance test template, and performing software performance test on the target software in the software performance test environment to obtain a software performance test result;
and the output module is used for outputting the software performance test result.
Preferably, the predicting module predicts an extreme case of software use after the target software is online, including:
acquiring software information of the target software;
generating a template based on a preset software requirement, and generating a software requirement according to the software information;
acquiring historical software use extreme conditions of other software which meets the software requirement after being historically online from a big data platform;
classifying the historical software use extreme cases to obtain case sets of a plurality of case types;
acquiring a preset condition screening rule corresponding to each condition type;
screening out target conditions from the condition sets corresponding to the condition types based on the condition screening rules;
and integrating each target situation to obtain the software use extreme situation after the target software is online.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
The technical scheme of the application is further described in detail through the drawings and the embodiments.
Drawings
The accompanying drawings are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate the application and together with the embodiments of the application, serve to explain the application. In the drawings:
FIG. 1 is a flow chart of a software performance test simulation method in an embodiment of the application;
FIG. 2 is a schematic diagram of a software performance test simulation system according to an embodiment of the present application.
Detailed Description
The preferred embodiments of the present application will be described below with reference to the accompanying drawings, it being understood that the preferred embodiments described herein are for illustration and explanation of the present application only, and are not intended to limit the present application.
The application provides a software performance test simulation method, as shown in figure 1, comprising the following steps:
step 1: predicting the software use extreme situation of the target software after being online;
step 2: simulating and configuring a software performance test environment based on the software use extreme conditions;
step 3: comparing with a preset software performance test template, and performing software performance test on the target software in the software performance test environment to obtain a software performance test result;
step 4: and outputting the software performance test result.
The working principle and the beneficial effects of the technical scheme are as follows:
software use extremes include: software-borne extreme traffic when multiple software users use the software after the software is online [ e.g.: maximum number of concurrent users, maximum load capacity, maximum number of function requests, etc.). And simulating and configuring a software performance test environment based on the software using extreme conditions, comparing the software performance test environment with a software performance test template, performing software performance test on target software in the software performance test environment, and outputting a software performance test result.
According to the method, the software use extreme situation after the software goes online is predicted, the software performance test environment is configured by simulation based on it, and the software performance test is carried out in that environment. Whether the software can meet the use requirement is thus tested against the extreme situation it may actually encounter after going online, which improves the pertinence and rationality of the test environment configuration and prevents the software performance test from being insufficient or excessive.
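The four-step flow can be sketched end to end as follows. Every function here is a hypothetical stand-in (the patent prescribes no implementation language or API), with toy values chosen only to show how the steps chain together:

```python
# Toy sketch of the four-step flow; all helpers and values are
# illustrative assumptions, not taken from the patent.

def predict_extreme_situation(software_info):
    # Step 1 stand-in: pretend prediction from historical data of similar software.
    return {"concurrent_users": 350, "load_mb": 512, "function_requests": 10_000}

def configure_test_environment(extreme_situation):
    # Step 2 stand-in: one configuration entry per situation type.
    return {f"cfg_{k}": v for k, v in extreme_situation.items()}

def run_performance_test(environment, template):
    # Step 3 stand-in: compare each configured value against the preset
    # template threshold and record pass/fail per situation type.
    return {k: environment[f"cfg_{k}"] >= v for k, v in template.items()}

def simulate(software_info, template):
    extreme = predict_extreme_situation(software_info)   # step 1
    env = configure_test_environment(extreme)            # step 2
    return run_performance_test(env, template)           # steps 3-4

print(simulate({"name": "target"}, {"concurrent_users": 300}))
# {'concurrent_users': True}
```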
In one embodiment, the step 1: predicting software use extremes after target software is online, including:
acquiring software information of the target software;
generating a template based on a preset software requirement, and generating a software requirement according to the software information;
acquiring historical software use extreme conditions of other software which meets the software requirement after being historically online from a big data platform;
classifying the historical software use extreme cases to obtain case sets of a plurality of case types;
acquiring a preset condition screening rule corresponding to each condition type;
screening out target conditions from the condition sets corresponding to the condition types based on the condition screening rules;
and integrating each target situation to obtain the software use extreme situation after the target software is online.
The working principle and the beneficial effects of the technical scheme are as follows:
the prediction of the software use extreme after the target software is online may be based on historical software use extreme after other software is historically online. However, due to different audience groups, popularization channels and the like of different software, the software use extreme conditions after the software is online are different. Therefore, according to the software information of the target software [ audience group, popularization channel, function distribution, application scene, etc. ], the software requirement [ for example: the information matching degree of audience crowd of other software and audience crowd of target software is more than or equal to 90%, the information matching degree of popularization channel of other software and popularization channel of target software is more than or equal to 95%, and the other software meets the software requirement. The prediction accuracy according to the acquired accuracy is improved to a great extent, and the prediction accuracy of the software use extreme case after the target software is on line is improved.
In addition, after the historical software use extreme cases are obtained, several cases of the same situation type [ number of concurrent users, load capacity, number of function requests, and the like ] may exist, so the cases need to be reasonably screened to ensure that only one historical extreme case remains per situation type. The historical cases are classified into a case set per situation type, and a preset situation screening rule corresponding to each situation type is introduced, which testers can set according to their needs; generally, to guarantee a sufficient software performance test, the screening rule selects the maximum value in the case set. The screened-out target situations are combined into the software use extreme situation of the target software after it goes online, completing the prediction and improving data processing efficiency.
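The classify-then-screen procedure can be sketched as below. The default "take the maximum" rule mirrors the suggestion in the text; the (type, value) data shape is an assumption for illustration:

```python
# Sketch of classifying historical extreme cases by situation type and
# screening one target case per type.

from collections import defaultdict

def screen_extreme_cases(historical_cases, screening_rules=None):
    """historical_cases: list of (situation_type, value) pairs."""
    # Classify the cases into one case set per situation type.
    case_sets = defaultdict(list)
    for situation_type, value in historical_cases:
        case_sets[situation_type].append(value)
    # Apply the screening rule per type (default: keep the maximum,
    # which guarantees a sufficient performance test).
    rules = screening_rules or {}
    return {t: rules.get(t, max)(values) for t, values in case_sets.items()}

cases = [("concurrent_users", 220), ("concurrent_users", 350),
         ("load_mb", 400), ("function_requests", 8000),
         ("load_mb", 512)]
print(screen_extreme_cases(cases))
# {'concurrent_users': 350, 'load_mb': 512, 'function_requests': 8000}
```

A tester could substitute a different rule per situation type, e.g. `{"load_mb": min}`, without changing the surrounding flow.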
In one embodiment, the step 2: based on the software use extreme, simulating a configuration software performance test environment, including:
acquiring a preset test environment simulation configuration file generation template corresponding to each situation type;
generating a template based on the test environment simulation configuration file, and generating a test environment simulation configuration file according to the target condition corresponding to the condition type;
acquiring a preset initial test environment;
based on each test environment simulation configuration file, carrying out one-to-one test environment simulation configuration on the initial test environment;
and taking the initial test environment after the test environment simulation configuration is completed as a software performance test environment.
The working principle and the beneficial effects of the technical scheme are as follows:
the test environment simulation configuration file generates a script file with a template as parameters to be set. Generating a template based on the test environment simulation configuration file, and generating a test environment simulation configuration file according to the target situation of the corresponding situation type [ for example: the scenario type is the number of concurrent users 350, the script file is a script simulating the same or different functions of the software used by a plurality of users, and the script is set as the script simulating the same or different functions of the software used by 350 users. The initial test environment is a test environment to be configured. Based on each test environment simulation configuration file, carrying out one-to-one test environment simulation configuration on the initial test environment, and loading scripts in the initial test environment. And taking the initial test environment after the test environment simulation configuration is completed as a software performance test environment. The efficiency of the simulation configuration of the software performance test environment is improved.
In one embodiment, the step 3: comparing with a preset software performance test template, performing software performance test on the target software in the software performance test environment to obtain a software performance test result, wherein the software performance test result comprises:
parsing the software performance test template into a plurality of mutually corresponding groups of software performance test evaluation basis acquisition sub-templates and software performance test evaluation sub-templates;
comparing against each software performance test evaluation basis acquisition sub-template, acquiring the software performance test evaluation basis of the target software in the software performance test environment;
comparing the corresponding software performance test evaluation sub-templates, and performing software performance test evaluation according to the software performance test evaluation basis to obtain a software performance test evaluation result;
and integrating each software performance test evaluation result to obtain a software performance test result.
The working principle and the beneficial effects of the technical scheme are as follows:
the software performance test evaluation corresponds to a software performance test item according to the obtained sub-template and the software performance test evaluation sub-template [ for example: the test item is a software throughput test, and the software performance test evaluation is based on the fact that the obtaining sub-template is the throughput of the obtaining target software in the software performance test environment, the software performance test evaluation sub-template is the throughput which is more than or equal to 100bps, and the evaluation value is 90 minutes. And performing software performance test based on the software performance test evaluation basis, namely acquiring a sub-template and a software performance test evaluation sub-template, wherein each software performance test evaluation result forms a software performance test result to complete the software performance test. And the testing efficiency of the software performance test is improved.
In one embodiment, the step 4: outputting the software performance test result, including:
acquiring a preset terminal node corresponding to the development group of the target software;
and delivering the software performance test result to the terminal node.
The working principle and the beneficial effects of the technical scheme are as follows:
the preset terminal node corresponding to the development group is a network node and is in communication butt joint with intelligent terminals [ mobile phones, PDA devices and the like ] used by developers in the development group. And delivering the software performance test result to the terminal node, and enabling a developer to check through the intelligent terminal. And the output efficiency of the software performance test result output is improved.
In one embodiment, acquiring historical software use extremes after other software meeting the software requirements has been historically brought online from a large data platform includes:
acquiring information sources and receiving time stamps of the use extreme cases of the historical software from a big data platform;
acquiring historical first credibility of the information source corresponding to the receiving time stamp;
if the historical first credibility is greater than or equal to a preset historical credibility threshold, acquiring a source category of the information source corresponding to the big data platform, wherein the source category comprises: active and passive sources;
when the source category of the information source is an active source, acquiring the use extreme condition of the historical software;
and when the source type of the information source is a passive source, acquiring a second credibility of the information source corresponding to the passive intention of the history software use extreme case, and acquiring the history software use extreme case if the second credibility is greater than or equal to a preset credibility threshold value.
The working principle and the beneficial effects of the technical scheme are as follows:
the big data platform is used for conveniently acquiring data, but the quality of the acquired data is not guaranteed. Therefore, quality verification is required in acquiring the history software use extreme cases. When the big data platform collects the use extreme cases of the historical software, the information sources of the big data platform (a software performance testing mechanism with CMA qualification, a software announcement website, an application store background and the like) and the receiving time stamp (the time for representing the use extreme cases of the historical software received by the big data platform) are recorded. The historical first trustworthiness of the information source corresponding to the receipt timestamp is obtained [ trustworthiness of the information source at the time the big data platform receives the historical software use extreme ]. If the first historical reliability is greater than or equal to the preset historical reliability threshold [ for example: 95, obtain information source corresponding to big data platform's source category [ information source provides history software use extreme case's category ], source category is divided into initiative source [ big data platform self-obtain, for example: crawl from software bulletin websites) and passive sources [ information sources are provided to big data platforms, for example: the software performance testing mechanism provides for a big data platform. When the source type is active source, the historical first credibility of the information source is enough, and the information source is directly acquired. 
When the source category is a passive source, the second credibility of the passive intention with which the information source provided the historical software use extreme case [ the intention behind the source actively providing the information ] is obtained; if the second credibility is greater than or equal to a preset credibility threshold [ for example: 90 ], the passive intention is credible and the case is acquired. Performing quality verification before acquiring the historical software use extreme cases improves the acquisition quality and the applicability of using the big data platform for data acquisition; moreover, applying different quality verification according to the category of the information source improves the comprehensiveness and rationality of the verification.
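The two-tier quality-verification gate can be sketched as below. The threshold values (95 and 90) follow the examples in the text; the record structure is an assumption:

```python
# Sketch of the acquisition gate for historical extreme cases pulled
# from the big data platform.

HISTORY_THRESHOLD = 95   # historical first-credibility threshold (example value)
INTENT_THRESHOLD = 90    # second-credibility threshold for passive sources

def should_acquire(record):
    if record["first_credibility"] < HISTORY_THRESHOLD:
        return False                     # source untrustworthy at receive time
    if record["source_category"] == "active":
        return True                      # platform fetched it itself: accept
    # Passive source: additionally verify the provider's intention.
    return record["second_credibility"] >= INTENT_THRESHOLD

print(should_acquire({"first_credibility": 97, "source_category": "active"}))   # True
print(should_acquire({"first_credibility": 97, "source_category": "passive",
                      "second_credibility": 85}))                               # False
```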
In one embodiment, obtaining a historical first confidence that the source of information corresponds to the receive timestamp comprises:
acquiring a plurality of evaluation records of the big data platform for performing credible evaluation on the information source in preset time before and after the receiving time stamp;
inquiring a preset time difference weight table, and determining a time difference weight corresponding to the time difference between the record time stamp of each evaluation record and the receiving time stamp;
comparing a preset evaluation record evaluation determination template, performing evaluation determination on the evaluation record to obtain an evaluation value, and giving the time difference weight corresponding to the evaluation value to obtain a first target value;
and accumulating and calculating each first target value to obtain historical first credibility of the information source corresponding to the receiving time stamp.
The working principle and the beneficial effects of the technical scheme are as follows:
The big data platform may perform a trusted evaluation of each information source at regular intervals (for example, a qualification-completeness evaluation), generating evaluation records. A plurality of evaluation records of trusted evaluations of the information source within a preset time before and after the receive timestamp (for example, within 3 days) are acquired. The preset time-difference weight table stores time-difference weights for different time differences: generally, when the time difference is negative, the smaller the time difference, the smaller the weight; when the time difference is positive, the larger the time difference, the smaller the weight, so that trusted evaluations made closer to the receipt of the historical software use extreme case count for more. Each evaluation record is evaluated against a preset evaluation-record determination template (for example: a qualification completeness of 80% or more gives an evaluation value of 100; a qualification completeness below 80% gives an evaluation value of 50) to obtain an evaluation value, and the corresponding time-difference weight is applied (the weighting formula is H = T·G, where H is the first target value, T is the time-difference weight, and G is the evaluation value) to obtain a first target value. The first target values are accumulated to obtain the historical first credibility (the accumulation formula is J = Σ_{i=1}^{n} H_i, where J is the historical first credibility, H_i is the i-th first target value, and n is the total number of first target values). Because the historical first credibility is determined comprehensively from the big data platform's trusted evaluations of the information source at times near the receive timestamp, both the accuracy and the efficiency of its determination are improved.
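The weighting H = T·G and the accumulation J = Σ H_i can be sketched as below. The 80% / 100-versus-50 evaluation template and the 3-day window follow the examples in the text, while the decay shape of the example weight function is a hypothetical stand-in for the preset time-difference weight table.

```python
def example_weight(diff_days: float) -> float:
    # Hypothetical time-difference weight table: largest near the receive
    # timestamp, decaying as the record moves away in either direction.
    return max(0.0, 1.0 - abs(diff_days) / 4.0)

def first_credibility(records, weight=example_weight, window_days=3):
    """records: (diff_days, qualification_completeness) pairs, where diff_days is
    the record timestamp minus the receive timestamp (negative = before receipt)."""
    total = 0.0
    for diff, completeness in records:
        if abs(diff) > window_days:
            continue                             # outside the preset time window
        t = weight(diff)                         # time-difference weight T
        g = 100 if completeness >= 0.80 else 50  # evaluation value G from the template
        total += t * g                           # H_i = T_i * G_i
    return total                                 # J = sum of the first target values
```

For records [(0, 0.9), (2, 0.7), (5, 0.95)] this yields 1.0·100 + 0.5·50 = 125; the third record falls outside the 3-day window and is ignored.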
In one embodiment, obtaining the second credibility of the information source corresponding to the passive intent of the historical software use extreme case comprises:
resolving intent weights of a plurality of sub-intents in the passive intent;
querying a preset intention credibility determining experience library, and determining credibility determining experiences corresponding to each sub intention;
generating a template based on a preset credibility determination rule, and generating a credibility determination rule according to the credibility determination experience;
based on the credibility determining rule, carrying out credibility determination on the corresponding sub intention to obtain a determined value, and giving intention weight corresponding to the determined value to obtain a second target value;
and accumulating and calculating each second target value to obtain second credibility of the passive intention.
The working principle and the beneficial effects of the technical scheme are as follows:
A passive intent includes multiple sub-intents (for example: actively providing data to the big data platform so as to make it easier to acquire data from the platform later, that is, to establish cooperation). The greater the intent weight, the higher the information source's propensity toward that sub-intent. The credibility-determination experience corresponding to a sub-intent is a method of determining whether the sub-intent is genuine (for example: verify whether the information source subsequently collects information from the big data platform; if so, the sub-intent is genuine and credible). Credibility-determination rules are generated from the credibility-determination experience (for example: count the number of times the information source collects information from the big data platform; the more times, the larger the determined value). Based on the credibility-determination rule, a credibility determination is performed on the corresponding sub-intent to obtain a determined value, and the corresponding intent weight is applied to obtain a second target value (the weighting has the same meaning as applying the time-difference weight). The second target values are accumulated to obtain the second credibility (the accumulation is the same as for the first target values). In general, the big data platform does not accept data from an information source unconditionally: the information source is required to state its intent, and the intent is verified for credibility, so that the trustworthiness of the historical software use extreme case can be determined and its reliability improved.
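A minimal sketch of the second-credibility accumulation follows, using the example rule "more subsequent collections from the platform → larger determined value". The linear rule shape and the cap of 10 collections are assumptions, standing in for a rule generated from the credibility-determination experience.

```python
def determined_value(collection_count: int, cap: int = 10) -> float:
    # Hypothetical rule generated from the experience: the more times the source
    # later collects information from the big data platform, the larger the value.
    return 100.0 * min(collection_count, cap) / cap

def second_credibility(sub_intents) -> float:
    """sub_intents: (intent_weight, collection_count) pairs, one per sub-intent.
    Weighted accumulation, mirroring the first-credibility formula J = sum of H_i."""
    return sum(w * determined_value(c) for w, c in sub_intents)
```

Two sub-intents weighted 0.6 and 0.4, with 10 and 5 subsequent collections respectively, give 0.6·100 + 0.4·50 = 80, which would fall just short of the example threshold of 90.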
The application provides a software performance test simulation system, as shown in FIG. 2, comprising:
the prediction module 1 is used for predicting the software use extreme situation of the target software after being online;
a configuration module 2, configured to simulate and configure a software performance test environment based on the software use extreme case;
the test module 3 is used for comparing with a preset software performance test template, and performing software performance test on the target software in the software performance test environment to obtain a software performance test result;
and the output module 4 is used for outputting the software performance test result.
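The four modules form a simple pipeline: prediction, configuration, test, output. The sketch below wires them together as interchangeable callables; all function shapes are hypothetical, since the patent specifies only the modules' responsibilities, not their interfaces.

```python
def run_simulation(target, predict, configure, run_test, output):
    """prediction module -> configuration module -> test module -> output module"""
    extreme = predict(target)       # software use extreme case after going online
    env = configure(extreme)        # simulated software performance test environment
    result = run_test(target, env)  # software performance test result
    return output(result)           # deliver the result (e.g. to a terminal node)
```

Each module can then be developed and replaced independently, which matches the one-module-per-step decomposition of the claimed system.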
In one embodiment, the predicting module 1 predicts an extreme case of software usage after the target software is online, including:
acquiring software information of the target software;
generating a template based on a preset software requirement, and generating a software requirement according to the software information;
acquiring historical software use extreme conditions of other software which meets the software requirement after being historically online from a big data platform;
classifying the historical software use extreme cases to obtain case sets of a plurality of case types;
acquiring a preset condition screening rule corresponding to each condition type;
screening out target conditions from the condition sets corresponding to the condition types based on the condition screening rules;
and integrating each target situation to obtain the software use extreme situation after the target software is online.
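The classify → screen → integrate steps above can be sketched as follows. The classifier and the per-type screening rules are supplied by the caller, since the text leaves their concrete form to preset rules; the example case types and `max` screening rule in the usage below are hypothetical.

```python
from collections import defaultdict

def predict_extreme_cases(history_cases, classify, screen_rules):
    """history_cases: raw historical software use extreme cases;
    classify(case) -> case type;
    screen_rules: {case type: rule(case set) -> target case}."""
    case_sets = defaultdict(list)
    for case in history_cases:
        case_sets[classify(case)].append(case)  # build a case set per case type
    # screen one target case out of each case set, then integrate the targets
    return [screen_rules[t](cases) for t, cases in case_sets.items()]
```

For example, classifying raw values into "request_rate" and "concurrent_users" types and screening each set with `max` yields one predicted extreme per type.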
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (5)

1. A software performance test simulation method, comprising:
step 1: predicting the software use extreme situation of the target software after being online;
step 2: simulating and configuring a software performance test environment based on the software use extreme conditions;
step 3: comparing with a preset software performance test template, and performing software performance test on the target software in the software performance test environment to obtain a software performance test result;
step 4: outputting the software performance test result;
the step 1: predicting software use extremes after target software is online, including:
acquiring software information of the target software;
generating a template based on a preset software requirement, and generating a software requirement according to the software information;
acquiring historical software use extreme conditions of other software which meets the software requirement after being historically online from a big data platform;
classifying the historical software use extreme cases to obtain case sets of a plurality of case types;
acquiring a preset condition screening rule corresponding to each condition type;
screening out target conditions from the condition sets corresponding to the condition types based on the condition screening rules;
integrating each target situation to obtain the software use extreme situation of the target software after being online;
acquiring historical software use extreme cases of other software meeting the software requirement after being historically online from a big data platform, comprising:
acquiring information sources and receiving time stamps of the use extreme cases of the historical software from a big data platform;
acquiring historical first credibility of the information source corresponding to the receiving time stamp;
if the historical first credibility is greater than or equal to a preset historical credibility threshold, acquiring a source category of the information source corresponding to the big data platform, wherein the source category comprises: active and passive sources;
when the source category of the information source is an active source, acquiring the use extreme condition of the historical software;
when the source category of the information source is a passive source, acquiring a second credibility of the information source corresponding to the passive intent of the historical software use extreme case, and acquiring the historical software use extreme case if the second credibility is greater than or equal to a preset credibility threshold;
obtaining the historical first credibility of the information source corresponding to the receive timestamp, comprising:
acquiring a plurality of evaluation records of the big data platform for performing credible evaluation on the information source in preset time before and after the receiving time stamp;
inquiring a preset time difference weight table, and determining a time difference weight corresponding to the time difference between the record time stamp of each evaluation record and the receiving time stamp;
comparing a preset evaluation record evaluation determination template, performing evaluation determination on the evaluation record to obtain an evaluation value, and giving the time difference weight corresponding to the evaluation value to obtain a first target value;
accumulating and calculating each first target value to obtain a historical first credibility of the information source corresponding to the receiving time stamp;
obtaining the second credibility of the information source corresponding to the passive intent of the historical software use extreme case, comprising:
resolving intent weights of a plurality of sub-intents in the passive intent;
querying a preset intention credibility determining experience library, and determining credibility determining experiences corresponding to each sub intention;
generating a template based on a preset credibility determination rule, and generating a credibility determination rule according to the credibility determination experience;
based on the credibility determining rule, carrying out credibility determination on the corresponding sub intention to obtain a determined value, and giving intention weight corresponding to the determined value to obtain a second target value;
and accumulating and calculating each second target value to obtain second credibility of the passive intention.
2. The method for simulating software performance test according to claim 1, wherein said step 2: based on the software use extreme, simulating a configuration software performance test environment, including:
acquiring a preset test environment simulation configuration file generation template corresponding to each situation type;
generating a template based on the test environment simulation configuration file, and generating a test environment simulation configuration file according to the target condition corresponding to the condition type;
acquiring a preset initial test environment;
based on each test environment simulation configuration file, carrying out one-to-one test environment simulation configuration on the initial test environment;
and taking the initial test environment after the test environment simulation configuration is completed as a software performance test environment.
3. A software performance test simulation method according to claim 1, wherein said step 3: comparing with a preset software performance test template, performing software performance test on the target software in the software performance test environment to obtain a software performance test result, wherein the software performance test result comprises:
analyzing the software performance test template into a plurality of mutually corresponding groups of software performance test evaluation basis acquisition sub-templates and software performance test evaluation sub-templates;
comparing against each software performance test evaluation basis acquisition sub-template, acquiring a software performance test evaluation basis of the target software in the software performance test environment;
comparing the corresponding software performance test evaluation sub-templates, and performing software performance test evaluation according to the software performance test evaluation basis to obtain a software performance test evaluation result;
and integrating each software performance test evaluation result to obtain a software performance test result.
4. The method for simulating software performance test according to claim 1, wherein said step 4: outputting the software performance test result, including:
acquiring a preset terminal node corresponding to the development group of the target software;
and delivering the software performance test result to the terminal node.
5. A software performance test simulation system, comprising:
the prediction module is used for predicting the software use extreme situation of the target software after being online;
the configuration module is used for simulating and configuring a software performance test environment based on the software use extreme conditions;
the test module is used for comparing with a preset software performance test template, and performing software performance test on the target software in the software performance test environment to obtain a software performance test result;
the output module is used for outputting the software performance test result;
the prediction module predicts the software use extreme condition of the target software after being on line, and comprises the following steps:
acquiring software information of the target software;
generating a template based on a preset software requirement, and generating a software requirement according to the software information;
acquiring historical software use extreme conditions of other software which meets the software requirement after being historically online from a big data platform;
classifying the historical software use extreme cases to obtain case sets of a plurality of case types;
acquiring a preset condition screening rule corresponding to each condition type;
screening out target conditions from the condition sets corresponding to the condition types based on the condition screening rules;
integrating each target situation to obtain the software use extreme situation of the target software after being online;
acquiring historical software use extreme cases of other software meeting the software requirement after being historically online from a big data platform, comprising:
acquiring information sources and receiving time stamps of the use extreme cases of the historical software from a big data platform;
acquiring historical first credibility of the information source corresponding to the receiving time stamp;
if the historical first credibility is greater than or equal to a preset historical credibility threshold, acquiring a source category of the information source corresponding to the big data platform, wherein the source category comprises: active and passive sources;
when the source category of the information source is an active source, acquiring the use extreme condition of the historical software;
when the source category of the information source is a passive source, acquiring a second credibility of the information source corresponding to the passive intent of the historical software use extreme case, and acquiring the historical software use extreme case if the second credibility is greater than or equal to a preset credibility threshold;
obtaining the historical first credibility of the information source corresponding to the receive timestamp, comprising:
acquiring a plurality of evaluation records of the big data platform for performing credible evaluation on the information source in preset time before and after the receiving time stamp;
inquiring a preset time difference weight table, and determining a time difference weight corresponding to the time difference between the record time stamp of each evaluation record and the receiving time stamp;
comparing a preset evaluation record evaluation determination template, performing evaluation determination on the evaluation record to obtain an evaluation value, and giving the time difference weight corresponding to the evaluation value to obtain a first target value;
accumulating and calculating each first target value to obtain a historical first credibility of the information source corresponding to the receiving time stamp;
obtaining the second credibility of the information source corresponding to the passive intent of the historical software use extreme case, comprising:
resolving intent weights of a plurality of sub-intents in the passive intent;
querying a preset intention credibility determining experience library, and determining credibility determining experiences corresponding to each sub intention;
generating a template based on a preset credibility determination rule, and generating a credibility determination rule according to the credibility determination experience;
based on the credibility determining rule, carrying out credibility determination on the corresponding sub intention to obtain a determined value, and giving intention weight corresponding to the determined value to obtain a second target value;
and accumulating and calculating each second target value to obtain second credibility of the passive intention.
CN202211438673.6A 2022-11-17 2022-11-17 Software performance test simulation method and system Active CN116185815B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211438673.6A CN116185815B (en) 2022-11-17 2022-11-17 Software performance test simulation method and system

Publications (2)

Publication Number Publication Date
CN116185815A CN116185815A (en) 2023-05-30
CN116185815B (en) 2023-12-08

Family

ID=86441080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211438673.6A Active CN116185815B (en) 2022-11-17 2022-11-17 Software performance test simulation method and system

Country Status (1)

Country Link
CN (1) CN116185815B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104765681A (en) * 2015-03-18 2015-07-08 株洲南车时代电气股份有限公司 Automated train drive control software testing system and method
CN110134591A (en) * 2019-04-19 2019-08-16 平安普惠企业管理有限公司 A kind of software method for testing pressure and system, electronic equipment
CN110826071A (en) * 2019-09-24 2020-02-21 平安科技(深圳)有限公司 Software vulnerability risk prediction method, device, equipment and storage medium
CN113934640A (en) * 2021-11-10 2022-01-14 合众新能源汽车有限公司 Method and system for automatically testing software
CN113988616A (en) * 2021-10-27 2022-01-28 上海倍通企业信用征信有限公司 Enterprise risk assessment system and method based on industry data
CN114168166A (en) * 2022-02-11 2022-03-11 杭州斯诺康技术有限公司 Installation configuration method and system of indoor intelligent wireless access equipment
CN114860617A (en) * 2022-07-06 2022-08-05 上海金仕达软件科技有限公司 Intelligent pressure testing method and system
CN114978944A (en) * 2022-05-13 2022-08-30 北京百度网讯科技有限公司 Pressure testing method, device and computer program product

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150106154A1 (en) * 2013-10-15 2015-04-16 Actimize Ltd. System and method for financial risk assessment
US10540265B2 (en) * 2016-06-30 2020-01-21 International Business Machines Corporation Using test workload run facts and problem discovery data as input for business analytics to determine test effectiveness




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant