WO2020233021A1 - Test result analysis method and related device based on intelligent decision-making (基于智能决策的测试结果分析方法及相关装置) - Google Patents

Test result analysis method and related device based on intelligent decision-making (基于智能决策的测试结果分析方法及相关装置)

Info

Publication number
WO2020233021A1
WO2020233021A1 (PCT/CN2019/118452)
Authority
WO
WIPO (PCT)
Prior art keywords
application
performance
defect
online
result
Prior art date
Application number
PCT/CN2019/118452
Other languages
English (en)
French (fr)
Inventor
姚宏志
Original Assignee
平安普惠企业管理有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 平安普惠企业管理有限公司
Publication of WO2020233021A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3692 Test management for test results analysis

Definitions

  • This application relates to the field of computer technology, and in particular to a test result analysis method and related devices based on intelligent decision-making.
  • Performance testing plays a very important role in the process of controlling application quality.
  • Generally, in the process of performance testing an application, relevant technical personnel analyze the performance test results of the application based on experience, and the analysis efficiency is relatively low. Therefore, how to improve the efficiency of analyzing performance test results has become an urgent problem to be solved.
  • The embodiments of the present application provide a test result analysis method and related devices based on intelligent decision-making, which can analyze performance test results automatically and intelligently, thereby improving the analysis efficiency of performance test results.
  • In a first aspect, the embodiments of the present application provide a test result analysis method based on intelligent decision-making, including:
  • obtaining test reference data of a first application, where the first application is an online application; the test reference data of the first application includes the performance test result of the first application, the performance test evaluation corresponding to the performance test result of the first application, and the performance defect feedback result after the first application goes online;
  • training a preset model with the test reference data of the first application to obtain a performance prediction model;
  • obtaining a performance test result of a second application, where the second application is an application that has not yet gone online;
  • analyzing the performance test result of the second application with the performance prediction model to obtain a target performance test evaluation corresponding to the performance test result of the second application and a performance defect prediction result after the second application goes online.
  • In a second aspect, an embodiment of the present application provides a test result analysis device based on intelligent decision-making, including:
  • an acquiring unit, configured to acquire test reference data of a first application, where the first application is an online application; the test reference data of the first application includes the performance test result of the first application, the performance test evaluation corresponding to the performance test result of the first application, and the performance defect feedback result after the first application goes online;
  • a processing unit, configured to train a preset model with the test reference data of the first application to obtain a performance prediction model;
  • the acquiring unit being further configured to obtain a performance test result of a second application, where the second application is an application that has not yet gone online;
  • the processing unit being further configured to analyze the performance test result of the second application with the performance prediction model to obtain a target performance test evaluation corresponding to the performance test result of the second application and a performance defect prediction result after the second application goes online.
  • In a third aspect, an embodiment of the present application provides an electronic device including a processor and a memory that are connected to each other, where the memory is used to store a computer program, and the computer program includes program instructions.
  • The processor is configured to call the program instructions to execute the method described in the first aspect.
  • In a fourth aspect, an embodiment of the present application provides a computer non-volatile readable storage medium that stores a computer program; the computer program includes program instructions which, when executed by a processor, cause the processor to execute the method described in the first aspect.
  • In summary, the electronic device can train a preset model with the test reference data of the first application to obtain a performance prediction model, and use the performance prediction model to analyze the performance test result of the second application to obtain the target performance test evaluation corresponding to the performance test result of the second application and the performance defect prediction result after the second application goes online, so that the performance test results of applications are analyzed automatically and intelligently, which improves the analysis efficiency of the performance test results.
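  • As a purely illustrative aid (not part of the application), the train-then-predict flow summarized above can be pictured with the following minimal Python sketch; the feature encoding, the scikit-learn decision tree used as the "preset model", and the composite label layout are all assumptions made for the example.

```python
from sklearn.tree import DecisionTreeClassifier

def train_performance_prediction_model(features, evaluations, defect_feedback):
    """Fit an assumed 'preset model' on the first (online) application's test
    reference data: numeric test results as features, plus a composite label
    that pairs the test evaluation with the post-launch defect feedback."""
    labels = [f"{e}/{d}" for e, d in zip(evaluations, defect_feedback)]
    model = DecisionTreeClassifier(random_state=0)
    model.fit(features, labels)
    return model

def analyze_second_application(model, second_app_features):
    """Predict the target evaluation and the post-launch defect prediction
    for an application that has not yet gone online."""
    return [tuple(label.split("/")) for label in model.predict(second_app_features)]

# Hypothetical reference data: [memory occupancy %, CPU %, response time ms]
reference_features = [[30, 40, 120], [85, 90, 900], [45, 50, 200]]
evaluations = ["pass", "fail", "pass"]
defect_feedback = ["no_defect", "defect", "no_defect"]

model = train_performance_prediction_model(reference_features, evaluations, defect_feedback)
print(analyze_second_application(model, [[80, 88, 850]]))  # e.g. [('fail', 'defect')]
```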
  • FIG. 1 is a schematic flowchart of a test result analysis method based on intelligent decision-making according to an embodiment of the present application
  • FIG. 2 is a schematic flowchart of another test result analysis method based on intelligent decision-making provided by an embodiment of the present application
  • FIG. 3 is a schematic structural diagram of a test result analysis device based on intelligent decision-making according to an embodiment of the present application
  • Fig. 4 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • Please refer to FIG. 1, which is a schematic flowchart of a test result analysis method based on intelligent decision-making provided by an embodiment of this application.
  • the method can be applied to an electronic device, and the electronic device can be a server or a terminal. Specifically, the method may include the following steps:
  • S101: Obtain test reference data of a first application, where the first application is an online application.
  • the first application may be one application or a collective term for multiple applications.
  • the test reference data of the first application includes the performance test result of the first application, the performance test evaluation corresponding to the performance test result of the first application, and the performance defect feedback result after the first application is online.
  • the performance test result of the first application is a test result of the performance test index of the first application.
  • For example, if the performance test indicator is memory occupancy rate, the test result for that indicator may be a memory occupancy rate of 30%.
  • When the first application is a single application, the first application has at least one performance test index, where "at least one" means one or more.
  • Correspondingly, the first application also has at least one performance test result.
  • When the first application is a collective term for multiple applications, each application in the first application has at least one performance test index.
  • The performance test indexes of different applications may be the same or different, depending on the actual test requirements.
  • Correspondingly, each application also has at least one performance test result.
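  • To make the relationship between applications, indicators, results, evaluation, and defect feedback concrete, the following sketch is illustrative only; the field names and the 30% memory occupancy value simply mirror the example in the text, and nothing about this layout is prescribed by the application.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class TestReferenceData:
    """Test reference data for one (online) application: at least one
    performance test result, the evaluation corresponding to the results,
    and the defect feedback observed after the application went online."""
    app_name: str
    # Indicator name -> measured result, e.g. a memory occupancy rate of 30%.
    test_results: Dict[str, float] = field(default_factory=dict)
    test_evaluation: str = "pass"        # could also be kept per indicator
    defect_feedback: str = "no_defect"   # observed after going online

# "First application" used as a collective term for multiple applications:
first_application: List[TestReferenceData] = [
    TestReferenceData("application A", {"memory_occupancy_pct": 30.0, "cpu_pct": 42.0}),
    TestReferenceData("application B", {"memory_occupancy_pct": 71.0}, "fail", "defect"),
]

for ref in first_application:
    print(ref.app_name, ref.test_results, ref.test_evaluation, ref.defect_feedback)
```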
  • In one embodiment, the performance test index of the first application may be an index used to perform a performance test on the first application in a preset test scenario.
  • For example, if the first application is application A, the performance test index of application A is an index for performing a performance test on application A based on test scenario 1.
  • Or, if the first application is application A and application B, the performance test index of application A is an index for performing a performance test on application A based on test scenario 1, and the performance test index of application B is an index for performing a performance test on application B based on test scenario 1; alternatively, the performance test index of application B may be an index for performing a performance test on application B based on test scenario 2.
  • Or, if the first application includes application A and application B, the test reference data of application A included in the test reference data of the first application is obtained after performing a performance test on application A based on test scenario 1, and the test reference data of application B included in the test reference data of the first application is obtained after performing a performance test on application B based on the corresponding test scenario (test scenario 1 or test scenario 2).
  • the performance test evaluation of the first application can be used to characterize whether the performance test of the first application passes.
  • In one embodiment, the performance test evaluation may be expressed in various forms, including but not limited to text and numbers.
  • In one embodiment, when the first application is a single application, the performance test evaluation of the first application may be a comprehensive evaluation obtained by comprehensively analyzing the at least one performance test result of the first application; or it may be an evaluation corresponding to each performance test result, obtained by analyzing each of the at least one performance test result of the first application separately.
  • When the first application is a collective term for multiple applications, the performance test evaluation of the first application may include a comprehensive evaluation of each application, obtained by comprehensively analyzing the at least one performance test result of that application; or it may include an evaluation corresponding to each performance test result of each application, obtained by analyzing each of the at least one performance test result of that application separately.
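  • For illustration only, the difference between per-result evaluations and a comprehensive evaluation might be expressed as below; the thresholds and the pass/fail wording are assumptions, not values taken from the application.

```python
# Assumed pass thresholds per performance test indicator (illustrative only).
THRESHOLDS = {"memory_occupancy_pct": 70.0, "cpu_pct": 80.0, "response_time_ms": 500.0}

def per_result_evaluations(test_results):
    """One evaluation per performance test result (the second option above)."""
    return {name: ("pass" if value <= THRESHOLDS.get(name, float("inf")) else "fail")
            for name, value in test_results.items()}

def comprehensive_evaluation(test_results):
    """A single evaluation obtained by analyzing all results together
    (the first option above): pass only if every indicator passes."""
    per_result = per_result_evaluations(test_results)
    return "pass" if all(v == "pass" for v in per_result.values()) else "fail"

results = {"memory_occupancy_pct": 30.0, "cpu_pct": 85.0}
print(per_result_evaluations(results))   # {'memory_occupancy_pct': 'pass', 'cpu_pct': 'fail'}
print(comprehensive_evaluation(results)) # fail
```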
  • the feedback result of the performance defect of the first application indicates whether the first application has a performance defect after it goes online.
  • the performance defect feedback result of the first application may include a corresponding defect category and/or a corresponding defect level.
  • The defect level may be expressed in forms including, but not limited to, numbers or text.
  • In one embodiment, when the electronic device stores the test reference data of the first application, the test reference data of the first application can be obtained locally.
  • Or, when the electronic device does not store the test reference data of the first application, the test reference data of the first application may be obtained from the server corresponding to the first application.
  • In one embodiment, obtaining the test reference data of the first application from the server corresponding to the first application may include: the electronic device receives the test reference data of the first application sent by the server corresponding to the first application; or, the electronic device sends a first data acquisition request (used to request the server corresponding to the first application to feed back the test reference data of the first application) to the server corresponding to the first application, and receives the performance defect feedback result of the first application sent by the server corresponding to the first application.
  • the server corresponding to the aforementioned first application may be an application server corresponding to the first application, or a test server corresponding to the first application.
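  • The local-or-remote acquisition just described could look roughly like the following sketch; the URL, endpoint path, cache file name, and JSON layout are hypothetical and are not defined by the application.

```python
import json
import os
import requests  # third-party HTTP client, used here only for illustration

CACHE_PATH = "first_app_reference_data.json"                         # hypothetical local store
SERVER_URL = "https://test-server.example.com/api/reference-data"    # hypothetical endpoint

def acquire_test_reference_data(app_id: str) -> dict:
    """Return locally stored reference data if present; otherwise send a
    'first data acquisition request' to the server corresponding to the
    application and cache the response."""
    if os.path.exists(CACHE_PATH):
        with open(CACHE_PATH, "r", encoding="utf-8") as f:
            return json.load(f)
    resp = requests.post(SERVER_URL, json={"app_id": app_id}, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    with open(CACHE_PATH, "w", encoding="utf-8") as f:
        json.dump(data, f)
    return data
```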
  • S102: Train a preset model with the test reference data of the first application to obtain a performance prediction model.
  • In this embodiment of the application, the electronic device may train the preset model with the test reference data of the first application to obtain the performance prediction model.
  • In one embodiment, this may include: the electronic device uses the test reference data of the first application as input data of the preset model to train the preset model, and uses the trained preset model as the performance prediction model.
  • In one embodiment, the preset model may be a convolutional neural network model, or another model such as a decision tree model.
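  • As one concrete but purely illustrative reading of this training step, the sketch below uses a scikit-learn decision tree as the "preset model"; the fixed indicator order, the dictionary layout of the reference data, and the two-label encoding are assumptions made for the example.

```python
from sklearn.multioutput import MultiOutputClassifier
from sklearn.tree import DecisionTreeClassifier

# Fixed indicator order so every application maps to the same feature vector
# (an assumption; the application does not prescribe a feature encoding).
INDICATORS = ["memory_occupancy_pct", "cpu_pct", "response_time_ms"]

def to_feature_vector(test_results: dict) -> list:
    return [test_results.get(name, 0.0) for name in INDICATORS]

def train_preset_model(reference_data: list):
    """reference_data: list of dicts with 'test_results', 'evaluation', and
    'defect_feedback' keys, i.e. the first application's test reference data."""
    X = [to_feature_vector(item["test_results"]) for item in reference_data]
    # Two labels per sample: the test evaluation and the post-launch defect feedback.
    Y = [[item["evaluation"], item["defect_feedback"]] for item in reference_data]
    model = MultiOutputClassifier(DecisionTreeClassifier(random_state=0))
    model.fit(X, Y)
    return model
```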
  • S103: Obtain a performance test result of a second application, where the second application is an application that has not yet gone online.
  • In this embodiment of the application, the electronic device can obtain the performance test result of the second application.
  • The performance test result of the second application is the test result of the performance test index of the second application.
  • In one embodiment, the performance test index of the second application may be an index used to perform a performance test on the second application in the preset test scenario.
  • In one embodiment, when the electronic device stores the performance test result of the second application, the performance test result of the second application can be obtained locally.
  • Or, when the electronic device does not store the performance test result of the second application, the performance test result of the second application can be obtained from the server corresponding to the second application.
  • In one embodiment, obtaining the test reference data of the second application from the server corresponding to the second application may include: the electronic device receives the performance test result of the second application sent by the server corresponding to the second application; or, the electronic device sends a second data acquisition request (used to request the server corresponding to the second application to feed back the performance test result of the second application) to the server corresponding to the second application, and receives the performance test result of the second application sent by the server corresponding to the second application.
  • In one embodiment, the server corresponding to the aforementioned second application may be an application server corresponding to the second application, or a test server corresponding to the second application.
  • S104: Analyze the performance test result of the second application with the performance prediction model to obtain a target performance test evaluation corresponding to the performance test result of the second application and a performance defect prediction result after the second application goes online.
  • In this embodiment of the application, the electronic device can use the performance prediction model to analyze the performance test result of the second application to obtain the target performance test evaluation corresponding to the performance test result of the second application and the performance defect prediction result after the second application goes online.
  • In one embodiment, this may include: the electronic device inputs the performance test result of the second application into the performance prediction model for analysis, and the performance prediction model outputs the target performance test evaluation corresponding to the performance test result of the second application and the performance defect prediction result after the second application goes online.
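  • Continuing the hypothetical training sketch given earlier (and reusing its assumed `to_feature_vector` helper and trained `model`), feeding the second application's performance test result into the model and unpacking the two outputs might look like this; it is a sketch under those assumptions, not the application's prescribed implementation.

```python
def analyze_second_application(model, second_app_results: dict) -> dict:
    """Feed the second application's performance test result into the trained
    performance prediction model and unpack the two predicted outputs."""
    features = [to_feature_vector(second_app_results)]  # helper from the training sketch above
    evaluation, defect_prediction = model.predict(features)[0]
    return {
        "target_performance_test_evaluation": evaluation,
        "performance_defect_prediction_after_launch": defect_prediction,
    }

# Hypothetical usage, assuming `model` was trained as sketched after step S102:
# analyze_second_application(model, {"memory_occupancy_pct": 80.0, "cpu_pct": 88.0})
```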
  • It can be seen that, in the embodiment shown in FIG. 1, the electronic device can train a preset model with the test reference data of the first application to obtain a performance prediction model, and use the performance prediction model to analyze the performance test result of the second application to obtain the target performance test evaluation corresponding to the performance test result of the second application and the performance defect prediction result after the second application goes online, so that the performance test results of applications are analyzed automatically and intelligently, thereby improving the analysis efficiency of the performance test results.
  • Please refer to FIG. 2, which is a schematic flowchart of another test result analysis method based on intelligent decision-making provided by an embodiment of this application.
  • the method can be applied to an electronic device, and the electronic device can be a server or a terminal. Specifically, the method may include the following steps:
  • S201: Obtain test reference data of a first application, where the first application is an online application.
  • S202: Train a preset model with the test reference data of the first application to obtain a performance prediction model.
  • S203: Obtain a performance test result of a second application, where the second application is an application that has not yet gone online.
  • S204: Analyze the performance test result of the second application with the performance prediction model to obtain a target performance test evaluation corresponding to the performance test result of the second application and a performance defect prediction result after the second application goes online.
  • For steps S201-S204, reference may be made to steps S101-S104 in the embodiment of FIG. 1; details are not repeated here.
  • S205: When the online duration of the second application exceeds a preset duration, obtain the performance defect feedback result after the second application goes online.
  • In this embodiment, in order to effectively monitor performance defects of the second application, the performance defect feedback result after the second application goes online can be obtained when the online duration of the second application exceeds the preset duration.
  • In one embodiment, obtaining the performance defect feedback result of the second application by the electronic device may include: the electronic device sends a data acquisition request to the server corresponding to the second application, where the data acquisition request is used to request the server corresponding to the second application to feed back the performance defect feedback result of the second application; and the electronic device receives the performance defect feedback result of the second application sent by the server corresponding to the second application.
  • In one embodiment, the electronic device may record the time at which the second application goes online and count the online duration of the second application from that time, so as to obtain the performance defect feedback result after the second application goes online once the online duration exceeds the preset duration.
  • In one embodiment, the server corresponding to the second application may record the time at which the second application goes online and count the online duration of the second application from that time, so that the electronic device can obtain the performance defect feedback result after the second application goes online once the online duration exceeds the preset duration.
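  • A minimal sketch of the timing logic described here (recording the go-live time and only requesting feedback once a preset duration has elapsed) is given below; the one-week duration and the print placeholder are assumptions for illustration.

```python
import time

PRESET_DURATION_SECONDS = 7 * 24 * 3600  # assumed: one week after go-live

class OnlineDurationTracker:
    def __init__(self):
        self.launch_times = {}  # app_id -> recorded go-live timestamp

    def record_launch(self, app_id: str) -> None:
        self.launch_times[app_id] = time.time()

    def feedback_due(self, app_id: str) -> bool:
        """True once the application's online duration exceeds the preset duration."""
        launched = self.launch_times.get(app_id)
        return launched is not None and time.time() - launched > PRESET_DURATION_SECONDS

tracker = OnlineDurationTracker()
tracker.record_launch("second_application")
if tracker.feedback_due("second_application"):
    print("request the performance defect feedback result from the server")
```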
  • S206: Determine whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online.
  • S207: When the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online, train the performance prediction model with the test reference data of the second application to correct the performance prediction model.
  • In this embodiment of the application, the electronic device can determine whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online; when they are inconsistent, the electronic device can train the performance prediction model with the test reference data of the second application to correct the performance prediction model.
  • By correcting the performance prediction model, the embodiment of the application optimizes the performance prediction model, which helps improve the accuracy of analyzing application performance test results.
  • the test reference data of the second application includes: the performance test result of the second application, the performance test evaluation corresponding to the performance test result of the second application, and the performance defect feedback result after the second application is online.
  • In one embodiment, determining whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online may include: if the performance defect prediction result indicates that the second application will have a performance defect after going online, and the performance defect feedback result indicates that the second application has no performance defect after going online, the electronic device determines that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online; or, if the performance defect prediction result indicates that the second application will not have a performance defect after going online, and the performance defect feedback result indicates that the second application has a performance defect after going online, the electronic device determines that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • For example, the performance defect prediction result may be "pass" or "fail", or a number indicating whether the test passes, such as 0 or 1, where 0 means fail and 1 means pass.
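  • For illustration, the consistency check with the pass/fail (or 0/1) encodings mentioned in the example above can be written as a short helper; the mapping is just the example encoding, not a definition from the application.

```python
def normalize(result):
    """Map the 0/1 numeric encoding onto the textual one (1 = pass, 0 = fail)."""
    mapping = {0: "fail", 1: "pass", "fail": "fail", "pass": "pass"}
    return mapping[result]

def prediction_matches_feedback(defect_prediction, defect_feedback) -> bool:
    """Consistent only when the predicted outcome equals the observed outcome."""
    return normalize(defect_prediction) == normalize(defect_feedback)

print(prediction_matches_feedback(1, "pass"))  # True  -> consistent
print(prediction_matches_feedback("pass", 0))  # False -> inconsistent, correct the model
```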
  • In one embodiment, the performance defect prediction result after the second application goes online includes a first defect category, and the performance defect feedback result after the second application goes online includes a second defect category.
  • In this case, determining whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online includes: the electronic device determines whether the first defect category is the same as the second defect category; when the first defect category is different from the second defect category, the electronic device determines that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • the first defect category refers to a predicted defect category
  • the second defect category refers to a feedback defect category.
  • In one embodiment, the electronic device can establish a correspondence between performance test indicators and defect categories; based on this correspondence, the electronic device can query the defect categories corresponding to the performance test indicators that correspond to the performance test results of the second application, determine whether the queried defect categories include the second defect category, and, when the queried defect categories include the second defect category, trigger the step of determining whether the first defect category is the same as the second defect category.
  • Alternatively, the electronic device queries whether a preset defect category set includes the second defect category, where the defect category set includes a plurality of preset defect categories; when the defect category set includes the second defect category, the electronic device triggers the step of determining whether the first defect category is the same as the second defect category.
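  • The two gating strategies just described (an indicator-to-category correspondence, or a preset defect category set) might be sketched as follows; the category names and the correspondence table are invented for the example and are not taken from the application.

```python
# Hypothetical correspondence between performance test indicators and defect categories.
INDICATOR_TO_CATEGORY = {
    "memory_occupancy_pct": "memory_leak",
    "cpu_pct": "cpu_overload",
    "response_time_ms": "slow_response",
}
# Hypothetical preset defect category set (the alternative strategy).
DEFECT_CATEGORY_SET = {"memory_leak", "cpu_overload", "slow_response", "crash"}

def should_compare_categories(second_app_results: dict, feedback_category: str,
                              use_category_set: bool = False) -> bool:
    """Only trigger the first-vs-second defect category comparison when the
    feedback category is covered by the queried categories (or the preset set)."""
    if use_category_set:
        return feedback_category in DEFECT_CATEGORY_SET
    queried = {INDICATOR_TO_CATEGORY[name]
               for name in second_app_results if name in INDICATOR_TO_CATEGORY}
    return feedback_category in queried

results = {"memory_occupancy_pct": 80.0}
if should_compare_categories(results, "memory_leak"):
    print("compare the predicted defect category with the feedback defect category")
```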
  • In one embodiment, the performance defect prediction result after the second application goes online may include first defect description information, and the performance defect feedback result after the second application goes online may include second defect description information. In this case, determining whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online includes: the electronic device determines whether the first defect description information is the same as the second defect description information; when they are different, the electronic device determines that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • The first defect description information refers to predicted defect description information, and the second defect description information refers to feedback defect description information.
  • Each defect category can correspond to at least one piece of defect description information.
  • In one embodiment, the performance defect prediction result after the second application goes online includes a first defect level, and the performance defect feedback result after the second application goes online includes a second defect level.
  • In this case, determining whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online includes: determining whether the first defect level is the same as the second defect level; when the first defect level is different from the second defect level, determining that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • For example, the defect levels may include level one, level two, and level three, where the severity of the performance defect indicated by level one is lower than that indicated by level two, and the severity indicated by level two is lower than that indicated by level three.
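  • The level comparison works the same way as the category comparison; a tiny illustrative helper using the three-level example from the text is shown below (the enum names are assumed).

```python
from enum import IntEnum

class DefectLevel(IntEnum):
    LEVEL_1 = 1  # least severe
    LEVEL_2 = 2
    LEVEL_3 = 3  # most severe

def levels_consistent(predicted: DefectLevel, feedback: DefectLevel) -> bool:
    """Inconsistent (and therefore a trigger for model correction) when the
    predicted defect level differs from the level reported after go-live."""
    return predicted == feedback

print(levels_consistent(DefectLevel.LEVEL_1, DefectLevel.LEVEL_3))  # False
```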
  • In one embodiment, the electronic device can also use the test reference data of the first application and the test reference data of the second application to train a specified model and obtain a new performance prediction model for the subsequent analysis of application performance test results.
  • In one embodiment, the specified model is the aforementioned preset model.
  • It can be seen that, in the embodiment shown in FIG. 2, the electronic device can obtain the performance defect feedback result of the second application and determine whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online; when they are inconsistent, the electronic device can train the performance prediction model with the test reference data of the second application to correct the performance prediction model, thereby improving the prediction accuracy of the model.
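  • One hedged way to picture the correction step (refitting a model on the combined test reference data of the first and second applications once the post-launch feedback is known) is sketched below; it reuses the assumed `train_preset_model` helper from the earlier training sketch and is not the application's prescribed procedure.

```python
def correct_performance_prediction_model(first_app_data: list, second_app_data: list):
    """Refit on the combined test reference data of the first and second
    applications; `train_preset_model` is the assumed helper defined in the
    earlier training sketch."""
    return train_preset_model(first_app_data + second_app_data)

# Hypothetical usage once the second application's post-launch feedback is available:
# model = correct_performance_prediction_model(first_app_reference, second_app_reference)
```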
  • Please refer to FIG. 3, which is a schematic structural diagram of a test result analysis device based on intelligent decision-making provided by an embodiment of this application; the device can be applied to an electronic device.
  • the device may include:
  • the acquiring unit 31 is configured to acquire test reference data of a first application, where the first application is an online application; the test reference data of the first application includes the performance test result of the first application, the performance test evaluation corresponding to the performance test result of the first application, and the performance defect feedback result after the first application goes online;
  • the processing unit 32 is configured to train a preset model with the test reference data of the first application to obtain a performance prediction model;
  • the acquiring unit 31 is further configured to obtain a performance test result of a second application, where the second application is an application that has not yet gone online;
  • the processing unit 32 is further configured to analyze the performance test result of the second application with the performance prediction model to obtain a target performance test evaluation corresponding to the performance test result of the second application and a performance defect prediction result after the second application goes online.
  • the acquiring unit 31 is further configured to acquire the performance defect feedback result after the second application is online when the online duration of the second application exceeds a preset duration.
  • The processing unit 32 is further configured to determine whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online, and, when the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online, to train the performance prediction model with the test reference data of the second application so as to correct the performance prediction model; the test reference data of the second application includes: the performance test result of the second application, the performance test evaluation corresponding to the performance test result of the second application, and the performance defect feedback result after the second application goes online.
  • In an optional implementation, the processing unit 32 determines whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online, specifically by: if the performance defect prediction result indicates that the second application will have a performance defect after going online and the performance defect feedback result indicates that the second application has no performance defect after going online, determining that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online; or, if the performance defect prediction result indicates that the second application will not have a performance defect after going online and the performance defect feedback result indicates that the second application has a performance defect after going online, determining that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • In an optional implementation, the performance defect prediction result after the second application goes online includes a first defect category, and the performance defect feedback result after the second application goes online includes a second defect category.
  • The processing unit 32 determines whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online, specifically by determining whether the first defect category is the same as the second defect category, and, when the first defect category is different from the second defect category, determining that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • the processing unit 32 is further configured to query whether the preset defect category set includes the second defect category; the defect category set includes multiple preset defect categories; When the defect category set includes the second defect category, the operation of determining whether the first defect category and the second defect category are the same is triggered.
  • In an optional implementation, the performance defect prediction result after the second application goes online includes a first defect level, and the performance defect feedback result after the second application goes online includes a second defect level.
  • The processing unit 32 determines whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online, specifically by determining whether the first defect level is the same as the second defect level, and, when the first defect level is different from the second defect level, determining that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • In an optional implementation, the obtaining unit 31 obtains the performance defect feedback result of the second application, specifically by sending a data acquisition request to the server corresponding to the second application, where the data acquisition request is used to request the server corresponding to the second application to feed back the performance defect feedback result of the second application, and by receiving the performance defect feedback result of the second application sent by the server corresponding to the second application.
  • It can be seen that, in the embodiment shown in FIG. 3, the electronic device can train a preset model with the test reference data of the first application to obtain a performance prediction model, and use the performance prediction model to analyze the performance test result of the second application to obtain the target performance test evaluation corresponding to the performance test result of the second application and the performance defect prediction result after the second application goes online, so that the performance test results of applications are analyzed automatically and intelligently, thereby improving the analysis efficiency of the performance test results.
  • FIG. 4 is a schematic structural diagram of an electronic device provided by an embodiment of this application.
  • the electronic device described in this embodiment may include: one or more processors 1000 and memory 2000.
  • the electronic device may further include one or more input devices 3000 and one or more output devices 4000.
  • the processor 1000, the memory 2000, the input device 3000, and the output device 4000 may be connected to each other through a bus.
  • the input device 3000 and the output device 4000 may be standard wired or wireless communication interfaces.
  • The processor 1000 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • The memory 2000 may be a high-speed RAM memory, or a non-volatile memory, such as a magnetic disk memory.
  • The memory 2000 is used to store a set of program codes.
  • The input device 3000, the output device 4000, and the processor 1000 can call the program codes stored in the memory 2000. Specifically:
  • The processor 1000 is configured to: obtain test reference data of a first application, where the first application is an online application, and the test reference data of the first application includes the performance test result of the first application, the performance test evaluation corresponding to the performance test result of the first application, and the performance defect feedback result after the first application goes online; train a preset model with the test reference data of the first application to obtain a performance prediction model; obtain a performance test result of a second application, where the second application is an application that has not yet gone online; and analyze the performance test result of the second application with the performance prediction model to obtain a target performance test evaluation corresponding to the performance test result of the second application and a performance defect prediction result after the second application goes online.
  • Optionally, the processor 1000 is further configured to: when the online duration of the second application exceeds a preset duration, obtain the performance defect feedback result after the second application goes online; determine whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online; and, when the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online, train the performance prediction model with the test reference data of the second application to correct the performance prediction model; the test reference data of the second application includes: the performance test result of the second application, the performance test evaluation corresponding to the performance test result of the second application, and the performance defect feedback result after the second application goes online.
  • Optionally, the processor 1000 determines whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online, specifically by: if the performance defect prediction result indicates that the second application will have a performance defect after going online and the performance defect feedback result indicates that the second application has no performance defect after going online, determining that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online; or, if the performance defect prediction result indicates that the second application will not have a performance defect after going online and the performance defect feedback result indicates that the second application has a performance defect after going online, determining that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • Optionally, the performance defect prediction result after the second application goes online includes a first defect category, and the performance defect feedback result after the second application goes online includes a second defect category.
  • The processor 1000 determines whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online, specifically by determining whether the first defect category is the same as the second defect category, and, when the first defect category is different from the second defect category, determining that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • Optionally, the processor 1000 is further configured to query whether a preset defect category set includes the second defect category, where the defect category set includes a plurality of preset defect categories; when the defect category set includes the second defect category, the operation of determining whether the first defect category is the same as the second defect category is triggered.
  • Optionally, the performance defect prediction result after the second application goes online includes a first defect level, and the performance defect feedback result after the second application goes online includes a second defect level.
  • The processor 1000 determines whether the performance defect prediction result after the second application goes online is consistent with the performance defect feedback result after the second application goes online, specifically by determining whether the first defect level is the same as the second defect level, and, when the first defect level is different from the second defect level, determining that the performance defect prediction result after the second application goes online is inconsistent with the performance defect feedback result after the second application goes online.
  • Optionally, the processor 1000 obtains the performance defect feedback result of the second application, specifically by sending, through the output device 4000, a data acquisition request to the server corresponding to the second application, where the data acquisition request is used to request the server corresponding to the second application to feed back the performance defect feedback result of the second application, and by receiving, through the input device 3000, the performance defect feedback result of the second application sent by the server corresponding to the second application.
  • In specific implementation, the processor 1000, the input device 3000, and the output device 4000 described in the embodiments of the present application can perform the implementations described in the embodiments of FIG. 1 to FIG. 2, as well as the other implementations described in the embodiments of the present application; details are not repeated here.
  • the functional modules in the various embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
  • The above-mentioned integrated modules can be implemented in the form of hardware, or in the form of software functional modules.
  • A person of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be implemented by a computer program instructing relevant hardware; the program can be stored in a computer non-volatile readable storage medium, and when the program is executed, it may include the processes of the above method embodiments.
  • the computer non-volatile readable storage medium may be a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The embodiments of this application disclose a test result analysis method and related device based on intelligent decision-making. The method includes: obtaining test reference data of a first application, where the first application is an application that has already gone online; the test reference data of the first application includes the performance test result of the first application, the performance test evaluation corresponding to the performance test result of the first application, and the performance defect feedback result after the first application goes online; training a preset model with the test reference data of the first application to obtain a performance prediction model; obtaining a performance test result of a second application, where the second application is an application that has not yet gone online; and analyzing the performance test result of the second application with the performance prediction model to obtain a target performance test evaluation corresponding to the performance test result of the second application and a performance defect prediction result after the second application goes online. This application can improve the efficiency of analyzing test results.

Description

基于智能决策的测试结果分析方法及相关装置
本申请要求于2019年05月20日提交中国专利局、申请号为2019104208389、申请名称为“基于智能决策的测试结果分析方法及相关装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及计算机技术领域,尤其涉及一种基于智能决策的测试结果分析方法及相关装置。
背景技术
性能测试在把控应用质量的过程中起着十分重要的作用。通常来说,在针对某应用的性能测试过程中,通常会由相关技术人员按照经验来分析应用的性能测试结果,其分析效率较为低下。因此,如何提高对性能测试结果的分析效率成为亟待解决的问题。
发明内容
本申请实施例提供了一种基于智能决策的测试结果分析方法及相关装置,可以自动化智能化地对性能测试结果进行分析,从而提高对性能测试结果的分析效率。
第一方面,本申请实施例提供了一种基于智能决策的测试结果分析方法,包括:
获取第一应用的测试参考数据,所述第一应用为已上线应用;所述第一应用的测试参考数据包括所述第一应用的性能测试结果、与所述第一应用的性能测试结果对应的性能测试评价和所述第一应用上线后的性能缺陷反馈结果;
利用所述第一应用的测试参考数据训练预设模型,得到性能预测模型;
获取第二应用的性能测试结果,所述第二应用为未上线应用;
利用所述性能预测模型对所述第二应用的性能测试结果进行分析,得到与所述第二应用的性能测试结果对应的目标性能测试评价、所述第二应用上线后的性能缺陷预测结果。
第二方面,本申请实施例提供了一种基于智能决策的测试结果分析装置,包括:
获取单元,用于获取第一应用的测试参考数据,所述第一应用为已上线应用;所述第一应用的测试参考数据包括所述第一应用的性能测试结果、与所述第一应用的性能测试结果对应的性能测试评价和所述第一应用上线后的性能缺陷反馈结果;
处理单元,用于利用所述第一应用的测试参考数据训练预设模型,得到性能预测模型;
所述获取单元,还用于获取第二应用的性能测试结果,所述第二应用为未上线应用;
所述处理单元,还用于利用所述性能预测模型对所述第二应用的性能测试结果进行分析,得到与所述第二应用的性能测试结果对应的目标性能测试评价、所述第二应用上线后的性能缺陷预测结果。
第三方面,本申请实施例提供了一种电子设备,包括处理器和存储器,所述处理器和存储器相互连接,其中,所述存储器用于存储计算机程序,所述计算机程序包括程序指令,所述处理器被配置用于调用所述程序指令,执行如第一方面所述的方法。
第四方面,本申请实施例提供了一种计算机非易失性可读存储介质,所述计算机非易失性可读存储介质存储有计算机程序,所述计算机程序包括程序指令,所述程序指令当被处理器执行时使所述处理器执行如第一方面所述的方法。
综上所述,电子设备可以利用该第一应用的测试参考数据训练预设模型,得到性能预测模型,并利用该性能预测模型对该第二应用的性能测试结果进行分析,得到与该第二应用的性能测试结果对应的目标性能测试评价、该第二应用上线后的性能缺陷预测结果,以自动化智能化地分析应用的性能测试结果,从而提高对性能测试结果的分析效率。
附图说明
图1是本申请实施例提供的一种基于智能决策的测试结果分析方法的流程示意图;
图2是本申请实施例提供的另一种基于智能决策的测试结果分析方法的流程示意图;
图3是本申请实施例提供的一种基于智能决策的测试结果分析装置的结构示意图;
图4是本申请实施例提供的一种电子设备的结构示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。
请参阅图1,为本申请实施例提供的一种基于智能决策的测试结果分析方法的流程示意图。该方法可以应用于电子设备,该电子设备可以为服务器或终端。具体的,该方法可以包括以下步骤:
S101、获取第一应用的测试参考数据,所述第一应用为已上线应用。
需要说明的是,该第一应用可以为一个应用,或为多个应用的统称。该第一应用的测 试参考数据包括该第一应用的性能测试结果、与该第一应用的性能测试结果对应的性能测试评价和该第一应用上线后的性能缺陷反馈结果。
其中,该第一应用的性能测试结果为针对该第一应用的性能测试指标的测试结果。例如,该性能测试指标为内存占用率,该内存占用率的测试结果可以为内存占用率30%。当该第一应用为一个应用时,该第一应用的性能测试指标为至少一个,该至少一个为一个或多个。相应地,该第一应用的性能测试结果也为至少一个。当该第一应用为多个应用的统称时,该第一应用中每个应用的性能测试指标为至少一个。每个应用的性能测试指标根据实际的测试需求可以相同或不同。相应地,每个应用的性能测试结果也为至少一个。
在一个实施例中,该第一应用的性能测试指标可以为在预设测试场景下对该第一应用进行性能测试的指标。例如,该第一应用为应用A,该应用A的性能测试指标为基于测试场景1对应用A进行性能测试的指标。或,该第一应用为应用A和应用B,该应用A的性能测试指标为基于测试场景1对应用A进行性能测试的指标,该应用B的性能测试指标为基于测试场景1对应用B进行性能测试的指标。或,该应用B的性能测试指标为基于测试场景2对应用B进行性能测试的指标。
或,该第一应用包括应用A和应用B,则该第一应用的测试参考数据包括的应用A的测试参考数据,是基于测试场景1对第一应用进行性能测试后得到的,该第二应用的测试参考数据包括的应用B的测试参考数据,是基于测试场景1对第二应用的测试场景1对第一应用进行性能测试后得到的。
其中,该第一应用的性能测试评价可以用于表征该第一应用的性能测试是否通过。在一个实施例中,该性能测试评价包括但不限于以文字、数字等多种形式体现。在一个实施例中,当该第一应用为一个应用,该第一应用的性能测试结果为至少一个时,该第一应用的性能测试评价可以是对该第一应用的至少一个性能测试结果进行综合分析,得到的一个综合评价;或还可以是对该第一应用的至少一个性能测试结果中每个性能测试结果分别进行分析,得到的每个性能测试结果对应的评价。当该第一应用为多个应用的统称时,该第一应用的性能测试评价可以包括对每个应用的至少性能测试结果进行综合分析,得到的每个应用的综合评价;或还可以是对每个应用的至少一个性能测试结果中每个性能测试结果分别进行分析,得到的每个应用的每个性能测试结果对应的评价。
其中,该第一应用的性能缺陷反馈结果指示该第一应用在上线后是否发生了性能缺陷。在一个实施例中,该第一应用的性能缺陷反馈结果可以包括对应的缺陷类别,和/或对应的 缺陷等级。该缺陷等级包括但不限于以数字或文字形式体现。
在一个实施例中,当该电子设备存储了该第一应用的测试参考数据时,可以从本地获取该第一应用的测试参考数据。
或,当该电子设备未存储该第一应用的测试参考数据时,可以从该第一应用对应的服务器获取该第一应用的测试参考数据。
在一个实施例中,电子设备从该第一应用对应的服务器获取该第一应用的测试参考数据,可以包括:电子设备接收该第一应用对应的服务器发送的该第一应用的测试参考数据;或,电子设备发送第一数据获取请求(用于请求该第一应用对应的服务器反馈该第一应用的测试参考数据)至该第一应用对应的服务器,并接收由该第一应用对应的服务器发送的该第一应用的性能缺陷反馈结果。
在一个实施例中,前述第一应用对应的服务器,可以为该第一应用对应的应用服务器,或为该第一应用对应的测试服务器。
S102、利用所述第一应用的测试参考数据训练预设模型,得到性能预测模型。
本申请实施例中,电子设备可以利用第一应用的测试参考数据训练预设模型,得到性能预测模型。
在一个实施例中,电子设备利用该第一应用的测试参考数据训练预设模型,得到性能预测模型,可以包括:电子设备将该第一应用的测试参考数据作为预设模型的输入数据,以训练该预设模型,并将训练后的预设模型作为性能预测模型。
在一个实施例中,该预设模型可以为卷积神经网络模型,或还可以为决策树模型等模型。
S103、获取第二应用的性能测试结果,所述第二应用为未上线应用。
本申请实施例中,电子设备可以获取第二应用的性能测试结果。其中,该第二应用的性能测试结果为针对该第二应用的性能测试指标的测试结果。在一个实施例中,该第二应用的性能测试指标可以为在该预设测试场景下对该第二应用进行性能测试的指标。
在一个实施例中,当该电子设备存储了该第二应用的性能测试结果时,可以从本地获取该第二应用的性能测试结果。
或,当该电子设备未存储该第二应用的性能测试结果时,可以从该第二应用对应的服务器获取该第二应用的性能测试结果。
在一个实施例中,电子设备从该第二应用对应的服务器获取该第二应用的测试参考数 据,可以包括:电子设备接收该第二应用对应的服务器发送的该第二应用的性能测试结果;或,电子设备发送第二数据获取请求(用于请求该第二应用对应的服务器反馈该第二应用的性能测试结果)至该第二应用对应的服务器,并接收由该第二应用对应的服务器发送的该第二应用的性能测试结果。
在一个实施例中,前述第二应用对应的对应的服务器,可以为该第二应用对应的应用服务器,或为该第二应用对应的测试服务器。
S104、利用所述性能预测模型对所述第二应用的性能测试结果进行分析,得到与所述第二应用的性能测试结果对应的目标性能测试评价、所述第二应用上线后的性能缺陷预测结果。
本申请实施例中,电子设备可以利用该性能预测模型对该第二应用的性能测试结果进行分析,得到与该第二应用的性能测试结果对应的目标性能测试评价,该第二应用上线后的性能缺陷预测结果。
在一个实施例中,电子设备利用该性能预测模型对该第二应用的性能测试结果进行分析,得到与该第二应用的性能测试结果对应的目标性能测试评价,该第二应用上线后的性能缺陷预测结果,可以包括:电子设备将该第二应用的性能测试结果输入到该性能预测模型中进行分析,并经由该性能预测模型输出与该第二应用的性能测试结果对应的目标性能测试评价、该第二应用上线后的性能缺陷预测结果。
可见,图1所示的实施例中,电子设备可以利用该第一应用的测试参考数据训练预设模型,得到性能预测模型,并利用该性能预测模型对该第二应用的性能测试结果进行分析,得到与该第二应用的性能测试结果对应的目标性能测试评价、该第二应用上线后的性能缺陷预测结果,以自动化智能化地分析应用的性能测试结果,从而提高对性能测试结果的分析效率。
请参阅图2,为本申请实施例提供的另一种基于智能决策的测试结果分析方法的流程示意图。该方法可以应用于电子设备,该电子设备可以为服务器或终端。具体的,该方法可以包括以下步骤:
S201、获取第一应用的测试参考数据,所述第一应用为已上线应用;
S202、利用所述第一应用的测试参考数据训练预设模型,得到性能预测模型。
S203、获取第二应用的性能测试结果,所述第二应用为未上线应用。
S204、利用所述性能预测模型对所述第二应用的性能测试结果进行分析,得到与所述第二应用的性能测试结果对应的目标性能测试评价、所述第二应用上线后的性能缺陷预测结果。
其中,步骤S201-S204可参见图1实施例中的步骤S101-S104,本申请实施例中在此不做赘述。
S205、当所述第二应用的上线时长超过预设时长时,获取所述第二应用上线后的性能缺陷反馈结果。
本申请实施例中,为了有效地监控第二应用的性能缺陷,可以当第二应用的上线时间超过预设时间阈值时,获取该第二应用上线后的性能缺陷反馈结果。
在一个实施例中,电子设备获取该第二应用的性能缺陷反馈结果,可以包括:电子设备发送数据获取请求至第二应用对应的服务器;该数据获取请求用于请求该第二应用对应的服务器反馈该第二应用的性能缺陷反馈结果;接收该第二应用对应的服务器发送的该第二应用的性能缺陷反馈结果。
在一个实施例中,电子设备可以记录该第二应用的上线时间,并从该上线时间开始统计该第二应用的上线时长,以便在该第二应用的上线时长超过预设时长时,获取该第二应用上线后的性能缺陷反馈结果。
在一个实施例中,该第二应用对应的服务器可以记录该第二应用的上线时间,并从该上线时间开始统计该第二应用的上线时长,以便电子设备在该上线时长超过预设时长时,获取该第二应用上线后的性能缺陷反馈结果。
S206、判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致。
S207、当所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致时,利用所述第二应用的测试参考数据,训练所述性能预测模型,以对所述性能预测模型进行修正。
本申请实施例,电子设备可以判断该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果是否一致,当该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果不一致时,电子设备可以利用该第二应用的测试参考数据,训练该性能预测模型,以对该性能预测模型进行修正。本申请实施例通过对该性能预测模型进行修正,可以达到优化该性能预测模型的目的,有利于提高对应用的性能测试结 果的分析准确度。其中,该第二应用的测试参考数据包括:该第二应用的性能测试结果、与该第二应用的性能测试结果对应的性能测试评价和该第二应用上线后的性能缺陷反馈结果。
在一个实施例中,电子设备判断该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果是否一致,可以包括:若该性能缺陷预测结果指示该第二应用在上线后会发生性能缺陷,且该性能缺陷反馈结果指示该第二应用在上线后未发生性能缺陷,则电子设备确定该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果不一致;或,若该性能缺陷预测结果指示该第二应用在上线后不会发生性能缺陷,且该性能缺陷反馈结果指示该第二应用在上线后发生了性能缺陷,则电子设备确定该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果不一致。例如,该性能缺陷预测结果可以为通过或不通过,或还可以为表征是否通过的数字,如可以为0或1,其中,0表示不通过,1表示通过。
在一个实施例中,该第二应用上线后的性能缺陷预测结果包括第一缺陷类别,该第二应用上线后的性能缺陷反馈结果包括第二缺陷类别,电子设备判断该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果是否一致,包括:电子设备判断该第一缺陷类别与该第二缺陷类别是否相同;当该第一缺陷类别与该第二缺陷类别不相同时,电子设备确定该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果不一致。其中,该第一缺陷类别是指预测的缺陷类别,该第二缺陷类别是指反馈的缺陷类别。
在一个实施例中,电子设备可以建立性能测试指标与缺陷类别的对应关系,根据该对应关系,电子设备可以查询出该第二应用的性能测试结果对应的性能测试指标,所对应的缺陷类别;电子设备可以判断该查询出的缺陷类别是否包括该第二缺陷类别;当该查询出的缺陷类别包括该第二缺陷类别时,触发所述判断该第一缺陷类别与该第二缺陷类别是否相同的步骤。
或者,电子设备查询预设的缺陷类别集合是否包括所述第二缺陷类别;所述缺陷类别集合包括多个预设的缺陷类别;当所述缺陷类别集合包括所述第二缺陷类别时,电子设备触发所述判断所述第一缺陷类别与所述第二缺陷类别是否相同的步骤。
在一个实施例中,该第二应用上线后的性能缺陷预测结果可以包括第一缺陷描述信息,该第二应用上线后的性能缺陷反馈结果可以包括第二缺陷描述信息;电子设备判断该第二 应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果是否一致,包括:电子设备判断该第一缺陷描述信息与该第二缺陷描述信息是否相同;当该第一缺陷类别与该第二缺陷类别不相同时,电子设备确定该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果不一致。其中,该第一缺陷描述信息是指预测的缺陷描述信息,该第二缺陷描述信息是指反馈的缺陷描述信息。每个缺陷类别可以对应至少一个缺陷描述信息。
在一个实施例中,该第二应用上线后的性能缺陷预测结果包括第一缺陷等级,该第二应用上线后的性能缺陷反馈结果包括第二缺陷等级,该判断该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果是否一致,包括:判断该第一缺陷等级与该第二缺陷等级是否相同;当该第一缺陷等级与该第二缺陷等级不相同时,确定该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果不一致。例如,缺陷等级可以包括一级、二级、三级,其中,一级指示的性能缺陷的严重程度低于二级、二级指示的性能缺陷的严重程度低于三级。
在一个实施例中,电子设备还可以获取该第一应用的测试参考数据、该第二应用的测试参考数据,训练指定模型,得到新的性能预测模型,以用于后续对应用的性能测试结果的分析过程。在一个实施例中,该指定模型为前述预设模型。
可见,图2所示的实施例中,电子设备可以获取第二应用的性能缺陷反馈结果,并判断该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果是否一致,当该第二应用上线后的性能缺陷预测结果与该第二应用上线后的性能缺陷反馈结果不一致时,可以利用该第二应用的测试参考数据,训练该性能预测模型,以对该性能预测模型进行修正,从而提高模型预测准确度。
请参阅图3,为本申请实施例提供的一种基于智能决策的测试结果分析装置的结构示意图,该装置可以应用于电子设备。具体的,该装置可以包括:
获取单元31,用于获取第一应用的测试参考数据,所述第一应用为已上线应用;所述第一应用的测试参考数据包括所述第一应用的性能测试结果、与所述第一应用的性能测试结果对应的性能测试评价和所述第一应用上线后的性能缺陷反馈结果;
处理单元32,用于利用所述第一应用的测试参考数据训练预设模型,得到性能预测模型;
所述获取单元31,还用于获取第二应用的性能测试结果,所述第二应用为未上线应用;
所述处理单元32,还用于利用所述性能预测模型对所述第二应用的性能测试结果进行分析,得到与所述第二应用的性能测试结果对应的目标性能测试评价、所述第二应用上线后的性能缺陷预测结果。
在一种可选的实施方式中,获取单元31,还用于当所述第二应用的上线时长超过预设时长时,获取所述第二应用上线后的性能缺陷反馈结果。
在一种可选的实施方式中,处理单元32,还用于判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致;当所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致时,利用所述第二应用的测试参考数据,训练所述性能预测模型,以对所述性能预测模型进行修正;其中,所述第二应用的测试参考数据包括:所述第二应用的性能测试结果、与所述第二应用的性能测试结果对应的性能测试评价和所述第二应用上线后的性能缺陷反馈结果。
在一种可选的实施方式中,处理单元32判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为若所述性能缺陷预测结果指示所述第二应用在上线后会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后未发生性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致;或,若所述性能缺陷预测结果指示所述第二应用在上线后不会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后发生了性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
在一种可选的实施方式中,所述第二应用上线后的性能缺陷预测结果包括第一缺陷类别,所述第二应用上线后的性能缺陷反馈结果包括第二缺陷类别,处理单元32判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为判断所述第一缺陷类别与所述第二缺陷类别是否相同;当所述第一缺陷类别与所述第二缺陷类别不相同时,确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
在一种可选的实施方式中,处理单元32,还用于查询预设的缺陷类别集合是否包括所述第二缺陷类别;所述缺陷类别集合包括多个预设的缺陷类别;当所述缺陷类别集合包括所述第二缺陷类别时,触发所述判断所述第一缺陷类别与所述第二缺陷类别是否相同的操 作。
在一种可选的实施方式中,所述第二应用上线后的性能缺陷预测结果包括第一缺陷等级,所述第二应用上线后的性能缺陷反馈结果包括第二缺陷等级,处理单元32判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为判断所述第一缺陷等级与所述第二缺陷等级是否相同;当所述第一缺陷等级与所述第二缺陷等级不相同时,确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
在一种可选的实施方式中,获取单元31获取所述第二应用的性能缺陷反馈结果,具体为发送数据获取请求至第二应用对应的服务器;所述数据获取请求用于请求所述第二应用对应的服务器反馈所述第二应用的性能缺陷反馈结果;接收所述第二应用对应的服务器发送的所述第二应用的性能缺陷反馈结果。
可见,图3所示的实施例中,电子设备可以利用该第一应用的测试参考数据训练预设模型,得到性能预测模型,并利用该性能预测模型对该第二应用的性能测试结果进行分析,得到与该第二应用的性能测试结果对应的目标性能测试评价、该第二应用上线后的性能缺陷预测结果,以自动化智能化地分析应用的性能测试结果,从而提高对性能测试结果的分析效率。
请参阅图4,为本申请实施例提供的一种电子设备的结构示意图。其中,本实施例中所描述的电子设备可以包括:一个或多个处理器1000和存储器2000。可选地,该电子设备还可以包括一个或多个输入设备3000、一个或多个输出设备4000。处理器1000、存储器2000、输入设备3000、输出设备4000相互之间可以通过总线连接。
输入设备3000、输出设备4000可以是标准的有线或无线通信接口。
处理器1000可以是中央处理模块(Central Processing Unit,CPU),该处理器还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现成可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
存储器2000可以是高速RAM存储器,也可为非不稳定的存储器(non-volatile memory),例如磁盘存储器。存储器2000用于存储一组程序代码,输入设备3000、输出设备4000和 处理器1000可以调用存储器2000中存储的程序代码。具体地:
处理器1000,用于获取第一应用的测试参考数据,所述第一应用为已上线应用;所述第一应用的测试参考数据包括所述第一应用的性能测试结果、与所述第一应用的性能测试结果对应的性能测试评价和所述第一应用上线后的性能缺陷反馈结果;利用所述第一应用的测试参考数据训练预设模型,得到性能预测模型;获取第二应用的性能测试结果,所述第二应用为未上线应用;利用所述性能预测模型对所述第二应用的性能测试结果进行分析,得到与所述第二应用的性能测试结果对应的目标性能测试评价、所述第二应用上线后的性能缺陷预测结果。
可选地,处理器1000,还用于当所述第二应用的上线时长超过预设时长时,获取所述第二应用上线后的性能缺陷反馈结果;判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致;当所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致时,利用所述第二应用的测试参考数据,训练所述性能预测模型,以对所述性能预测模型进行修正;其中,所述第二应用的测试参考数据包括:所述第二应用的性能测试结果、与所述第二应用的性能测试结果对应的性能测试评价和所述第二应用上线后的性能缺陷反馈结果。
可选地,处理器1000判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为若所述性能缺陷预测结果指示所述第二应用在上线后会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后未发生性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致;或,若所述性能缺陷预测结果指示所述第二应用在上线后不会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后发生了性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
可选地,所述第二应用上线后的性能缺陷预测结果包括第一缺陷类别,所述第二应用上线后的性能缺陷反馈结果包括第二缺陷类别,处理器1000判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为判断所述第一缺陷类别与所述第二缺陷类别是否相同;当所述第一缺陷类别与所述第二缺陷类别不相同时,确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
可选地,处理器1000,还用于查询预设的缺陷类别集合是否包括所述第二缺陷类别; 所述缺陷类别集合包括多个预设的缺陷类别;当所述缺陷类别集合包括所述第二缺陷类别时,触发所述判断所述第一缺陷类别与所述第二缺陷类别是否相同的操作。
可选地,所述第二应用上线后的性能缺陷预测结果包括第一缺陷等级,所述第二应用上线后的性能缺陷反馈结果包括第二缺陷等级,处理器1000判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为判断所述第一缺陷等级与所述第二缺陷等级是否相同;当所述第一缺陷等级与所述第二缺陷等级不相同时,确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
可选地,处理器1000获取所述第二应用的性能缺陷反馈结果,具体为通过输出设备4000发送数据获取请求至第二应用对应的服务器;所述数据获取请求用于请求所述第二应用对应的服务器反馈所述第二应用的性能缺陷反馈结果;通过输入设备3000接收所述第二应用对应的服务器发送的所述第二应用的性能缺陷反馈结果。
具体实现中,本申请实施例中所描述的处理器1000、输入设备3000、输出设备4000可执行图1-图2实施例所描述的实现方式,也可执行本申请实施例所描述的实现方式,在此不再赘述。
在本申请各个实施例中的各功能模块可以集成在一个处理模块中,也可以是各个模块单独物理存在,也可以是两个或两个以上模块集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一计算机非易失性可读存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,所述的计算机非易失性可读存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)或随机存储记忆体(Random Access Memory,RAM)等。
以上所揭露的仅为本申请一种较佳实施例而已,当然不能以此来限定本申请之权利范围,本领域普通技术人员可以理解实现上述实施例的全部或部分流程,并依本申请权利要求所作的等同变化,仍属于本申请所涵盖的范围。

Claims (20)

  1. 一种基于智能决策的测试结果分析方法,其特征在于,包括:
    获取第一应用的测试参考数据,所述第一应用为已上线应用;所述第一应用的测试参考数据包括所述第一应用的性能测试结果、与所述第一应用的性能测试结果对应的性能测试评价和所述第一应用上线后的性能缺陷反馈结果;
    利用所述第一应用的测试参考数据训练预设模型,得到性能预测模型;
    获取第二应用的性能测试结果,所述第二应用为未上线应用;
    利用所述性能预测模型对所述第二应用的性能测试结果进行分析,得到与所述第二应用的性能测试结果对应的目标性能测试评价、所述第二应用上线后的性能缺陷预测结果。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    当所述第二应用的上线时长超过预设时长时,获取所述第二应用上线后的性能缺陷反馈结果;
    判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致;
    当所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致时,利用所述第二应用的测试参考数据,训练所述性能预测模型,以对所述性能预测模型进行修正;
    其中,所述第二应用的测试参考数据包括:所述第二应用的性能测试结果、与所述第二应用的性能测试结果对应的性能测试评价和所述第二应用上线后的性能缺陷反馈结果。
  3. 根据权利要求2所述的方法,其特征在于,所述判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,包括:
    若所述性能缺陷预测结果指示所述第二应用在上线后会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后未发生性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致;或,
    若所述性能缺陷预测结果指示所述第二应用在上线后不会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后发生了性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
  4. 根据权利要求2所述的方法,其特征在于,所述第二应用上线后的性能缺陷预测结果包括第一缺陷类别,所述第二应用上线后的性能缺陷反馈结果包括第二缺陷类别,所述判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,包括:
    判断所述第一缺陷类别与所述第二缺陷类别是否相同;
    当所述第一缺陷类别与所述第二缺陷类别不相同时,确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
  5. 根据权利要求4所述的方法,其特征在于,所述方法还包括:
    查询预设的缺陷类别集合是否包括所述第二缺陷类别;所述缺陷类别集合包括多个预设的缺陷类别;
    当所述缺陷类别集合包括所述第二缺陷类别时,触发所述判断所述第一缺陷类别与所述第二缺陷类别是否相同的步骤。
  6. 根据权利要求4所述的方法,其特征在于,所述方法还包括:
    根据预设的性能测试指标与缺陷类别的对应关系,查询出所述第二应用的性能测试结果对应的测试指标,所对应的缺陷类别;
    判断查询出的缺陷类别是否包括所述第二缺陷类别;
    当所述查询出的缺陷类别包括所述第二缺陷类别时,触发所述判断所述第一缺陷类别与所述第二缺陷类别是否相同的步骤。
  7. 根据权利要求2所述的方法,其特征在于,所述第二应用上线后的性能缺陷预测结果包括第一缺陷等级,所述第二应用上线后的性能缺陷反馈结果包括第二缺陷等级,所述判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,包括:
    判断所述第一缺陷等级与所述第二缺陷等级是否相同;
    当所述第一缺陷等级与所述第二缺陷等级不相同时,确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
  8. 根据权利要求2-7任意一项所述的方法,其特征在于,所述获取所述第二应用的性能缺陷反馈结果,包括:
    发送数据获取请求至第二应用对应的服务器;所述数据获取请求用于请求所述第二应用对应的服务器反馈所述第二应用的性能缺陷反馈结果;
    接收所述第二应用对应的服务器发送的所述第二应用的性能缺陷反馈结果。
  9. 一种基于智能决策的测试结果分析装置,其特征在于,包括:
    获取单元,用于获取第一应用的测试参考数据,所述第一应用为已上线应用;所述第一应用的测试参考数据包括所述第一应用的性能测试结果、与所述第一应用的性能测试结果对应的性能测试评价和所述第一应用上线后的性能缺陷反馈结果;
    处理单元,用于利用所述第一应用的测试参考数据训练预设模型,得到性能预测模型;
    所述获取单元,还用于获取第二应用的性能测试结果,所述第二应用为未上线应用;
    所述处理单元,还用于利用所述性能预测模型对所述第二应用的性能测试结果进行分析,得到与所述第二应用的性能测试结果对应的目标性能测试评价、所述第二应用上线后的性能缺陷预测结果。
  10. 根据权利要求9所述的装置,其特征在于,
    所述获取单元,还用于当所述第二应用的上线时长超过预设时长时,获取所述第二应用上线后的性能缺陷反馈结果;
    所述处理单元,还用于判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致;当所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致时,利用所述第二应用的测试参考数据,训练所述性能预测模型,以对所述性能预测模型进行修正;其中,所述第二应用的测试参考数据包括:所述第二应用的性能测试结果、与所述第二应用的性能测试结果对应的性能测试评价和所述第二应用上线后的性能缺陷反馈结果。
  11. 根据权利要求10所述的装置,其特征在于,所述处理单元判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为若所述性能缺陷预测结果指示所述第二应用在上线后会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后未发生性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致;或,若所述性能缺陷预测结果指示所述第二应用在上线后不会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后发生了性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
  12. 根据权利要求10所述的装置,其特征在于,所述第二应用上线后的性能缺陷预测结果包括第一缺陷类别,所述第二应用上线后的性能缺陷反馈结果包括第二缺陷类别,所 述处理单元判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为判断所述第一缺陷类别与所述第二缺陷类别是否相同;当所述第一缺陷类别与所述第二缺陷类别不相同时,确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
  13. 根据权利要求12所述的装置,其特征在于,所述处理单元,还用于查询预设的缺陷类别集合是否包括所述第二缺陷类别;所述缺陷类别集合包括多个预设的缺陷类别;当所述缺陷类别集合包括所述第二缺陷类别时,触发所述判断所述第一缺陷类别与所述第二缺陷类别是否相同的操作。
  14. 根据权利要求12所述的装置,其特征在于,所述处理单元,还用于根据预设的性能测试指标与缺陷类别的对应关系,查询出所述第二应用的性能测试结果对应的测试指标,所对应的缺陷类别;判断查询出的缺陷类别是否包括所述第二缺陷类别;当所述查询出的缺陷类别包括所述第二缺陷类别时,触发所述判断所述第一缺陷类别与所述第二缺陷类别是否相同的操作。
  15. 根据权利要求10所述的装置,其特征在于,所述第二应用上线后的性能缺陷预测结果包括第一缺陷等级,所述第二应用上线后的性能缺陷反馈结果包括第二缺陷等级,所述处理单元判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为判断所述第一缺陷等级与所述第二缺陷等级是否相同;当所述第一缺陷等级与所述第二缺陷等级不相同时,确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
  16. 根据权利要求10-15任意一项所述的装置,其特征在于,所述获取单元获取所述第二应用的性能缺陷反馈结果,具体为发送数据获取请求至第二应用对应的服务器;所述数据获取请求用于请求所述第二应用对应的服务器反馈所述第二应用的性能缺陷反馈结果;接收所述第二应用对应的服务器发送的所述第二应用的性能缺陷反馈结果。
  17. 一种电子设备,其特征在于,包括处理器和存储器,所述处理器和存储器相互连接,其中,所述存储器用于存储计算机程序,所述计算机程序包括程序指令,所述处理器被配置用于调用所述程序指令,执行:
    获取第一应用的测试参考数据,所述第一应用为已上线应用;所述第一应用的测试参考数据包括所述第一应用的性能测试结果、与所述第一应用的性能测试结果对应的性能测试评价和所述第一应用上线后的性能缺陷反馈结果;
    利用所述第一应用的测试参考数据训练预设模型,得到性能预测模型;
    获取第二应用的性能测试结果,所述第二应用为未上线应用;
    利用所述性能预测模型对所述第二应用的性能测试结果进行分析,得到与所述第二应用的性能测试结果对应的目标性能测试评价、所述第二应用上线后的性能缺陷预测结果。
  18. 根据权利要求17所述的电子设备,其特征在于,所述处理器,还用于当所述第二应用的上线时长超过预设时长时,获取所述第二应用上线后的性能缺陷反馈结果;判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致;当所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致时,利用所述第二应用的测试参考数据,训练所述性能预测模型,以对所述性能预测模型进行修正;其中,所述第二应用的测试参考数据包括:所述第二应用的性能测试结果、与所述第二应用的性能测试结果对应的性能测试评价和所述第二应用上线后的性能缺陷反馈结果。
  19. 根据权利要求18所述的电子设备,其特征在于,所述处理器判断所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果是否一致,具体为若所述性能缺陷预测结果指示所述第二应用在上线后会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后未发生性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致;或,若所述性能缺陷预测结果指示所述第二应用在上线后不会发生性能缺陷,且所述性能缺陷反馈结果指示所述第二应用在上线后发生了性能缺陷,则确定所述第二应用上线后的性能缺陷预测结果与所述第二应用上线后的性能缺陷反馈结果不一致。
  20. 一种计算机非易失性可读存储介质,其特征在于,所述计算机非易失性可读存储介质存储有计算机程序,所述计算机程序包括程序指令,所述程序指令当被处理器执行时使所述处理器执行如权利要求1-8任一项所述的方法。
PCT/CN2019/118452 2019-05-20 2019-11-14 基于智能决策的测试结果分析方法及相关装置 WO2020233021A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910420838.9A CN110232020A (zh) 2019-05-20 2019-05-20 基于智能决策的测试结果分析方法及相关装置
CN2019104208389 2019-05-20

Publications (1)

Publication Number Publication Date
WO2020233021A1 true WO2020233021A1 (zh) 2020-11-26

Family

ID=67861434

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/118452 WO2020233021A1 (zh) 2019-05-20 2019-11-14 基于智能决策的测试结果分析方法及相关装置

Country Status (2)

Country Link
CN (1) CN110232020A (zh)
WO (1) WO2020233021A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110232020A (zh) * 2019-05-20 2019-09-13 平安普惠企业管理有限公司 基于智能决策的测试结果分析方法及相关装置
CN111177011A (zh) * 2020-01-02 2020-05-19 腾讯科技(深圳)有限公司 软件免测的预测方法、装置、设备及存储介质
CN112540919B (zh) * 2020-12-08 2024-02-23 上海哔哩哔哩科技有限公司 测试设备确定方法及装置

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107832219A (zh) * 2017-11-13 2018-03-23 北京航空航天大学 基于静态分析和神经网络的软件故障预测技术的构建方法
US20180300227A1 (en) * 2017-04-12 2018-10-18 Salesforce.Com, Inc. System and method for detecting an error in software
CN109240929A (zh) * 2018-09-18 2019-01-18 百度在线网络技术(北京)有限公司 软件质量预测方法、装置、终端和计算机可读存储介质
US20190065343A1 (en) * 2017-08-29 2019-02-28 Fmr Llc Automated Log Analysis and Problem Solving Using Intelligent Operation and Deep Learning
CN109446090A (zh) * 2018-10-31 2019-03-08 南开大学 基于深度神经网络和概率决策森林的软件缺陷预测模型
CN109634833A (zh) * 2017-10-09 2019-04-16 北京京东尚科信息技术有限公司 一种软件缺陷预测方法和装置
CN110232020A (zh) * 2019-05-20 2019-09-13 平安普惠企业管理有限公司 基于智能决策的测试结果分析方法及相关装置

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105843743B (zh) * 2016-04-11 2018-10-02 南京邮电大学 一种特殊自动化测试用例实际输出结果正确性的验证方法
CN108874665A (zh) * 2018-05-29 2018-11-23 百度在线网络技术(北京)有限公司 一种测试结果校验方法、装置、设备和介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180300227A1 (en) * 2017-04-12 2018-10-18 Salesforce.Com, Inc. System and method for detecting an error in software
US20190065343A1 (en) * 2017-08-29 2019-02-28 Fmr Llc Automated Log Analysis and Problem Solving Using Intelligent Operation and Deep Learning
CN109634833A (zh) * 2017-10-09 2019-04-16 北京京东尚科信息技术有限公司 一种软件缺陷预测方法和装置
CN107832219A (zh) * 2017-11-13 2018-03-23 北京航空航天大学 基于静态分析和神经网络的软件故障预测技术的构建方法
CN109240929A (zh) * 2018-09-18 2019-01-18 百度在线网络技术(北京)有限公司 软件质量预测方法、装置、终端和计算机可读存储介质
CN109446090A (zh) * 2018-10-31 2019-03-08 南开大学 基于深度神经网络和概率决策森林的软件缺陷预测模型
CN110232020A (zh) * 2019-05-20 2019-09-13 平安普惠企业管理有限公司 基于智能决策的测试结果分析方法及相关装置

Also Published As

Publication number Publication date
CN110232020A (zh) 2019-09-13

Similar Documents

Publication Publication Date Title
WO2020233021A1 (zh) 基于智能决策的测试结果分析方法及相关装置
WO2021174694A1 (zh) 基于数据中心的运维监控方法、装置、设备及存储介质
AU2020200909A1 (en) Evaluation control
WO2019100576A1 (zh) 自动化测试管理方法、装置、终端设备及存储介质
WO2017107794A1 (zh) 风险识别方法及装置
CN110704231A (zh) 一种故障处理方法及装置
CN111309539A (zh) 一种异常监测方法、装置和电子设备
CN110647447B (zh) 用于分布式系统的异常实例检测方法、装置、设备和介质
CN109934433A (zh) 一种人员能力评估方法、装置及云服务平台
CN111611172A (zh) 项目测试缺陷分析方法、装置、设备及存储介质
WO2019052169A1 (zh) 坐席监控方法、装置、设备及计算机可读存储介质
CN111176953A (zh) 一种异常检测及其模型训练方法、计算机设备和存储介质
WO2022213565A1 (zh) 一种人工智能模型预测结果的复审方法及装置
CN108805332B (zh) 一种特征评估方法和装置
CN111222968A (zh) 一种企业税务风险管控方法及系统
WO2022134348A1 (zh) 一种监控软件开发过程的方法、装置、终端及存储介质
WO2018098670A1 (zh) 一种进行数据处理的方法和装置
CN112948262A (zh) 一种系统测试方法、装置、计算机设备和存储介质
CN105162931A (zh) 一种通信号码的分类方法及装置
CN112434717B (zh) 一种模型训练方法及装置
CN117149565A (zh) 云平台关键性能指标的状态检测方法、装置、设备及介质
CN111611973A (zh) 目标用户识别的方法、装置及存储介质
WO2023060954A1 (zh) 数据处理与数据质检方法、装置及可读存储介质
US20220327450A1 (en) Method for increasing or decreasing number of workers and inspectors in crowdsourcing-based project for creating artificial intelligence learning data
WO2023103344A1 (zh) 一种数据处理方法、装置、设备及存储介质

Legal Events

121 (Ep: the EPO has been informed by WIPO that EP was designated in this application): Ref document number: 19929778; Country of ref document: EP; Kind code of ref document: A1
NENP (Non-entry into the national phase): Ref country code: DE
122 (Ep: PCT application non-entry in European phase): Ref document number: 19929778; Country of ref document: EP; Kind code of ref document: A1
32PN (Ep: public notification in the EP bulletin as address of the addressee cannot be established): Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 18/03/2022)
122 (Ep: PCT application non-entry in European phase): Ref document number: 19929778; Country of ref document: EP; Kind code of ref document: A1