WO2019085061A1 - Fund system automated test management method, apparatus, device and storage medium - Google Patents

Fund system automated test management method, apparatus, device and storage medium

Info

Publication number
WO2019085061A1
Authority
WO
WIPO (PCT)
Prior art keywords
test
test case
target
cases
vulnerability
Prior art date
Application number
PCT/CN2017/112221
Other languages
English (en)
French (fr)
Inventor
伍朗
伍振亮
Original Assignee
平安科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 平安科技(深圳)有限公司
Publication of WO2019085061A1 publication Critical patent/WO2019085061A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3696Methods or tools to render software testable

Definitions

  • The present application relates to the field of automated testing technologies, and in particular to a fund system automated test management method, apparatus, device, and storage medium.
  • The purpose of the present application is to provide a fund system automated test management method, apparatus, device, and storage medium that realize resource sharing by integrating the test cases of different systems into one system.
  • The present application is implemented as follows. A first aspect of the present application provides a fund system automated test management method, which includes:
  • creating a test case database based on newly entered code and test cases stored in multiple test systems;
  • filtering the test case database according to input information, and recommending test cases after traversing the test case database;
  • obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case;
  • automatically generating a test report after execution of the target test case is completed.
  • A second aspect of the present application provides a fund system automated test management device, which includes:
  • a database creation module that creates a test case database based on newly entered code and test cases stored in multiple test systems;
  • a test case recommendation module configured to filter the test case database according to input information and recommend test cases after traversing the test case database;
  • a test case execution module configured to obtain a target test case from the recommended test cases according to a user's selection instruction, execute the target test case, and track the target test case;
  • a test report generation module configured to automatically generate a test report after the target test case has been executed.
  • A third aspect of the present application provides a terminal device including a memory, a processor, and computer readable instructions stored in the memory and executable on the processor, the processor implementing the following steps when executing the computer readable instructions:
  • creating a test case database based on newly entered code and test cases stored in multiple test systems;
  • filtering the test case database according to input information, and recommending test cases after traversing the test case database;
  • obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case;
  • automatically generating a test report after execution of the target test case is completed.
  • A fourth aspect of the present application provides a computer readable storage medium storing computer readable instructions that, when executed by a processor, implement the following steps:
  • creating a test case database based on newly entered code and test cases stored in multiple test systems;
  • filtering the test case database according to input information, and recommending test cases after traversing the test case database;
  • obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case;
  • automatically generating a test report after execution of the target test case is completed.
  • The present application provides a fund system automated test management method, apparatus, device, and storage medium. A test case database is created from newly entered code and the test cases stored in multiple test systems; the database is filtered according to input information and traversed to find the target test cases; the test cases are tracked during execution; and a test report is automatically generated after the test cases have been executed. This realizes resource sharing across the test systems and facilitates the monitoring and management of test projects, testers, and the defects of the systems under test.
  • FIG. 1 is a flowchart of a fund system automated test management method provided by an embodiment of the present application;
  • FIG. 2 is a flowchart of an implementation of step S10 in the fund system automated test management method provided by an embodiment of the present application;
  • FIG. 3 is a flowchart of step S20 in the fund system automated test management method provided by an embodiment of the present application;
  • FIG. 4 is a flowchart of another implementation of step S203 in the fund system automated test management method provided by an embodiment of the present application;
  • FIG. 5 is a schematic structural diagram of a fund system automated test management apparatus according to another embodiment of the present application;
  • FIG. 6 is a schematic structural diagram of an implementation of the database creation module in the fund system automated test management apparatus according to another embodiment of the present application;
  • FIG. 7 is a schematic structural diagram of a terminal device according to another embodiment of the present application.
  • An embodiment of the present application provides a fund system automated test management method. As shown in FIG. 1, the fund system automated test management method includes:
  • Step S10: Create a test case database based on newly entered code and test cases stored in multiple test systems.
  • In step S10, one way to create the test case database is to build it from the test cases and versions that already exist in multiple different test systems. For example, if the test cases in three existing systems are written in C, C++, and Java respectively, then fetching the test cases and versions from the three systems and loading them into one database completes the creation of the test case database; when testers want test cases in a different language, they can search this database, which avoids having to re-develop a case in another language when a single test system supports only one. Another way to create the database is to form new test cases by combining newly entered code with the existing test cases: the language of the newly entered code is identified, test cases in the same language are looked up in the database according to that language, the cases are checked against the function of the newly entered code, and the result is added to the test case database. Of course, the newly entered code may also be made into a test case of its own and added to the database.
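  • As a rough sketch of how such a language-grouped database could be assembled, the Python below merges the cases of several systems into one in-memory store; the TestCase fields and the grouping key are illustrative assumptions, not structures taken from the patent.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    version: str
    language: str    # e.g. "C", "C++", "Java"
    functions: set   # functions the case covers
    script: str      # the case's script source

def build_case_database(systems):
    """Merge the cases of several test systems into one database,
    grouped by scripting language."""
    database = defaultdict(list)   # language -> [TestCase, ...]
    for system_cases in systems:
        for case in system_cases:
            database[case.language].append(case)
    return database
```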
  • Step S20: Filter the test case database according to the input information, and recommend test cases after traversing the test case database.
  • In step S20, when a tester is to perform a test task, he or she searches the test case database for the required test cases by entering information. The input may be query information entered by test function, where the query can be a single function or a selection or superposition of several functions; of course, the input may also be the name or version number of a test case.
  • In step S20, for example, when the entered function information is acquired, the database is searched automatically; after all test cases have been traversed, the cases with the same or similar functions are screened out and queued by their relevance to the input. When test cases in different languages all provide the function, the cases of every language having that function are displayed and can be sorted by language category, that is, the cases of the same language are arranged as one group, and the tester selects from the queue the cases to execute. This implementation can break the barrier that test cases written in different languages cannot be shared, and realizes the sharing of test cases across languages. The cases can also be sorted by their execution history; the specific sorting method can be formulated according to the user's needs.
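  • A minimal sketch of this traversal-and-queue step, assuming the TestCase layout above: every case in every language group is visited, and matches are queued first by how many of the requested functions they cover (a stand-in for the unspecified relevance measure) and then by language, so that cases of the same language sit together.

```python
def recommend_cases(database, wanted_functions):
    """Traverse the whole database and queue matching cases:
    most relevant first, then grouped by language category."""
    matches = []
    for language, cases in database.items():
        for case in cases:
            overlap = len(case.functions & set(wanted_functions))
            if overlap:
                matches.append((overlap, language, case))
    matches.sort(key=lambda m: (-m[0], m[1]))
    return [case for _, _, case in matches]
```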
  • Step S30: Obtain a target test case from the recommended test cases according to the user's selection instruction, execute the target test case, and track the target test case.
  • In step S30, tracking the target test case refers to tracking the vulnerabilities of the system under test discovered while the test runs; the vulnerabilities can be marked, and the tracking result can be displayed in a separate report or in the test report.
  • Step S40: Automatically generate a test report after the target test case has been executed.
  • In step S40, the test report can display the test results of each element of the test process. For example, the performer of the automated test is the tester, and the test process is mainly associated with test cases and test projects, so the automatically generated reports are produced mainly around these three elements; for example, the tester's test quality and number of tests serve as one basis for the tester's performance appraisal.
  • As an implementation, automatically generating a test report after the target test case has been executed includes: automatically generating a project quality test report, a project quality change report, and a self-test quality ranking report after the test case has been executed. The project quality test report includes a data analysis part, a defect list part, and a risk assessment and summary part; the project quality change report displays the quality change of each project on a timeline; and the self-test quality ranking report displays, as a point ranking, how well the project test members have tested. In this step, the reports generated from the test results make it convenient to monitor project quality and its change over time, and provide a basis for appraising the testers.
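  • The sketch below shows one way the three reports could be derived from raw results; the result-record layout and the point formula are assumptions made for illustration, since the patent does not fix them.

```python
def generate_reports(results):
    """Build the project quality report, the per-day quality change
    timeline, and the tester point ranking from result records of
    the form {"project", "tester", "defects", "day"} (assumed)."""
    quality, timeline, ranking = {}, {}, {}
    for r in results:
        q = quality.setdefault(r["project"], {"runs": 0, "defects": 0})
        q["runs"] += 1
        q["defects"] += len(r["defects"])
        day = timeline.setdefault(r["project"], {})
        day[r["day"]] = day.get(r["day"], 0) + len(r["defects"])
        # assumed scoring: one point per run, two per defect found
        ranking[r["tester"]] = ranking.get(r["tester"], 0) + 1 + 2 * len(r["defects"])
    return quality, timeline, sorted(ranking.items(), key=lambda kv: -kv[1])
```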
  • The embodiment of the present application provides an automated test management method: a test case database is created from newly entered code and test cases stored in multiple test systems; the database is filtered according to input information and traversed to find the target test cases; the test cases are tracked during execution; and test reports are automatically generated after the test cases have been executed. This realizes resource sharing across the test systems and facilitates the monitoring and management of test projects, testers, and the defects of the systems under test.
  • For step S10 in the above embodiment, as an implementation and as shown in FIG. 2, creating the test case database based on newly entered code and the test cases stored in multiple test systems includes:
  • Step S101: Import the existing test cases and their scripts into the test case database, identify the languages of the imported test cases and scripts, and form different groups in the database according to the language categories of the test cases and their scripts.
  • In step S101, importing the test cases of different systems into the test case database concentrates the test cases; within one system, grouping the test cases by language makes it possible for the test cases of the same group to call parts of each other's code, while the test cases of different groups are called as a whole.
  • Step S102: Retrieve a test case including a specific function according to a retrieval instruction, call the part of the code in the test case corresponding to the specific function, merge it with the newly entered code to form a new test case, and store the new case in the group of the same language as the newly entered code.
  • In step S102, when a user writing a test case has entered part of the code and then needs to write the code of some function, a retrieval instruction can be used to search the database for a test case that includes that function; the part of the case's code corresponding to the function is called and merged with the newly entered code to form a new test case, which is stored in the corresponding group. The newly created test case has both the function of the original test case and the added function.
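  • A hedged sketch of this merge step, reusing the TestCase sketch above; extract_function is a hypothetical helper that would cut the named function's source out of a script, and the merged-case naming is likewise invented for illustration.

```python
def merge_with_existing(database, new_code, language, wanted_function):
    """Find a same-language case covering wanted_function and splice
    the relevant part of its script onto the newly entered code."""
    for case in database.get(language, []):
        if wanted_function in case.functions:
            borrowed = extract_function(case.script, wanted_function)  # hypothetical helper
            merged = TestCase(
                name=case.name + "+new",
                version="1.0",
                language=language,
                functions=case.functions | {"entered_code"},
                script=new_code + "\n" + borrowed,
            )
            database[language].append(merged)
            return merged
    return None
```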
  • The test case database creation method provided by this implementation builds the database in two ways: by importing existing test cases and their versions, and by merging in new code to create new test cases. When the database is created, the test cases are divided into groups by identifying their languages; the code of the test cases within a group can be called, while the test cases of different groups are called as a whole. Testers can traverse the test case database with filter conditions, find the target test cases, and realize resource sharing.
  • For steps S20 and S30 in the above embodiment, as an implementation and as shown in FIG. 3, filtering the test case database according to the input information, recommending test cases after traversing the database, obtaining the target test cases according to the user's selection instruction, and tracking the target test cases while they execute includes:
  • Step S201: Search the test case database according to the entered function information, sort the test cases by their run counts, form the sorted test cases into recommended test cases, and obtain the target test case according to the user's selection instruction.
  • In step S201, where the test cases of different groups in the database have the same function, the test cases are ranked by run count, and the ranking is used to queue them during screening. Based on this recommendation mechanism, the cases are ranked by their historical run counts, where the history is that of the logged-in user or that of the entire system. A drop-down box is displayed during screening, and the recommended cases with the same function are queued from top to bottom by rank, which is convenient for testers to choose from.
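  • A minimal sketch of the run-count ranking, assuming run counts are kept in a simple name-to-count mapping; whether the per-user or the system-wide history is consulted is a switch, as the embodiment allows either.

```python
def rank_by_run_count(cases, user_history, system_history, per_user=True):
    """Queue same-function cases by historical run count; the
    history may be the logged-in user's or the whole system's."""
    history = user_history if per_user else system_history
    return sorted(cases, key=lambda c: history.get(c.name, 0), reverse=True)
```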
  • Step S202: Dock the project to be tested; when the project to be tested is to execute more than one test case, execute the test cases iteratively.
  • In step S202, the project that needs testing is docked. When a project is to execute two or more test cases, for example when the tester has selected several test cases at once, the test cases are executed iteratively in the order selected. When executing the tests, a loop can be designed; the duration of each cycle can be recorded, and the number of cycles is also adjustable.
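  • A small sketch of such a loop, assuming each selected case exposes a run() method (an assumption made for this illustration): the cases execute in selection order, each cycle's duration is recorded, and the cycle count is a parameter.

```python
import time

def run_iteratively(target_cases, cycles=1):
    """Execute the selected cases in order for an adjustable number
    of cycles, recording how long each cycle takes."""
    cycle_times = []
    for _ in range(cycles):
        start = time.monotonic()
        for case in target_cases:
            case.run()
        cycle_times.append(time.monotonic() - start)
    return cycle_times
```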
  • Step S203: Track the target test cases during the test, and through the tracking determine the vulnerabilities of the system under test and optimize the target test cases.
  • In step S203, the test cases are tracked during the test; the tracking includes tracking the defects discovered during testing and determining, through the tracking, which test cases found the project defects.
  • A vulnerability may be discovered by passive vulnerability mining or by active vulnerability mining. Passive vulnerability mining means that when a test case triggers a software vulnerability, the vulnerability is thereby dug out. Active vulnerability mining means that the test case itself does not trigger a vulnerability, but during execution a constraint is produced by some method, and solving the constraint generates a new test case that can trigger the vulnerability. Passive mining is suited to detecting vulnerabilities whose paths do not change, while vulnerabilities on changing paths require the active detection approach.
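  • The toy sketch below contrasts the two mining styles on a contrived fault; program_under_test is invented for the illustration, and the brute-force loop in active_mining stands in for a real constraint solver, which the patent does not name.

```python
def program_under_test(x):
    if x * 37 % 101 == 13:            # a path guarded by a constraint
        raise RuntimeError("vulnerability triggered")

def _triggers(x):
    try:
        program_under_test(x)
        return False
    except RuntimeError:
        return True

def passive_mining(case_inputs):
    """Passive: a case that happens to trigger the fault exposes it."""
    return [x for x in case_inputs if _triggers(x)]

def active_mining():
    """Active: solve the collected branch constraint to synthesise a
    new test case that reaches the vulnerable path."""
    for x in range(10_000):           # toy stand-in for a solver
        if x * 37 % 101 == 13:
            return x                  # new, triggering test case
    return None
```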
  • By designing the running mechanism and the tracking mechanism used while the tests execute, this implementation makes it convenient for testers to select test cases and to supervise the test process.
  • As an implementation of step S203 in the above embodiment, as shown in FIG. 4, after tracking the target test case during the test and determining the vulnerabilities of the system under test through the tracking, the method further includes:
  • Step S2031: Manage the vulnerabilities and automatically synchronize them to a vulnerability list.
  • In step S2031, the discovered vulnerabilities are managed: the execution results of the test cases produce the vulnerabilities, and the vulnerabilities are automatically synchronized to the vulnerability list. The whole system keeps one vulnerability list, and every vulnerability a tester discovers while testing with the system is automatically synchronized into the list, achieving unified management of the vulnerabilities.
  • To ensure that the vulnerability results produced by testing can be sent to the vulnerability management platform in time, a connection between the test device and the vulnerability management platform needs to be established so that the results can be sent quickly. Each tester can use the vulnerability management platform link contained in the test requirements to establish a connection between his or her test device and the platform; one way to establish the connection is for the test device to open the vulnerability management platform link in a browser and thereby connect to the platform.
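  • A bare-bones sketch of this synchronization step; the endpoint URL and JSON payload shape are invented for illustration, and a production system would add authentication and retries.

```python
import json
from urllib import request

PLATFORM_URL = "https://vuln-platform.example/api/vulns"  # hypothetical endpoint

def sync_vulnerability(vuln):
    """Push a newly found vulnerability to the shared vulnerability
    list kept on the management platform."""
    body = json.dumps(vuln).encode("utf-8")
    req = request.Request(PLATFORM_URL, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return resp.status == 200
```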
  • Step S2032: Obtain the vulnerabilities in the vulnerability list, generate a system vulnerability report, and provide corresponding improvement measures according to the system vulnerability report.
  • Step S2033: Adjust the system under test according to the code entered for the improvement measures and retest it until the vulnerabilities are repaired.
  • Taking taint propagation analysis of a located security-threat function in software testing as an example: the flow of the tainted data is tracked, and taint propagation theory is used to analyse and judge whether taint propagation occurs at the threat function. If it does, the source code may contain the security threat of a buffer overflow vulnerability, and the vulnerability needs to be improved.
  • Specifically, for a program instruction i, a variable x, and a taint set T in the source code, let dsts_i be the set of destination operands of instruction i, and let src[x]_i be the set of source operands of instruction i whose destination operand is x. If x ∈ dsts_i ∧ {y ∈ src[x]_i | y ∈ T} ≠ ∅, that is, instruction i has a non-empty set of source operands at least one of which is tainted, the destination operand x is added to the taint set T; this adding operation is called taint propagation. If x ∈ dsts_i ∧ {y ∈ src[x]_i | y ∈ T} = ∅, the destination operand x is removed from the taint set T. The vulnerability is improved and the test is then run again.
  • In this step, the corresponding improvement measures are applied to the software; once the software has been modified, it is retested and the test results are fed back to the vulnerability management platform until the relevant system defects are successfully repaired.
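  • The add/remove rule above is mechanical enough to sketch directly; the encoding of instructions as (destination, sources) pairs is an assumption made to keep the example self-contained.

```python
def propagate_taint(instructions, initial_taint):
    """Apply the taint rule to a straight-line instruction list,
    where each instruction is (dst, [srcs])."""
    T = set(initial_taint)
    for dst, srcs in instructions:
        if any(y in T for y in srcs):   # {y in src[x]_i | y in T} != empty
            T.add(dst)                  # taint propagation
        else:
            T.discard(dst)              # destination operand leaves T
    return T

# user input taints 'a'; 'b = a' propagates, 'c = 1' stays clean
assert propagate_taint([("b", ["a"]), ("c", [])], {"a"}) == {"a", "b"}
```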
  • As another implementation of step S203 in the above embodiment, tracking the target test cases during the test and optimizing the target test cases through the tracking includes:
  • eliminating duplicate and/or redundant test cases based on the events and event paths covered by the test cases.
  • Duplicate test cases are test cases that pass through exactly the same controls in exactly the same order; from the point of view of coverage the two cases are plainly duplicates, and either one can be eliminated. A redundant test case is one of two test cases that stand in an inclusion relation, that is, one test case contains the other.
  • To obtain the smallest possible set of test cases, this application optimizes the test cases with a greedy algorithm based on directed paths. The biggest difference between this algorithm and an ordinary greedy algorithm is the introduction of directed graphs: this algorithm works on directed graphs, whereas an ordinary greedy algorithm works on isolated points or edges.
  • The algorithm mainly includes the following steps (one possible rendering is sketched after this list):
  • (1) establish the correspondence between test cases and paths from the recorded path that each test case passes through;
  • (2) select the test case corresponding to the longest path and delete the test cases that are not needed;
  • (3) according to the inclusion relations between paths, eliminate the test cases corresponding to the sub-paths of the longest path;
  • (4) select the test case corresponding to the next-longest path and repeat (3) until the test cases are fully optimized.
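  • One possible rendering of the four steps, assuming each case's recorded path is the ordered list of controls it traverses; treating exact duplicates as mutual sub-paths is a simplification made for this sketch.

```python
def reduce_cases(case_paths):
    """Greedy reduction over directed paths: keep longest-path cases,
    drop cases whose path is a contiguous sub-path of a kept one."""
    def is_subpath(short, long):
        n = len(short)
        return any(long[i:i + n] == short for i in range(len(long) - n + 1))

    remaining = dict(case_paths)        # case name -> path (list of nodes)
    kept = []
    while remaining:
        # steps (1)-(2): pick the case with the longest remaining path
        name = max(remaining, key=lambda k: len(remaining[k]))
        path = remaining.pop(name)
        kept.append(name)
        # step (3): eliminate duplicates and contained sub-paths
        remaining = {k: p for k, p in remaining.items()
                     if not is_subpath(p, path)}
        # step (4): the loop repeats with the next-longest path
    return kept
```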
  • Before step S20 in the above embodiment, as an implementation, the method includes:
  • setting user permissions: the permissions for assigning the projects to be tested, for importing and developing test cases, for deleting test cases, and for executing test cases are allocated to different roles, and a user account acquires different permissions through the roles it is given.
  • In this embodiment, the import, add, write, delete, and execute permissions of the test cases are configured to different roles and then assigned to accounts, making rights and responsibilities clear and tracking by the testers convenient.
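  • A toy sketch of such a role-to-permission mapping; the role names and permission strings are illustrative only.

```python
ROLES = {
    "lead":   {"assign_project", "import", "develop", "delete", "execute"},
    "tester": {"import", "develop", "execute"},
    "viewer": set(),
}

def can(account_roles, permission):
    """True if any role assigned to the account grants the permission."""
    return any(permission in ROLES[role] for role in account_roles)

assert can(["tester"], "execute") and not can(["viewer"], "delete")
```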
  • Another embodiment of the present application provides a fund system automated test management device 40. As shown in FIG. 5, the fund system automated test management device 40 includes:
  • a database creation module 401 that creates a test case database based on newly entered code and the test cases stored in multiple test systems;
  • a test case recommendation module 402 configured to filter the test case database according to input information and recommend test cases after traversing the database;
  • a test case execution module 403 that obtains the target test case from the recommended test cases according to the user's selection instruction and tracks the target test case while executing it;
  • a test report generation module 404 configured to automatically generate a test report after the target test case has been executed.
  • Further, as shown in FIG. 6, as an implementation the database creation module 401 includes:
  • a grouping unit 410 configured to import the existing test cases and their scripts into the test case database, identify the languages of the imported test cases and scripts, and form different groups in the database according to the language categories of the test cases and their scripts;
  • a calling unit 420 configured to retrieve a test case including a specific function according to a retrieval instruction, call the part of the code in the test case corresponding to the specific function, merge it with the newly entered code to form a new test case, and store the new case in the group of the same language.
  • Further, as an implementation, the test case recommendation module 402 is also configured to search the test case database according to the entered function information, sort the test cases by their run counts, form the sorted cases into recommended test cases, and obtain the target test case according to the user's selection instruction.
  • The test case execution module 403 is also configured to dock the project to be tested; when the project to be tested is to execute more than one target test case, execute the target test cases iteratively, track them during the test, and through the tracking determine the vulnerabilities of the system under test and optimize the target test cases.
  • Another embodiment of the present application provides a computer readable storage medium on which computer readable instructions are stored; when executed by a processor, the computer readable instructions implement the fund system automated test management method of the above embodiments, which is not repeated here to avoid repetition. Alternatively, when executed by the processor, the computer readable instructions implement the functions of the modules/units of the fund system automated test management device of the above embodiments, which likewise are not repeated here.
  • FIG. 7 is a schematic diagram of the terminal device in this embodiment. As shown in FIG. 7, the terminal device 6 includes a processor 60, a memory 61, and computer readable instructions 62 stored in the memory 61 and executable on the processor 60. When executing the computer readable instructions 62, the processor 60 implements the steps of the fund system automated test management method of the above embodiments, such as steps S10, S20, and S30 shown in FIG. 1; alternatively, when executing the computer readable instructions 62, the processor 60 implements the functions of the modules/units of the fund system automated test management device of the above embodiments.
  • Exemplarily, the computer readable instructions 62 may be divided into one or more modules/units, which are stored in the memory 61 and executed by the processor 60 to complete the present application. The one or more modules/units may be a series of computer readable instruction segments capable of performing particular functions, the segments being used to describe the execution of the computer readable instructions 62 in the terminal device 6. For example, the computer readable instructions 62 may be divided into the database creation module 401, the test case recommendation module 402, the test case execution module 403, and the test report generation module 404 shown in FIG. 5.
  • The terminal device 6 may be a computing device such as a desktop computer, a notebook, a palmtop computer, or a cloud server. The terminal device may include, but is not limited to, the processor 60 and the memory 61. Those skilled in the art will understand that FIG. 7 is only an example of the terminal device 6 and does not constitute a limitation on it; the terminal device may include more or fewer components than illustrated, combine some components, or use different components; for example, it may also include input/output devices, network access devices, buses, and the like.
  • The so-called processor 60 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • The memory 61 may be an internal storage unit of the terminal device 6, such as a hard disk or memory of the terminal device 6. The memory 61 may also be an external storage device of the terminal device 6, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the terminal device 6. Further, the memory 61 may include both an internal storage unit of the terminal device 6 and an external storage device. The memory 61 is used to store the computer readable instructions and the other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is about to be output.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer readable storage medium. Based on this understanding, the present application may implement all or part of the processes of the above method embodiments by instructing the relevant hardware through computer readable instructions, which may be stored in a computer readable storage medium; when executed by a processor, the computer readable instructions can implement the steps of the various method embodiments described above. The computer readable instructions comprise computer readable instruction code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include any entity or device capable of carrying the computer readable instruction code, a recording medium, a USB flash drive, a removable hard drive, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer readable medium may be added to or removed from as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer readable media do not include electrical carrier signals and telecommunications signals.

Abstract

A fund system automated test management method, apparatus, device, and storage medium. The automated test management method includes: creating a test case database based on the test cases stored in multiple test systems (S10); filtering the test case database according to input information and recommending test cases after traversing the database (S20); obtaining a target test case according to a user's selection instruction and tracking the target test case while executing it (S30); and automatically generating a test report after the target test case has been executed (S40). By integrating test cases in different languages into one system, testers can traverse the test case database with filter conditions, find the target test case, and share resources; in addition, the automatically generated reports enable the monitoring and appraisal of projects and personnel.

Description

Fund system automated test management method, apparatus, device and storage medium
This patent application is based on, and claims the priority of, Chinese invention patent application No. 201711045922.4, filed on October 31, 2017 and entitled "Fund system automated test management method, apparatus, device and storage medium".
Technical Field
The present application relates to the field of automated testing technologies, and in particular to a fund system automated test management method, apparatus, device, and storage medium.
Background
In the fund industry, a fund company runs many different systems to conduct its business, and because of the workload involved, these systems are tested by different testers. Each tester has automated his or her own system in his or her own way; these automated systems use different languages and different implementation approaches, need specific people to perform specific setup and environment construction before every run, and depend on the testers of the individual systems. In addition, the test code is managed separately by each tester. The test code of different systems is bound to contain some similar code, but without unified management this code cannot be shared, which wastes the manpower spent developing it and reduces the maintainability of the test code.
Summary
The purpose of the present application is to provide a fund system automated test management method, apparatus, device, and storage medium that realize resource sharing by integrating the test cases of different systems into one system.
The present application is implemented as follows. A first aspect of the present application provides a fund system automated test management method, the fund system automated test management method including:
creating a test case database based on newly entered code and test cases stored in multiple test systems;
filtering the test case database according to input information, and recommending test cases after traversing the test case database;
obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case;
automatically generating a test report after execution of the target test case is completed.
A second aspect of the present application provides a fund system automated test management device, the fund system automated test management device including:
a database creation module that creates a test case database based on newly entered code and test cases stored in multiple test systems;
a test case recommendation module configured to filter the test case database according to input information and recommend test cases after traversing the test case database;
a test case execution module configured to obtain a target test case from the recommended test cases according to a user's selection instruction, execute the target test case, and track the target test case;
a test report generation module configured to automatically generate a test report after the target test case has been executed.
A third aspect of the present application provides a terminal device including a memory, a processor, and computer readable instructions stored in the memory and executable on the processor, the processor implementing the following steps when executing the computer readable instructions:
creating a test case database based on newly entered code and test cases stored in multiple test systems;
filtering the test case database according to input information, and recommending test cases after traversing the test case database;
obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case;
automatically generating a test report after execution of the target test case is completed.
A fourth aspect of the present application provides a computer readable storage medium storing computer readable instructions that, when executed by a processor, implement the following steps:
creating a test case database based on newly entered code and test cases stored in multiple test systems;
filtering the test case database according to input information, and recommending test cases after traversing the test case database;
obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case;
automatically generating a test report after execution of the target test case is completed.
The present application provides a fund system automated test management method, apparatus, device, and storage medium: a test case database is created from newly entered code and the test cases stored in multiple test systems; the database is filtered according to input information and traversed to find the target test cases; the test cases are tracked during execution; and a test report is automatically generated after the test cases have been executed. This realizes resource sharing across the test systems and facilitates the monitoring and management of test projects, testers, and the defects of the test systems.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart of a fund system automated test management method provided by an embodiment of the present application;
FIG. 2 is a flowchart of an implementation of step S10 in the fund system automated test management method provided by an embodiment of the present application;
FIG. 3 is a flowchart of step S20 in the fund system automated test management method provided by an embodiment of the present application;
FIG. 4 is a flowchart of another implementation of step S203 in the fund system automated test management method provided by an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a fund system automated test management apparatus provided by another embodiment of the present application;
FIG. 6 is a schematic structural diagram of an implementation of the database creation module in the fund system automated test management apparatus provided by another embodiment of the present application;
FIG. 7 is a schematic structural diagram of a terminal device provided by another embodiment of the present application.
Detailed Description
To make the purpose, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present application, not to limit it.
To illustrate the technical solutions of the present application, specific embodiments are described below.
An embodiment of the present application provides a fund system automated test management method. As shown in FIG. 1, the fund system automated test management method includes:
Step S10. Create a test case database based on newly entered code and test cases stored in multiple test systems.
In step S10, one way to create the test case database is to build it from the test cases and versions that already exist in multiple different test systems. For example, if the test cases in three existing systems are written in C, C++, and Java respectively, then fetching the test cases and versions from the three systems and loading them into one database completes the creation of the test case database; when testers want to select test cases in a different language, they can search this database, which avoids having to re-develop a case in another language when a single test system supports only one. Another way to create the database is to form new test cases by combining newly entered code with the existing test cases: the language of the newly entered code is identified, test cases in the same language are looked up in the database according to that language, the cases are checked against the function of the newly entered code, and the result is added to the test case database. Of course, the newly entered code may also be made into a test case of its own and added to the database.
Step S20. Filter the test case database according to input information, and recommend test cases after traversing the test case database.
In step S20, when a tester is to perform a test task, he or she searches the test case database for the required test cases by entering information. The input may be query information entered by test function, where the query can be a single function or a selection or superposition of several functions; of course, the input may also be the name or version number of a test case.
In step S20, for example, when the entered function information is acquired, the database is searched automatically; after all test cases have been traversed, the cases with the same or similar functions are screened out and queued by their relevance to the input. When test cases in different languages all provide the function, the cases of every language having that function are displayed and can be sorted by language category, that is, the cases of the same language are arranged as one group, and the tester selects from the queue the cases to execute. This implementation can break the barrier that test cases written in different languages cannot be shared, and realizes the sharing of test cases across languages; the cases can also be sorted by their execution history, and the specific sorting method can be formulated according to the user's needs.
Step S30. Obtain a target test case from the recommended test cases according to the user's selection instruction, execute the target test case, and track the target test case.
In step S30, tracking the target test case refers to tracking the vulnerabilities of the system under test discovered while the test runs; the vulnerabilities can be marked, and the tracking result can be displayed in a separate report or in the test report.
Step S40. Automatically generate a test report after the target test case has been executed.
In step S40, the test report can display the test results of each element of the test process. For example, the performer of the automated test is the tester, and the test process is mainly associated with test cases and test projects, so the automatically generated reports are produced mainly around these three elements; for example, the tester's test quality and number of tests serve as one basis for the tester's performance appraisal.
As an implementation, automatically generating a test report after the target test case has been executed includes: automatically generating a project quality test report, a project quality change report, and a self-test quality ranking report after the test case has been executed, where the project quality test report includes a data analysis part, a defect list part, and a risk assessment and summary part; the project quality change report displays the quality change of each project on a timeline; and the self-test quality ranking report displays, as a point ranking, how well the project test members have tested. In this step, the reports generated from the test results make it convenient to monitor project quality and its change over time, and provide a basis for appraising the testers.
The embodiment of the present application provides an automated test management method: a test case database is created from newly entered code and test cases stored in multiple test systems; the database is filtered according to input information and traversed to find the target test cases; the test cases are tracked during execution; and test reports are automatically generated after the test cases have been executed. This realizes resource sharing across the test systems and facilitates the monitoring and management of test projects, testers, and the defects of the systems under test.
For step S10 in the above embodiment, as an implementation and as shown in FIG. 2, creating the test case database based on newly entered code and the test cases stored in multiple test systems includes:
Step S101. Import the existing test cases and their scripts into the test case database, identify the languages of the imported test cases and scripts, and form different groups in the database according to the language categories of the test cases and their scripts.
In step S101, importing the test cases of different systems into the test case database concentrates the test cases; within one system, grouping the test cases by language makes it possible for the test cases of the same group to call parts of each other's code, while the test cases of different groups are called as a whole.
Step S102. Retrieve a test case including a specific function according to a retrieval instruction, call the part of the code in the test case corresponding to the specific function, merge it with the newly entered code to form a new test case, and store the new case in the group of the same language as the newly entered code.
In step S102, when a user writing a test case has entered part of the code and then needs to write the code of some function, a retrieval instruction can be used to search the database for a test case that includes that function; the part of the case's code corresponding to the function is called and merged with the newly entered code to form a new test case, which is stored in the corresponding group. The newly created test case has both the function of the original test case and the added function.
The test case database creation method provided by this implementation builds the database in two ways: by importing existing test cases and their versions, and by merging in new code to create new test cases. When the database is created, the test cases are divided into groups by identifying their languages; the code of the test cases within a group can be called, while the test cases of different groups are called as a whole. Testers can traverse the test case database with filter conditions, find the target test cases, and realize resource sharing.
For steps S20 and S30 in the above embodiment, as an implementation and as shown in FIG. 3, filtering the test case database according to the input information, recommending test cases after traversing the database, obtaining the target test cases according to the user's selection instruction, and tracking the target test cases while they execute includes:
Step S201. Search the test case database according to the entered function information, sort the test cases by their run counts, form the sorted test cases into recommended test cases, and obtain the target test case according to the user's selection instruction.
In step S201, where the test cases of different groups in the database have the same function, the test cases are ranked by run count, and the ranking is used to queue them during screening. Based on this recommendation mechanism, the cases are ranked by their historical run counts, where the history is that of the logged-in user or that of the entire system. A drop-down box is displayed during screening, and the recommended cases with the same function are queued from top to bottom by rank, which is convenient for testers to choose from.
Step S202. Dock the project to be tested, and when the project to be tested is to execute more than one test case, execute the test cases iteratively.
In step S202, the project that needs testing is docked. When a project is to execute two or more test cases, for example when the tester has selected several test cases at once, the test cases are executed iteratively in the order selected. When executing the tests, a loop can be designed; the duration of each cycle can be recorded, and the number of cycles is adjustable.
Step S203. Track the target test cases during the test, and through the tracking determine the vulnerabilities of the system under test and optimize the target test cases.
In step S203, the test cases are tracked during the test; the tracking includes tracking the defects discovered during testing and determining through the tracking which test cases found the project defects. A vulnerability may be discovered by passive vulnerability mining or by active vulnerability mining. Passive vulnerability mining means that when a test case triggers a software vulnerability, the vulnerability is thereby dug out; active vulnerability mining means that the test case itself does not trigger a vulnerability, but during execution a constraint is produced by some method, and solving the constraint generates a new test case that can trigger the vulnerability. Passive mining is suited to detecting vulnerabilities whose paths do not change, while vulnerabilities on changing paths require the active detection approach.
By designing the running mechanism and the tracking mechanism used while the tests execute, this implementation makes it convenient for testers to select test cases and to supervise the test process.
As an implementation of step S203 in the above embodiment, as shown in FIG. 4, after tracking the target test case during the test and determining the vulnerabilities of the system under test through the tracking, the method further includes:
Step S2031. Manage the vulnerabilities and automatically synchronize them to a vulnerability list.
In step S2031, the discovered vulnerabilities are managed: the execution results of the test cases produce the vulnerabilities, and the vulnerabilities are automatically synchronized to the vulnerability list. The whole system keeps one vulnerability list, and every vulnerability a tester discovers while testing with the system is automatically synchronized into the list, achieving unified management of the vulnerabilities. To ensure that the vulnerability results produced by testing can be sent to the vulnerability management platform in time, a connection between the test device and the vulnerability management platform needs to be established so that the results can be sent quickly. Each tester can use the vulnerability management platform link contained in the test requirements to establish a connection between his or her test device and the platform; one way to establish the connection is for the test device to open the vulnerability management platform link in a browser and thereby connect to the platform.
Step S2032. Obtain the vulnerabilities in the vulnerability list, generate a system vulnerability report, and provide corresponding improvement measures according to the system vulnerability report.
Step S2033. Adjust the system under test according to the code entered for the improvement measures and retest it until the vulnerabilities are repaired.
Taking taint propagation analysis of a located security-threat function in software testing as an example, the flow of the tainted data is tracked, and taint propagation theory is used to analyse and judge whether taint propagation occurs at the threat function; if it does, the source code may contain the security threat of a buffer overflow vulnerability, and the vulnerability needs to be improved.
Specifically, for a program instruction i, a variable x, and a taint set T in the source code, let dsts_i be the set of destination operands of instruction i, and let src[x]_i be the set of source operands of instruction i whose destination operand is x. If x ∈ dsts_i ∧ {y ∈ src[x]_i | y ∈ T} ≠ ∅, that is, instruction i has a non-empty set of source operands at least one of which is tainted, the destination operand x is added to the taint set T; this adding operation is called taint propagation. If x ∈ dsts_i ∧ {y ∈ src[x]_i | y ∈ T} = ∅, the destination operand x is removed from the taint set T; the vulnerability is improved and the test is then run again.
In this step, the corresponding improvement measures are applied to the software; once the software has been modified, it is retested and the test results are fed back to the vulnerability management platform until the relevant system defects are successfully repaired.
As another implementation of step S203 in the above embodiment, tracking the target test cases during the test and optimizing the target test cases through the tracking includes:
eliminating duplicate and/or redundant test cases based on the events and event paths covered by the test cases.
Duplicate test cases are test cases that pass through exactly the same controls in exactly the same order; from the point of view of coverage, the two cases are plainly duplicates, and either one can be eliminated. A redundant test case is one of two test cases that stand in an inclusion relation, that is, one test case contains the other. To obtain the smallest set of test cases, this application optimizes the test cases with a greedy algorithm based on directed paths. The biggest difference between this algorithm and an ordinary greedy algorithm is the introduction of directed graphs: this algorithm works on directed graphs, whereas an ordinary greedy algorithm works on isolated points or edges. The algorithm mainly includes the following steps:
(1) establish the correspondence between test cases and paths from the recorded path that each test case passes through;
(2) select the test case corresponding to the longest path and delete the test cases that are not needed;
(3) according to the inclusion relations between paths, eliminate the test cases corresponding to the sub-paths of the longest path;
(4) select the test case corresponding to the next-longest path and repeat (3) until the test cases are fully optimized.
Before step S20 in the above embodiment, as an implementation, the method includes:
setting user permissions: the permissions for assigning the projects to be tested, for importing and developing test cases, for deleting test cases, and for executing test cases are allocated to different roles, and a user account acquires different permissions through the different roles it is given.
In this embodiment, the import, add, write, delete, and execute permissions of the test cases are configured to different roles and then assigned to accounts, making rights and responsibilities clear and tracking by the testers convenient.
Another embodiment of the present application provides a fund system automated test management device 40. As shown in FIG. 5, the fund system automated test management device 40 includes:
a database creation module 401 that creates a test case database based on newly entered code and the test cases stored in multiple test systems;
a test case recommendation module 402 configured to filter the test case database according to input information and recommend test cases after traversing the database;
a test case execution module 403 that obtains the target test case from the recommended test cases according to the user's selection instruction and tracks the target test case while executing it;
a test report generation module 404 configured to automatically generate a test report after the target test case has been executed.
Further, as shown in FIG. 6, as an implementation the database creation module 401 includes:
a grouping unit 410 configured to import the existing test cases and their scripts into the test case database, identify the languages of the imported test cases and scripts, and form different groups in the database according to the language categories of the test cases and their scripts;
a calling unit 420 configured to retrieve a test case including a specific function according to a retrieval instruction, call the part of the code in the test case corresponding to the specific function, merge it with the newly entered code to form a new test case, and store the new case in the group of the same language.
Further, as an implementation, the test case recommendation module 402 is also configured to search the test case database according to the entered function information, sort the test cases by their run counts, form the sorted cases into recommended test cases, and obtain the target test case according to the user's selection instruction.
The test case execution module 403 is also configured to dock the project to be tested; when the project to be tested is to execute more than one target test case, execute the target test cases iteratively, track them during the test, and through the tracking determine the vulnerabilities of the system under test and optimize the target test cases.
For the specific working process of the modules in the above terminal device, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
Another embodiment of the present application provides a computer readable storage medium on which computer readable instructions are stored; when executed by a processor, the computer readable instructions implement the fund system automated test management method of the above embodiments, which is not repeated here to avoid repetition. Alternatively, when executed by the processor, the computer readable instructions implement the functions of the modules/units of the fund system automated test management device of the above embodiments, which likewise are not repeated here.
FIG. 7 is a schematic diagram of the terminal device in this embodiment. As shown in FIG. 7, the terminal device 6 includes a processor 60, a memory 61, and computer readable instructions 62 stored in the memory 61 and executable on the processor 60. When executing the computer readable instructions 62, the processor 60 implements the steps of the fund system automated test management method of the above embodiments, such as steps S10, S20, and S30 shown in FIG. 1; alternatively, when executing the computer readable instructions 62, the processor 60 implements the functions of the modules/units of the fund system automated test management device of the above embodiments.
Exemplarily, the computer readable instructions 62 may be divided into one or more modules/units, which are stored in the memory 61 and executed by the processor 60 to complete the present application. The one or more modules/units may be a series of computer readable instruction segments capable of performing particular functions, the segments being used to describe the execution of the computer readable instructions 62 in the terminal device 6. For example, the computer readable instructions 62 may be divided into the database creation module 401, the test case recommendation module 402, the test case execution module 403, and the test report generation module 404 shown in FIG. 5.
The terminal device 6 may be a computing device such as a desktop computer, a notebook, a palmtop computer, or a cloud server. The terminal device may include, but is not limited to, the processor 60 and the memory 61. Those skilled in the art will understand that FIG. 7 is only an example of the terminal device 6 and does not constitute a limitation on it; the terminal device may include more or fewer components than illustrated, combine some components, or use different components; for example, it may also include input/output devices, network access devices, buses, and the like.
The so-called processor 60 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
The memory 61 may be an internal storage unit of the terminal device 6, such as a hard disk or memory of the terminal device 6. The memory 61 may also be an external storage device of the terminal device 6, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the terminal device 6. Further, the memory 61 may include both an internal storage unit of the terminal device 6 and an external storage device. The memory 61 is used to store the computer readable instructions and the other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is about to be output.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the division into the functional units and modules described above is only an example; in practical applications, the functions may be assigned to different functional units and modules as needed, that is, the internal structure of the device may be divided into different functional units or modules to complete all or part of the functions described above.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer readable storage medium. Based on this understanding, the present application may implement all or part of the processes of the above method embodiments by instructing the relevant hardware through computer readable instructions, which may be stored in a computer readable storage medium; when executed by a processor, the computer readable instructions can implement the steps of the various method embodiments described above. The computer readable instructions comprise computer readable instruction code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include any entity or device capable of carrying the computer readable instruction code, a recording medium, a USB flash drive, a removable hard drive, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer readable medium may be added to or removed from as appropriate according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer readable media do not include electrical carrier signals and telecommunications signals.
The embodiments described above are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments or make equivalent replacements of some of their technical features, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application; they shall all be included within the protection scope of the present application.

Claims (20)

  1. A fund system automated test management method, wherein the fund system automated test management method comprises:
    creating a test case database based on newly entered code and test cases stored in multiple test systems;
    filtering the test case database according to input information, and recommending test cases after traversing the test case database;
    obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case;
    automatically generating a test report after execution of the target test case is completed.
  2. The fund system automated test management method according to claim 1, wherein creating a test case database based on newly entered code and test cases stored in multiple test systems comprises:
    importing existing test cases and their scripts into the test case database, identifying the languages of the imported test cases and their scripts, and forming different groups in the test case database according to the language categories of the test cases and their scripts;
    retrieving a test case including a specific function according to a retrieval instruction, calling the part of the code in the test case corresponding to the specific function, merging it with the newly entered code to form a new test case, and storing the new test case in the group of the same language as the newly entered code.
  3. The fund system automated test management method according to claim 1, wherein filtering the test case database according to input information, recommending test cases after traversing the test case database, obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case comprises:
    searching the test case database according to the entered function information, sorting the test cases by their run counts, forming the sorted test cases into recommended test cases, and obtaining the target test case according to the user's selection instruction;
    docking the project to be tested, and, when the project to be tested is to execute more than one target test case, executing the target test cases iteratively;
    tracking the target test cases during the test, and through the tracking determining the vulnerabilities of the system under test and optimizing the target test cases.
  4. The fund system automated test management method according to claim 3, wherein after tracking the target test cases during the test and determining the vulnerabilities of the system under test through the tracking, the method further comprises:
    managing the vulnerabilities of the system under test, and automatically synchronizing the vulnerabilities of the system under test to a vulnerability list;
    obtaining the vulnerabilities in the vulnerability list, generating a system vulnerability report, and providing corresponding improvement measures according to the system vulnerability report;
    adjusting the system under test according to the code entered for the improvement measures and retesting it until the vulnerabilities are repaired.
  5. The fund system automated test management method according to claim 3, wherein tracking the target test cases during the test and optimizing the target test cases through the tracking comprises:
    eliminating duplicate and/or redundant test cases based on the events and event paths covered by the test cases.
  6. The fund system automated test management method according to claim 1, wherein the automated test management method further comprises:
    setting user account permissions, and allocating to different user accounts the permissions for assigning projects to be tested, importing test cases, developing test cases, deleting test cases, and executing test cases.
  7. A fund system automated test management device, wherein the fund system automated test management device comprises:
    a database creation module that creates a test case database based on newly entered code and test cases stored in multiple test systems;
    a test case recommendation module configured to filter the test case database according to input information and recommend test cases after traversing the test case database;
    a test case execution module configured to obtain a target test case from the recommended test cases according to a user's selection instruction, execute the target test case, and track the target test case;
    a test report generation module configured to automatically generate a test report after execution of the target test case is completed.
  8. The fund system automated test management device according to claim 7, wherein the database creation module comprises:
    a grouping unit configured to import existing test cases and their scripts into the test case database, identify the languages of the imported test cases and their scripts, and form different groups in the test case database according to the language categories of the test cases and their scripts;
    a calling unit configured to retrieve a test case including a specific function according to a retrieval instruction, call the part of the code in the test case corresponding to the specific function, merge it with the newly entered code to form a new test case, and store the new test case in the group of the same language.
  9. A terminal device comprising a memory, a processor, and computer readable instructions stored in the memory and executable on the processor, wherein the processor implements the following steps when executing the computer readable instructions:
    creating a test case database based on newly entered code and test cases stored in multiple test systems;
    filtering the test case database according to input information, and recommending test cases after traversing the test case database;
    obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case;
    automatically generating a test report after execution of the target test case is completed.
  10. The terminal device according to claim 9, wherein creating a test case database based on newly entered code and test cases stored in multiple test systems comprises:
    importing existing test cases and their scripts into the test case database, identifying the languages of the imported test cases and their scripts, and forming different groups in the test case database according to the language categories of the test cases and their scripts;
    retrieving a test case including a specific function according to a retrieval instruction, calling the part of the code in the test case corresponding to the specific function, merging it with the newly entered code to form a new test case, and storing the new test case in the group of the same language as the newly entered code.
  11. The terminal device according to claim 9, wherein filtering the test case database according to input information, recommending test cases after traversing the test case database, obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case comprises:
    searching the test case database according to the entered function information, sorting the test cases by their run counts, forming the sorted test cases into recommended test cases, and obtaining the target test case according to the user's selection instruction;
    docking the project to be tested, and, when the project to be tested is to execute more than one target test case, executing the target test cases iteratively;
    tracking the target test cases during the test, and through the tracking determining the vulnerabilities of the system under test and optimizing the target test cases.
  12. The terminal device according to claim 11, wherein after the target test cases are tracked during the test and the vulnerabilities of the system under test are determined through the tracking, the processor further implements the following steps when executing the computer readable instructions:
    managing the vulnerabilities of the system under test, and automatically synchronizing the vulnerabilities of the system under test to a vulnerability list;
    obtaining the vulnerabilities in the vulnerability list, generating a system vulnerability report, and providing corresponding improvement measures according to the system vulnerability report;
    adjusting the system under test according to the code entered for the improvement measures and retesting it until the vulnerabilities are repaired.
  13. The terminal device according to claim 11, wherein tracking the target test cases during the test and optimizing the target test cases through the tracking comprises:
    eliminating duplicate and/or redundant test cases based on the events and event paths covered by the test cases.
  14. The terminal device according to claim 9, wherein the processor further implements the following steps when executing the computer readable instructions:
    setting user account permissions, and allocating to different user accounts the permissions for assigning projects to be tested, importing test cases, developing test cases, deleting test cases, and executing test cases.
  15. A computer readable storage medium storing computer readable instructions, wherein the computer readable instructions, when executed by a processor, implement the following steps:
    creating a test case database based on newly entered code and test cases stored in multiple test systems;
    filtering the test case database according to input information, and recommending test cases after traversing the test case database;
    obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case;
    automatically generating a test report after execution of the target test case is completed.
  16. The computer readable storage medium according to claim 15, wherein creating a test case database based on newly entered code and test cases stored in multiple test systems comprises:
    importing existing test cases and their scripts into the test case database, identifying the languages of the imported test cases and their scripts, and forming different groups in the test case database according to the language categories of the test cases and their scripts;
    retrieving a test case including a specific function according to a retrieval instruction, calling the part of the code in the test case corresponding to the specific function, merging it with the newly entered code to form a new test case, and storing the new test case in the group of the same language as the newly entered code.
  17. The computer readable storage medium according to claim 15, wherein filtering the test case database according to input information, recommending test cases after traversing the test case database, obtaining a target test case from the recommended test cases according to a user's selection instruction, executing the target test case, and tracking the target test case comprises:
    searching the test case database according to the entered function information, sorting the test cases by their run counts, forming the sorted test cases into recommended test cases, and obtaining the target test case according to the user's selection instruction;
    docking the project to be tested, and, when the project to be tested is to execute more than one target test case, executing the target test cases iteratively;
    tracking the target test cases during the test, and through the tracking determining the vulnerabilities of the system under test and optimizing the target test cases.
  18. The computer readable storage medium according to claim 17, wherein after the target test cases are tracked during the test and the vulnerabilities of the system under test are determined through the tracking, the computer readable instructions further implement the following steps when executed by the processor:
    managing the vulnerabilities of the system under test, and automatically synchronizing the vulnerabilities of the system under test to a vulnerability list;
    obtaining the vulnerabilities in the vulnerability list, generating a system vulnerability report, and providing corresponding improvement measures according to the system vulnerability report;
    adjusting the system under test according to the code entered for the improvement measures and retesting it until the vulnerabilities are repaired.
  19. The computer readable storage medium according to claim 17, wherein tracking the target test cases during the test and optimizing the target test cases through the tracking comprises:
    eliminating duplicate and/or redundant test cases based on the events and event paths covered by the test cases.
  20. The computer readable storage medium according to claim 15, wherein the computer readable instructions further implement the following steps when executed by the processor:
    setting user account permissions, and allocating to different user accounts the permissions for assigning projects to be tested, importing test cases, developing test cases, deleting test cases, and executing test cases.
PCT/CN2017/112221 2017-10-31 2017-11-22 Fund system automated test management method, apparatus, device and storage medium WO2019085061A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711045922.4A CN107885660B (zh) 2017-10-31 2017-10-31 Fund system automated test management method, apparatus, device and storage medium
CN201711045922.4 2017-10-31

Publications (1)

Publication Number Publication Date
WO2019085061A1 true WO2019085061A1 (zh) 2019-05-09

Family

ID=61783153

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/112221 WO2019085061A1 (zh) 2017-10-31 2017-11-22 Fund system automated test management method, apparatus, device and storage medium

Country Status (2)

Country Link
CN (1) CN107885660B (zh)
WO (1) WO2019085061A1 (zh)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110750435A (zh) * 2018-07-23 2020-02-04 北京奇虎科技有限公司 Test case management method and device
CN109324960A (zh) * 2018-08-13 2019-02-12 中国平安人寿保险股份有限公司 Automated testing method based on big data analysis, and terminal device
CN109614312A (zh) * 2018-10-23 2019-04-12 中国平安人寿保险股份有限公司 Test case generation method and device, electronic device and storage medium
CN109815128A (zh) * 2018-12-21 2019-05-28 北京奇艺世纪科技有限公司 Test case execution method and device, terminal, and computer readable storage medium
CN110209702B (zh) * 2019-06-06 2021-05-04 齐鲁工业大学 Switched reluctance motor power topology recommendation method, system, terminal and storage medium
CN110851308A (zh) * 2019-10-21 2020-02-28 香港乐蜜有限公司 Testing method and device, electronic device and storage medium
CN111143226B (zh) * 2019-12-31 2023-06-27 医渡云(北京)技术有限公司 Automated testing method and device, computer readable storage medium, and electronic device
CN111522734B (zh) * 2020-03-17 2023-02-28 上海云砺信息科技有限公司 Software function testing method and device, electronic device and storage medium
CN111723007B (zh) * 2020-05-29 2022-06-14 苏州浪潮智能科技有限公司 Test case merging method, system, device and medium
CN111813691B (zh) * 2020-07-23 2024-03-01 中国工商银行股份有限公司 Test problem troubleshooting method and device, electronic device and medium
CN112306880A (zh) * 2020-11-02 2021-02-02 百度在线网络技术(北京)有限公司 Testing method and device, electronic device and computer readable storage medium
CN113704103B (zh) * 2021-08-24 2023-08-04 网易(杭州)网络有限公司 Test case recommendation method and device, medium and electronic device
CN117573566B (zh) * 2024-01-16 2024-04-12 麒麟软件有限公司 Multi-system test case generation method and device, and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103870384A (zh) * 2012-12-11 2014-06-18 航天信息股份有限公司 Test case estimation method and system
CN106326116A (zh) * 2016-08-17 2017-01-11 北京奇虎科技有限公司 Product testing method and device
CN106909510A (zh) * 2017-03-02 2017-06-30 腾讯科技(深圳)有限公司 Method for obtaining test cases, and server
CN107145438A (zh) * 2016-03-01 2017-09-08 阿里巴巴集团控股有限公司 Code testing method, code testing device and code testing system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855187B (zh) * 2012-07-31 2015-10-28 许继电气股份有限公司 Relay protection device software defect tracking management system and method
CN103617118B (zh) * 2013-11-28 2016-06-29 北京奇虎科技有限公司 Unified processing method, device and system for test results
CN105095059B (zh) * 2014-04-15 2019-06-11 阿里巴巴集团控股有限公司 Automated testing method and device
CN104484267B (zh) * 2014-11-20 2018-05-01 大唐移动通信设备有限公司 Testing system and method
CN104699616B (zh) * 2015-03-31 2016-12-07 北京奇虎科技有限公司 Application testing method, device and system
US10176426B2 (en) * 2015-07-07 2019-01-08 International Business Machines Corporation Predictive model scoring to optimize test case order in real time
CN105117333A (zh) * 2015-08-24 2015-12-02 深圳市高斯贝尔家居智能电子有限公司 Test case management method and system
CN106326122B (zh) * 2016-08-23 2018-08-31 北京精密机电控制设备研究所 Software unit test case management system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103870384A (zh) * 2012-12-11 2014-06-18 航天信息股份有限公司 Test case estimation method and system
CN107145438A (zh) * 2016-03-01 2017-09-08 阿里巴巴集团控股有限公司 Code testing method, code testing device and code testing system
CN106326116A (zh) * 2016-08-17 2017-01-11 北京奇虎科技有限公司 Product testing method and device
CN106909510A (zh) * 2017-03-02 2017-06-30 腾讯科技(深圳)有限公司 Method for obtaining test cases, and server

Also Published As

Publication number Publication date
CN107885660B (zh) 2020-04-03
CN107885660A (zh) 2018-04-06

Similar Documents

Publication Publication Date Title
WO2019085061A1 (zh) Fund system automated test management method, apparatus, device and storage medium
Theisen et al. Approximating attack surfaces with stack traces
US11163731B1 (en) Autobuild log anomaly detection methods and systems
US9965255B2 (en) Code origination data management for code assembly
US9558230B2 (en) Data quality assessment
US8140578B2 (en) Multilevel hierarchical associations between entities in a knowledge system
US20190146901A1 (en) Cognitive manufacturing systems test repair action
Kirbas et al. The relationship between evolutionary coupling and defects in large industrial software
US11294802B2 (en) Identifying incorrect variable values in software testing and development environments
CN113326247B (zh) 云端数据的迁移方法、装置及电子设备
Karim et al. Mining android apps to recommend permissions
US9658948B2 (en) Workload mapper for potential problem areas using modules and defect data
US9842044B2 (en) Commit sensitive tests
US20120173498A1 (en) Verifying Correctness of a Database System
WO2023177442A1 (en) Data traffic characterization prioritization
US10394793B1 (en) Method and system for governed replay for compliance applications
US11880470B2 (en) System and method for vulnerability detection in computer code
Shapira et al. Common pitfalls of benchmarking big data systems
US10303579B2 (en) Debug session analysis for related work item discovery
US10572669B2 (en) Checking for unnecessary privileges with entry point finder
Fang et al. Test Report Generation for Android App Testing Via Heterogeneous Data Analysis
US20190050574A1 (en) Automatic impact detection after patch implementation with entry point finder
US11537503B2 (en) Code editor for user interface component testing
Yu Review of Panorama and Key Technical Prospect on Software Testing
Tu et al. FEFuzzer: Hybrid Files Fuzzing Tool

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17930938

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29.09.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 17930938

Country of ref document: EP

Kind code of ref document: A1