CN113537797A - Method and device for intelligent test workload assessment based on historical data analysis, terminal equipment and storage medium

Info

Publication number
CN113537797A
Authority
CN
China
Prior art keywords
test
project
difficulty
stage
calculating
Prior art date
Legal status
Pending
Application number
CN202110833566.2A
Other languages
Chinese (zh)
Inventor
冷炜
吴志刚
高蕊
左志芳
邓辰
Current Assignee
China Citic Bank Corp Ltd
Original Assignee
China Citic Bank Corp Ltd
Priority date
Filing date
Publication date
Application filed by China Citic Bank Corp Ltd
Priority to CN202110833566.2A
Publication of CN113537797A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0631 Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06313 Resource planning in a project environment
    • G06Q10/06311 Scheduling, planning or task assignment for a person or group
    • G06Q10/063118 Staff planning in a project environment
    • G06Q10/10 Office automation; Time management
    • G06Q10/103 Workflow collaboration or project management

Landscapes

  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Theoretical Computer Science (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides a method, a device, terminal equipment and a storage medium for intelligent test workload assessment based on historical data analysis, and relates to the field of computer systems. The method comprises: calculating a productivity reference value for each stage by selecting historical project data of different categories, including but not limited to public projects and private projects, and calculating the productivity reference value of each stage of the test work separately for each test implementing party. Instead of relying on the evaluator's experience as in the prior art, abstract personal experience is parameterized: based on the analysis of historical project data, the task productivity of each stage is calculated separately for passing-test projects and completeness-test projects, and influence factors such as the difficulty of the system under test, the difficulty of the project type, the difficulty of the work subtasks of each stage, the difficulty of the tested transactions and the data preparation mode are evaluated.

Description

Method and device for intelligent test workload assessment based on historical data analysis, terminal equipment and storage medium
Technical Field
The invention relates to the field of computer systems, and in particular to a method, a device, terminal equipment and a storage medium for intelligent test workload assessment based on historical data analysis.
Background
In current industry testing practice, there is no test workload assessment method that covers each test stage and each test implementing party. Test workload is mostly estimated by rough analogy, by evaluator experience, or by detailed test-point counting, which mainly leads to the following problems:
1. Current test workload assessment methods depend heavily on the experience of the evaluator and generalize poorly.
2. Current test workload assessment methods can only estimate the overall workload; they cannot break the estimate down by test stage, test implementing party and the like, and therefore offer little practical guidance.
3. Assessment based on detailed test points is time-consuming and therefore rarely used.
Disclosure of Invention
The embodiments of the invention provide a method, a device, terminal equipment and a storage medium for intelligent test workload assessment based on historical data analysis.
The intelligent test workload assessment method based on historical data analysis comprises the following steps:
S101, calculating a productivity reference value for each stage: selecting historical project data of different categories, including but not limited to public projects and private projects, and calculating the productivity reference value of each stage of the test work separately for each test implementing party;
S102, calculating multi-dimensional difficulty coefficients for the influence factors of each subtask: sorting out the influence factors of the tasks at each stage of the different departments from historical project data, assigning difficulty coefficients to the different levels of each influence factor, and verifying the assignments by calculation against projects of each type to obtain a subtask influence-factor level coefficient table; when the model is later used to assess workload, appropriate levels are selected according to the conditions of the project under assessment, and the model automatically computes the weighted mean of the level coefficients of each subtask, yielding the subtask difficulty coefficient of each stage;
S103, calculating a system difficulty coefficient: selecting a system of moderate difficulty (the counter system) according to the average productivity of historical projects, setting its difficulty coefficient to 1, and estimating the difficulty coefficients of the other systems by comparison with the counter system, thereby finally determining the system difficulty coefficients;
S104, calculating the task difficulty coefficient of each stage, wherein project types are divided into annual key projects, ordinary scheduled projects and emergency projects;
and S105, calculating the test workload of the subtasks at each stage for each department by weighting: based on the number of functional sub-transactions of each system in the project and the historical productivity reference values, combined with the difficulty coefficients of each dimension and the test type (such as passing test or completeness test), the test workload and personnel requirement of the subtasks at each stage are calculated for each implementing department by weighted calculation, as illustrated in the sketch following these steps.
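As a non-limiting illustration of the weighted calculation in S105, the sketch below combines the productivity reference value with the difficulty coefficients for one subtask of one system; the multiplicative combination of coefficients, the test-type coefficient and the rounding rule for headcount are assumptions introduced for illustration only and are not prescribed by the embodiments.

```python
import math

def stage_workload(transactions: float,
                   productivity_reference: float,  # sub-transactions per person-day (S101)
                   system_coeff: float,            # system difficulty coefficient (S103)
                   subtask_coeff: float,           # subtask difficulty coefficient (S102)
                   project_type_coeff: float,      # project-type task coefficient (S104)
                   test_type_coeff: float = 1.0    # e.g. passing test vs. completeness test
                   ) -> float:
    """Weighted person-day estimate for one stage subtask of one system."""
    return (transactions / productivity_reference
            * system_coeff * subtask_coeff * project_type_coeff * test_type_coeff)

def stage_headcount(person_days: float, scheduled_days: float) -> int:
    """Staff needed to finish the stage within its scheduled window (rounding rule assumed)."""
    return math.ceil(person_days / scheduled_days)

effort = stage_workload(transactions=150, productivity_reference=12.0,
                        system_coeff=1.2, subtask_coeff=1.05, project_type_coeff=1.1)
staff = stage_headcount(effort, scheduled_days=5)   # about 17.3 person-days -> 4 people
```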
Further, the influence factors in step S102 include, but are not limited to: the number of associated systems, the quality of the requirement documents, the number of departments involved, the number of function points of each type, and the proficiency of the staff.
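A minimal sketch of how such an influence-factor level coefficient table and the weighted mean could be represented is given below; every factor name, level, coefficient and weight shown is a hypothetical placeholder, since the embodiments calibrate the actual table by verification against historical projects.

```python
# Hypothetical influence-factor level coefficient table (all values are assumptions).
level_coefficients = {
    "associated systems":   {"few": 0.90, "moderate": 1.00, "many": 1.20},
    "requirement quality":  {"good": 0.90, "average": 1.00, "poor": 1.30},
    "departments involved": {"one": 0.95, "several": 1.00, "many": 1.15},
    "staff proficiency":    {"high": 0.85, "medium": 1.00, "low": 1.25},
}
factor_weights = {
    "associated systems": 0.30,
    "requirement quality": 0.25,
    "departments involved": 0.20,
    "staff proficiency": 0.25,
}

def subtask_difficulty(selected_levels: dict) -> float:
    """Weighted mean of the level coefficients chosen for one subtask."""
    total_weight = sum(factor_weights[f] for f in selected_levels)
    weighted_sum = sum(factor_weights[f] * level_coefficients[f][lvl]
                       for f, lvl in selected_levels.items())
    return weighted_sum / total_weight

coeff = subtask_difficulty({"associated systems": "many",
                            "requirement quality": "average",
                            "departments involved": "several",
                            "staff proficiency": "medium"})   # about 1.06
```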
Further, the aspects compared against the reference difficulty coefficient of 1 in step S103 include, but are not limited to: business complexity, transaction flow length, and the testing difficulty of associated systems.
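For illustration only, the comparative judgements against the counter-system baseline could be folded into a single coefficient per system as sketched below; the systems, ratings and the simple averaging rule are assumptions, since the embodiments leave the estimation to comparison by the evaluators.

```python
from statistics import mean

# Ratings of each system relative to the counter-system baseline (= 1.0) on the
# aspects listed above; all figures are illustrative assumptions.
comparisons = {
    "counter system":   {"business complexity": 1.0, "transaction flow length": 1.0, "associated-system difficulty": 1.0},
    "credit card core": {"business complexity": 1.3, "transaction flow length": 1.1, "associated-system difficulty": 1.2},
    "mobile banking":   {"business complexity": 0.9, "transaction flow length": 0.8, "associated-system difficulty": 1.0},
}

system_difficulty = {name: round(mean(ratings.values()), 2)
                     for name, ratings in comparisons.items()}
# {'counter system': 1.0, 'credit card core': 1.2, 'mobile banking': 0.9}
```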
Further, in step S104 the work subtask difficulty coefficients corresponding to each project type differ across the following subtasks: static requirement testing, test analysis, test plan writing, test scheme writing, test case review, test planning preparation, test case execution, test report writing, test report review, and test summary.
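A simple lookup table, as in the illustrative sketch below, is one way to hold such per-project-type subtask coefficients; all values are hypothetical and would in practice be calibrated from historical project data.

```python
# Hypothetical project-type coefficients for a few of the subtasks listed above.
project_type_coefficients = {
    "annual key project":         {"test analysis": 1.20, "test case execution": 1.15, "test report writing": 1.10},
    "ordinary scheduled project": {"test analysis": 1.00, "test case execution": 1.00, "test report writing": 1.00},
    "emergency project":          {"test analysis": 0.90, "test case execution": 1.10, "test report writing": 0.95},
}

coeff = project_type_coefficients["emergency project"]["test case execution"]  # 1.10
```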
Further, in step S105 the overall workload of the project is the sum of the test workloads of all systems, the personnel requirement of each system is the maximum personnel requirement across the tasks of each stage, and the project tester requirement is the sum of the personnel requirements of all systems.
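These aggregation rules can be illustrated with the minimal, non-limiting sketch below; the systems, stages and figures are assumptions used only to show the summation and maximum rules.

```python
# workload[system][stage]  = estimated person-days for that stage's subtasks
# headcount[system][stage] = staff needed to complete that stage within its schedule
workload = {
    "counter system": {"test analysis": 6.0, "case writing": 10.0, "case execution": 14.0},
    "online banking": {"test analysis": 4.0, "case writing": 7.5,  "case execution": 9.0},
}
headcount = {
    "counter system": {"test analysis": 2, "case writing": 3, "case execution": 4},
    "online banking": {"test analysis": 1, "case writing": 2, "case execution": 3},
}

# Overall project workload = sum of the test workloads of all systems.
project_workload = sum(sum(stages.values()) for stages in workload.values())     # 50.5

# Personnel requirement of each system = maximum requirement across its stages.
system_staff = {sys: max(stages.values()) for sys, stages in headcount.items()}  # 4 and 3

# Project tester requirement = sum of the per-system personnel requirements.
project_staff = sum(system_staff.values())                                       # 7
```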
A terminal device comprises a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor; when the terminal device runs, the processor and the storage medium communicate through the bus, and the processor executes the machine-readable instructions to perform the steps of the intelligent test workload assessment method based on historical data analysis described in the embodiments.
The device comprises a data acquisition module, a data analysis module, and a data pushing and displaying module. The data acquisition module is used for acquiring historical project data of different categories, the data analysis module is used for analyzing and processing the historical project data and performing weighted calculation to obtain the relevant coefficients, and the data pushing and displaying module is used for pushing the change template tool set to the operator.
In the present application, the method and device abandon the previous reliance on the evaluator's experience and parameterize abstract personal experience. Based on the analysis of historical project data, the task productivity of each stage is calculated separately for passing-test projects and completeness-test projects, and influence factors such as the difficulty of the system under test, the difficulty of the project type and the difficulty of the work subtasks of each stage are evaluated, so that the test workload and personnel demand of each test implementing party at each test stage can be assessed for different project types. A workload assessment tool is developed from the model; after each project is completed, its test data are updated in time so that the test productivity is continuously and automatically corrected and the accuracy of the data is ensured.
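As a minimal sketch of how the stage-level task productivity mentioned above could be derived from historical project records: the record fields, grouping keys and figures below are illustrative assumptions and are not prescribed by the application.

```python
from collections import defaultdict

# Hypothetical historical records: (category, implementing party, stage,
# sub-transactions tested, person-days spent). The field choices are assumptions.
history = [
    ("public project",  "test center", "test case execution", 120, 10.0),
    ("public project",  "test center", "test case execution",  90,  8.0),
    ("private project", "vendor team", "test case writing",     60,  6.0),
]

totals = defaultdict(lambda: [0.0, 0.0])   # key -> [sub-transactions, person-days]
for category, party, stage, txns, days in history:
    totals[(category, party, stage)][0] += txns
    totals[(category, party, stage)][1] += days

# Productivity reference value: sub-transactions handled per person-day for each
# (project category, implementing party, stage) combination.
productivity_reference = {key: txns / days for key, (txns, days) in totals.items()}
# e.g. {('public project', 'test center', 'test case execution'): ~11.67, ...}
```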
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be regarded as limiting its scope; those skilled in the art can derive other related drawings from these drawings without inventive effort.
FIG. 1 is a flow chart illustrating a method for intelligent test workload assessment based on historical data analysis according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an apparatus for intelligent test workload assessment based on historical data analysis according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. It should be understood that the drawings are provided for illustration and description only and are not intended to limit the scope of the present invention, and that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this disclosure illustrate operations implemented according to some embodiments of the present invention; the operations of a flowchart may be performed out of order, and steps without a logical dependency may be performed in reverse order or simultaneously. Under the guidance of this disclosure, one skilled in the art may add one or more other operations to a flowchart or remove one or more operations from it.
In addition, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments, as generally described and illustrated in the figures, may be arranged and designed in a wide variety of configurations. The following detailed description of the embodiments is therefore not intended to limit the scope of the claimed invention but merely represents selected embodiments. All other embodiments obtained by a person skilled in the art from the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that the term "comprising" will be used in the embodiments of the invention to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features. It should also be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. In the description of the present invention, it should also be noted that the terms "first", "second", "third", and the like are used for distinguishing the description, and are not intended to indicate or imply relative importance.
Referring to FIG. 1, a method for intelligent test workload assessment based on historical data analysis includes the following steps:
S101, calculating a productivity reference value for each stage: selecting historical project data of different categories, including but not limited to public projects and private projects, and calculating the productivity reference value of each stage of the test work separately for each test implementing party;
S102, calculating multi-dimensional difficulty coefficients for the influence factors of each subtask: sorting out the influence factors of the tasks at each stage of the different departments from historical project data and assigning difficulty coefficients to the different levels of each influence factor, the influence factors including but not limited to the number of associated systems, the quality of the requirement documents, the number of departments involved, the number of function points of each type and the proficiency of the staff; verifying the assignments by calculation against projects of each type to obtain a subtask influence-factor level coefficient table; when the model is later used to assess workload, appropriate levels are selected according to the conditions of the project under assessment, and the model automatically computes the weighted mean of the level coefficients of each subtask, yielding the subtask difficulty coefficient of each stage;
S103, calculating a system difficulty coefficient: selecting a system of moderate difficulty (the counter system) according to the average productivity of historical projects, setting its difficulty coefficient to 1, and estimating the difficulty coefficients of the other systems by comparison with the counter system in aspects including but not limited to business complexity, transaction flow length and the testing difficulty of associated systems, thereby finally determining the system difficulty coefficients;
S104, calculating the task difficulty coefficient of each stage: project types are divided into annual key projects, ordinary scheduled projects and emergency projects, and the work subtask difficulty coefficients corresponding to each project type differ across subtasks such as static requirement testing, test analysis, test plan writing, test scheme writing, test case writing, test case review, test planning preparation, test case execution, test report writing, test report review and test summary;
S105, calculating the test workload of the subtasks at each stage for each department by weighting: based on the number of functional sub-transactions of each system in the project and the historical productivity reference values, combined with the difficulty coefficients of each dimension and the test type (such as passing test or completeness test), the test workload and personnel requirement of the subtasks at each stage are calculated for each implementing department by weighted calculation, and project test resources are allocated reasonably; the overall workload of the project is the sum of the test workloads of all systems, the personnel requirement of each system is the maximum personnel requirement across the tasks of each stage, and the project tester requirement is the sum of the personnel requirements of all systems.
As shown in FIG. 2, the terminal device 5 may include a processor 501, a storage medium 502 and a bus 503. The storage medium 502 stores machine-readable instructions executable by the processor 501; when the terminal device runs, the processor 501 and the storage medium 502 communicate through the bus 503, and the processor 501 executes the machine-readable instructions to perform the steps of the method for intelligent test workload assessment based on historical data analysis described in the foregoing embodiments. The specific implementation and technical effects are similar and are not repeated here.
As shown in FIG. 3, the apparatus includes a data acquisition module 401, a data analysis module 402 and a data pushing and displaying module 403. The data acquisition module 401 is configured to acquire historical project data of different categories, the data analysis module 402 is configured to analyze and process the historical project data and perform weighted calculation to obtain the relevant coefficients, and the data pushing and displaying module 403 is configured to push the change template tool set to the operator.
The above description covers only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any changes or substitutions that a person skilled in the art can readily conceive of within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A method for intelligent test workload assessment based on historical data analysis, characterized by comprising the following steps:
S101, calculating a productivity reference value for each stage: selecting historical project data of different categories, including but not limited to public projects and private projects, and calculating the productivity reference value of each stage of the test work separately for each test implementing party;
S102, calculating multi-dimensional difficulty coefficients for the influence factors of each subtask: sorting out the influence factors of the tasks at each stage of the different departments from historical project data, assigning difficulty coefficients to the different levels of each influence factor, and verifying the assignments by calculation against projects of each type to obtain a subtask influence-factor level coefficient table; when the model is later used to assess workload, appropriate levels are selected according to the conditions of the project under assessment, and the model automatically computes the weighted mean of the level coefficients of each subtask, yielding the subtask difficulty coefficient of each stage;
S103, calculating a system difficulty coefficient: selecting a system of moderate difficulty (the counter system) according to the average productivity of historical projects, setting its difficulty coefficient to 1, and estimating the difficulty coefficients of the other systems by comparison with the counter system, thereby finally determining the system difficulty coefficients;
S104, calculating the task difficulty coefficient of each stage, wherein project types are divided into annual key projects, ordinary scheduled projects and emergency projects;
and S105, calculating the test workload of the subtasks at each stage for each department by weighting: based on the number of functional sub-transactions of each system in the project and the historical productivity reference values, combined with the difficulty coefficients of each dimension and the test type (such as passing test or completeness test), the test workload and personnel requirement of the subtasks at each stage are calculated for each implementing department by weighted calculation.
2. The method of claim 1, wherein the influence factors in step S102 include, but are not limited to: the number of associated systems, the quality of the requirement documents, the number of departments involved, the number of function points of each type, and the proficiency of the staff.
3. The method of claim 1, wherein the aspects compared against the reference difficulty coefficient of 1 in step S103 include, but are not limited to: business complexity, transaction flow length, and the testing difficulty of associated systems.
4. The method of claim 1, wherein the work subtask difficulty coefficients corresponding to each project type in step S104 differ across the following subtasks: static requirement testing, test analysis, test plan writing, test scheme writing, test case review, test planning preparation, test case execution, test report writing, test report review, and test summary.
5. The method of claim 1, wherein in step S105 the overall workload of the project is the sum of the test workloads of all systems, the personnel requirement of each system is the maximum personnel requirement across the tasks of each stage, and the project tester requirement is the sum of the personnel requirements of all systems.
6. An apparatus for intelligent test workload assessment based on historical data analysis, characterized by comprising: a data acquisition module, a data analysis module, and a data pushing and displaying module.
7. The apparatus of claim 6, wherein the data acquisition module is configured to acquire historical project data of different categories.
8. The apparatus of claim 6, wherein the data analysis module is configured to analyze and process the historical project data and perform weighted calculation to obtain the relevant coefficients, and the data pushing and displaying module is configured to push the change template tool set to the operator.
9. A terminal device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the terminal device is operating, the processor executing the machine-readable instructions to perform the steps of the method according to any one of claims 1 to 5.
10. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN202110833566.2A 2021-07-23 2021-07-23 Method and device for intelligent test workload assessment based on historical data analysis, terminal equipment and storage medium Pending CN113537797A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110833566.2A CN113537797A (en) 2021-07-23 2021-07-23 Method and device for intelligent test workload assessment based on historical data analysis, terminal equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110833566.2A CN113537797A (en) 2021-07-23 2021-07-23 Method and device for intelligent test workload assessment based on historical data analysis, terminal equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113537797A true CN113537797A (en) 2021-10-22

Family

ID=78088738

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110833566.2A Pending CN113537797A (en) 2021-07-23 2021-07-23 Method and device for intelligent test workload assessment based on historical data analysis, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113537797A (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114863585B (en) * 2022-04-06 2023-08-08 宗申·比亚乔佛山摩托车企业有限公司 Intelligent vehicle testing and monitoring system and method and cloud platform
CN114863585A (en) * 2022-04-06 2022-08-05 宗申·比亚乔佛山摩托车企业有限公司 Intelligent vehicle test monitoring system and method and cloud platform
CN115169808A (en) * 2022-06-08 2022-10-11 中国电力科学研究院有限公司 Method, device and storage medium for calculating charge of digital project in power industry
CN115081750A (en) * 2022-08-01 2022-09-20 中电金信软件有限公司 Method and device for evaluating workload of performance test project
CN115392804A (en) * 2022-10-28 2022-11-25 四川安洵信息技术有限公司 Talent enabling method and system based on big data
CN115392804B (en) * 2022-10-28 2023-02-03 四川安洵信息技术有限公司 Talent enabling method and system based on big data
CN115617702A (en) * 2022-12-20 2023-01-17 中化现代农业有限公司 Test working hour prediction method and prediction device
CN115796563A (en) * 2023-02-10 2023-03-14 中建安装集团有限公司 Steel structure list checking and verifying management system based on big data
CN115796563B (en) * 2023-02-10 2023-04-11 中建安装集团有限公司 Steel structure list checking and verifying management system based on big data
CN116663860A (en) * 2023-07-27 2023-08-29 深圳昊通技术有限公司 Task allocation method and system for project demands and readable storage medium
CN116663860B (en) * 2023-07-27 2024-01-09 深圳昊通技术有限公司 Task allocation method and system for project demands and readable storage medium
CN117689233A (en) * 2024-02-02 2024-03-12 中国电子科技集团公司信息科学研究院 Engineering influence assessment method and system based on demand longitudinal traceability
CN117689233B (en) * 2024-02-02 2024-04-12 中国电子科技集团公司信息科学研究院 Engineering influence assessment method and system based on demand longitudinal traceability

Similar Documents

Publication Publication Date Title
CN113537797A (en) Method and device for intelligent test workload assessment based on historical data analysis, terminal equipment and storage medium
US5655074A (en) Method and system for conducting statistical quality analysis of a complex system
US8751436B2 (en) Analyzing data quality
Aranha et al. An estimation model for test execution effort
US20130159035A1 (en) Consistency Checks For Business Process Data Using Master Data Vectors
CN110633194B (en) Performance evaluation method of hardware resources in specific environment
CN112861492A (en) Method and device for linkage calculation between internal tables of report table and electronic equipment
CN111695877A (en) Computer-implemented project resource management method, system, device and readable medium
Kuan Factors on software effort estimation
CN116523244A (en) Testing manpower risk early warning method based on outsourcing resources
KR20130085062A (en) Risk-management device
CN115170097A (en) Spatial data distributed quality inspection method and system
CN115344495A (en) Data analysis method and device for batch task test, computer equipment and medium
CN111967774B (en) Software quality risk prediction method and device
Kudrjavets et al. Are we speeding up or slowing down? on temporal aspects of code velocity
CN110765600A (en) Method and system for evaluating capability of calculation analysis software of aircraft engine
Jung et al. The quality control of software reliability based on functionality, reliability and usability
Heires What I did last summer: A software development benchmarking case study
JP2006309571A (en) Computer arithmetic processing method and remaining risk determination device
CN117114092B (en) Conduction updating method, system, equipment and medium for oil and gas reserves calculation data
He et al. Software component reliability evaluation method based on characteristic parameters
JP2003345955A (en) System and method for providing problem estimate and improvement process
Grohmann et al. On Learning Parametric Dependencies from Monitoring Data
CN116743565A (en) Performance capacity resource allocation method and device
Hakim et al. Success Factor Analysis of Multiple Project Management

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination