CN113835999A - Workflow-based testing method for distributed heterogeneous processing system - Google Patents

Workflow-based testing method for distributed heterogeneous processing system

Info

Publication number
CN113835999A
CN113835999A
Authority
CN
China
Prior art keywords
test
testing
workflow
data
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110878589.5A
Other languages
Chinese (zh)
Inventor
李登峻 (Li Dengjun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202110878589.5A
Publication of CN113835999A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3664 Environments for testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3676 Test management for coverage analysis
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 11/3692 Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Stored Programmes (AREA)

Abstract

The invention discloses a workflow-based testing method for a distributed heterogeneous processing system. The method comprises: setting up a testing environment and selecting testing tools; determining the requirements for testing networking data in the networking environment; selecting testing methods and designing test code and test cases covering system functions, performance, safety and reliability, usability, compatibility, expandability, resource occupancy, and user documents; and evaluating the test results and issuing a test report for the workflow-based distributed heterogeneous processing system. The testing method can measure the current performance level of the system, verify whether that performance meets the designed application requirements, accurately determine the maximum load the system can bear, analyze system bottlenecks, and provide a reference for optimization. The state of the system is monitored in real time during testing, and the test is terminated immediately once the system can no longer bear the applied pressure, which effectively ensures that the test never exceeds the maximum pressure the system can sustain and avoids system crashes and data corruption.

Description

Workflow-based testing method for distributed heterogeneous processing system
Technical Field
The invention relates to product-testing technology, and in particular to a testing method for a workflow-based distributed heterogeneous processing system.
Background
In the prior art, software lifecycle management is represented by IBM's Rational product series and Hewlett-Packard's Quality Center. The Rational series comprises products for requirements analysis, design and construction, software quality assurance, software configuration management, and process and project management; it is team-management software suitable for software development, covering the process from requirements analysis through design and construction, white-box testing, and software configuration until the product is formed according to the project. Hewlett-Packard's Quality Center is a Web-based test management tool that can organize and manage all stages of an application testing process, including specifying test requirements, planning tests, executing tests, and tracking defects.
As the software industry gradually matures and scales up, software development and software demand patterns have changed significantly. Bespoke software developed for a specific client can be managed with simple tools or tool kits. However, for enterprises with large scale, high software-quality requirements, and rapid functional upgrades (such as telecommunications and finance), and for enterprises that rely on third-party software testing, tool kits and foreign product series such as those of IBM and Hewlett-Packard have shortcomings: software testing quality and development capability cannot be evaluated, the goal of continuously improving the work of software development and testing teams cannot be achieved, and day-granularity budget management of software testing projects cannot be performed on the basis of such evaluation.
The invention provides a system test scheme for the "intelligent campus platform project" according to the relevant documents of the intelligent campus platform project in Jiangxi province. A software development recommendation method is provided to adapt to the continuous improvement of software testing quality at third-party software testing enterprises and to give constructive suggestions for software development based on software testing data; the method is also provided to effectively control the budget of the software testing phase. The intelligent campus platform fully exerts the platform function of system informatization in building a learning-oriented intelligent campus and a learning-oriented society, together with its supporting function in teaching and scientific-research management, comprehensively raising the existing informatization level of Jiangxi province and fully exerting the active role of informatization in promoting the transformation and upgrading of the intelligent campus's work so that it stays at the forefront of national intelligent campus platform systems.
Disclosure of Invention
The invention tests the "Jiangxi intelligent campus platform project" using professional software testing methods and from a professional business perspective.
The invention relates to a testing method of a distributed heterogeneous processing system based on a workflow, which is characterized by comprising the following steps:
building a test environment and selecting a test tool;
defining the requirements and test contents for networking data of a test networking environment;
selecting a test method, and designing a test code and a test case;
evaluating a test result, and issuing a test report of the distributed heterogeneous processing system based on the workflow;
the test content comprises system function, performance, safety and reliability, usability, compatibility, expandability, resource occupancy rate and user documents.
Further, the distributed heterogeneous workflow-based processing system includes:
an SOA system architecture is adopted, characterized in that the distributed heterogeneous processing system comprises at least a display layer, a supporting platform layer, an application layer, a data layer, a framework layer, a data interface layer, and a network layer;
the supporting platform layer at least comprises an application service layer, a basic service layer and a business development layer;
the data interface layer realizes complete opening of all platform management interfaces and application interfaces, reserves interfaces interconnected with the existing service system, can realize a service approval process and comprises a new function module for public development, and at least comprises a web site, an application program and a web script;
the distributed heterogeneous processing system adopts a component management and workflow management mode to realize a graphical definition flow and monitoring function, a front-end task management function and a workflow execution service;
the distributed heterogeneous processing system at least supports a plurality of workflow working modes of sequence, parallel, selection and repeated execution by adopting a workflow engine, realizes flexible configuration of various flows, can graphically maintain workflow control data and workflow related data, realizes monitoring and management of various workflows and meets the special processing requirements of files;
the data interface layer is built based on a current mainstream server and is used for installing and deploying a mainstream operating system, a database and middleware;
the data layer includes a database designed with standard SQL, using a database system that supports high concurrency and large data volumes.
Further, the building of the test environment and the selection of the test tool specifically include:
the test group carries out detailed test requirement communication, determines specific test requirements, deploys a test machine, installs test tools and software, and carries out test environment configuration or confirmation work;
the environment requirements are as follows: application server: Windows Server 2008 SP1 Enterprise Edition or above;
database server: Oracle 11g;
client: IE browser;
user login requires manually modifying the database, so the relevant test data must be prepared in advance;
data import requires a prepared data source, so the test data must also be prepared in advance;
non-functional tests, especially performance tests, must be performed after the functional tests are completed;
the test tool is LoadRunner, which executes the performance tests: response speed under the predetermined environment and load, particularly under heavy load and high concurrency;
if other testing tools are needed, they may be added or adjusted according to the testing situation.
Further, the determining the requirement for the networking data of the test networking environment and the test content further includes:
functional testing means testing the functions of the basic application platform mainly according to the system's requirement specification and user manual, and testing the interfaces between the basic application platform and other business systems;
performance testing mainly tests the system against the requirement specification, the user manual, and the agreed main performance indicators, and examines the system's resource occupancy while the performance tests run;
safety and reliability testing mainly examines the security of access to the application system and of the application software, and also tests the continuous stability of the system during operation, including its fault tolerance and its ability to protect data.
Further, the determining the requirement for the networking data of the test networking environment and the test content further includes:
the usability test tests the consistency, friendliness, ease of use, and similar aspects of the system interface style from the perspective of the end user;
the compatibility test refers to the degree of compatibility of the software product with the relevant test environments;
the expandability test refers to the system's capacity for functional extension, including at least its adaptability to changes in user requirements;
the user document inspection focuses on the completeness of the submitted documents and their conformity with the actual system.
Further, the selecting a test method, designing a test code and a test case, further includes:
the functional testing methods comprise the equivalence class division method, the boundary value analysis method, and the error guessing method;
the performance testing methods comprise single-user performance testing and concurrent performance testing;
the safety and reliability testing method is combined with functional testing, and the tests of the data backup and recovery means, the audit-trail function, and system security can be completed together with the functional tests;
the testing methods for user document inspection comprise the following: standardization check, in which the testing group checks the standardization of the documents against national standards, industry conventions, and the owner's requirements;
expert evaluation, in which relevant experts are organized to review the documents and evaluate their content and organization;
and test verification, in which the documents are verified through testing to confirm whether they are consistent with the software.
Further, the selecting a test method, designing a test code and a test case, further includes:
test case design can be divided into case design for black-box testing and case design for white-box testing; black-box testing is adopted in this test, and the methods applied in test case design comprise: equivalence class division (valid and invalid equivalence classes), cause-effect graphing, boundary value analysis, and the error guessing method.
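As a sketch of how the equivalence class division and boundary value analysis methods above turn into concrete cases, the example below designs tests for a hypothetical input field that accepts integers from 0 to 100; the function under test, the chosen classes, and the boundaries are illustrative assumptions, not taken from the patent.

```python
def accepts_score(value):
    # Hypothetical system under test: a field accepting integers 0..100.
    return isinstance(value, int) and 0 <= value <= 100

# Equivalence class division: one representative value per class.
valid_class = [50]                    # valid equivalence class
invalid_classes = [-10, 150, "abc"]   # invalid equivalence classes

# Boundary value analysis: values at and just beyond each boundary.
boundary_cases = {-1: False, 0: True, 1: True, 99: True, 100: True, 101: False}

def run_cases():
    # Each entry records whether the system's behavior matched the
    # expectation derived from the class or boundary the case belongs to.
    results = {}
    for v in valid_class:
        results[v] = accepts_score(v) is True
    for v in invalid_classes:
        results[v] = accepts_score(v) is False
    for v, expected in boundary_cases.items():
        results[v] = accepts_score(v) is expected
    return results

assert all(run_cases().values())
```

Error guessing would add further cases drawn from experience (empty input, very large numbers, non-numeric strings) on top of the systematic classes above.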
Further, the evaluating the test result and issuing the test report of the distributed heterogeneous processing system based on the workflow further includes:
the severity of problems in the test results is divided into three grades: serious problems, general problems, and recommendations.
Further, the evaluating the test result and issuing the test report of the distributed heterogeneous processing system based on the workflow further includes:
the test report indicates whether the test passed, substantially passed, or failed.
Further, the workflow management comprises at least: workflow definition service, interface between workflow system and application system, workflow management and monitoring.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a workflow diagram of a method for testing a distributed heterogeneous workflow-based processing system according to the present invention;
FIG. 2 is a block diagram of a distributed heterogeneous processing system according to the present invention;
fig. 3a to 3f are test result diagrams of a testing method of a distributed heterogeneous processing system based on workflow according to the present invention.
Detailed Description
The illustrative embodiments of the present application include, but are not limited to, a method of testing a workflow-based distributed heterogeneous processing system.
It will be appreciated that as used herein, the terms "module," "unit" may refer to or include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable hardware components that provide the described functionality, or may be part of such hardware components.
It is to be appreciated that in various embodiments of the present application, the processor may be a microprocessor, a digital signal processor, a microcontroller, or the like, and/or any combination thereof. According to another aspect, the processor may be a single-core processor, a multi-core processor, the like, and/or any combination thereof.
It is to be appreciated that a workflow-based distributed heterogeneous processing system provided herein can be implemented on a variety of electronic devices including, but not limited to, a server, a distributed server cluster of multiple servers, a cell phone, a tablet, a laptop, a desktop computer, a wearable device, a head-mounted display, a mobile email device, a portable game console, a portable music player, a reader device, a personal digital assistant, a virtual reality or augmented reality device, a television or other electronic device having one or more processors embedded or coupled therein, and the like.
Referring to fig. 1, the present invention claims a testing method for a distributed heterogeneous processing system based on workflow, characterized by comprising:
building a test environment and selecting a test tool;
defining the requirements and test contents for networking data of a test networking environment;
selecting a test method, and designing a test code and a test case;
evaluating a test result, and issuing a test report of the distributed heterogeneous processing system based on the workflow;
the test content comprises system function, performance, safety and reliability, usability, compatibility, expandability, resource occupancy rate and user documents.
Further, with reference to fig. 2, the workflow-based distributed heterogeneous processing system includes:
an SOA system architecture is adopted, characterized in that the distributed heterogeneous processing system comprises at least a display layer, a supporting platform layer, an application layer, a data layer, a framework layer, a data interface layer, and a network layer;
the supporting platform layer at least comprises an application service layer, a basic service layer and a business development layer;
the data interface layer realizes complete opening of all platform management interfaces and application interfaces, reserves interfaces interconnected with the existing service system, can realize a service approval process and comprises a new function module for public development, and at least comprises a web site, an application program and a web script;
the distributed heterogeneous processing system adopts a component management and workflow management mode to realize a graphical definition flow and monitoring function, a front-end task management function and a workflow execution service;
the distributed heterogeneous processing system at least supports a plurality of workflow working modes of sequence, parallel, selection and repeated execution by adopting a workflow engine, realizes flexible configuration of various flows, can graphically maintain workflow control data and workflow related data, realizes monitoring and management of various workflows and meets the special processing requirements of files;
the data interface layer is built based on a current mainstream server and is used for installing and deploying a mainstream operating system, a database and middleware;
the data layer includes a database designed with standard SQL, using a database system that supports high concurrency and large data volumes.
The supporting platform layer aggregates dispersed, heterogeneous applications and data layers. Through a unified access entrance it realizes seamless access to and integration of structured data resources, unstructured documents, internet resources, and application systems across databases and system platforms, providing an integrated environment that supports information access, transmission, and collaboration and allowing personalized business applications to be developed, integrated, deployed, and managed efficiently;
the supporting platform layer can also provide a personalized secure channel for accessing key business information and a personalized application interface for specific users according to each user's characteristics, preferences, and role, so that faculty and staff can browse correlated data and perform related transaction processing;
the application service layer comprises: the system comprises a single sign-on module, a unified agency management module and a personalized portal module;
the basic service layer comprises: the system comprises a unified user management module, a unified authentication platform, a unified authority management module, a unified resource access control module and a resource access security audit module;
the service development layer comprises a unified workflow engine platform and an intelligent form platform;
the unified authentication platform needs to be integrated with the SSL-VPN.
The distributed heterogeneous processing system can interface with a short-message platform system. Through the mobile MAS machine interface, the smart campus platform writes short-message information generated by applications directly into the sending table of the MAS machine's database, and the short messages are sent through the MAS machine. Users who receive a short message sent by the system and reply to it directly can have their replies delivered to the short-message platform in time through the virtual number corresponding to each user.
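The sequence, parallel, selection, and repeated-execution workflow modes that the system's workflow engine is described as supporting can be illustrated with a minimal sketch; the function names and the in-process execution model below are hypothetical stand-ins for the actual engine, not the patented implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def run_sequence(tasks, data):
    # Sequence mode: each task consumes the previous task's output.
    for task in tasks:
        data = task(data)
    return data

def run_parallel(tasks, data):
    # Parallel mode: all tasks run concurrently on the same input.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(task, data) for task in tasks]
        return [f.result() for f in futures]

def run_selection(branches, data):
    # Selection mode: the first branch whose condition matches executes.
    for condition, task in branches:
        if condition(data):
            return task(data)
    return data

def run_repeat(task, data, until):
    # Repeated execution: re-run the task until the exit condition holds.
    while not until(data):
        data = task(data)
    return data
```

For example, `run_sequence([lambda d: d + 1, lambda d: d * 2], 3)` yields 8, while `run_parallel` with the same tasks yields both intermediate results; a real engine would additionally persist workflow control data and expose the monitoring described above.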
Further, the building of the test environment and the selection of the test tool specifically include:
the test group carries out detailed test requirement communication, determines specific test requirements, deploys a test machine, installs test tools and software, and carries out test environment configuration or confirmation work;
the environment requirements are as follows: application server: Windows Server 2008 SP1 Enterprise Edition or above;
database server: Oracle 11g;
client: IE browser;
user login requires manually modifying the database, so the relevant test data must be prepared in advance;
data import requires a prepared data source, so the test data must also be prepared in advance;
non-functional tests, especially performance tests, must be performed after the functional tests are completed;
the test tool is LoadRunner, which executes the performance tests: response speed under the predetermined environment and load, particularly under heavy load and high concurrency;
if other testing tools are needed, they may be added or adjusted according to the testing situation.
Further, the determining the requirement for the networking data of the test networking environment and the test content further includes:
functional testing means testing the functions of the basic application platform mainly according to the system's requirement specification and user manual, and testing the interfaces between the basic application platform and other business systems;
performance testing mainly tests the system against the requirement specification, the user manual, and the agreed main performance indicators, and examines the system's resource occupancy while the performance tests run;
the performance tests for the system mainly comprise execution efficiency tests and device efficiency tests.
The execution efficiency test mainly measures the number of concurrent users and the response time the system can sustain under the business logic, user interface, and functions of a specific application. Performance test cases are determined by selecting typical business operations in the system; a stress testing tool simulates multiple users performing these operations simultaneously and repeatedly, measuring related parameters such as the server's average transaction response time, the server's average response time for 90% of transactions, and the transaction rate, in order to examine system performance under various conditions.
The execution efficiency test needs to be combined with the functional tests to formulate corresponding test cases, which comprise the following contents:
Test script: select the functional modules that affect system execution efficiency and prepare typical business scripts as the basic cases for execution efficiency testing;
Number of concurrent users: divided into load testing and stress testing, where load testing refers to the number of concurrent users the system can bear under normal conditions, and stress testing refers to the maximum number of concurrent users the system can bear;
Concurrency mode: mainly refers to how closely the user concurrency resembles the real situation, including the way the number of concurrent users is varied, the number of real clients used during concurrency, and the simulation of concurrency peaks.
Device efficiency mainly refers to system CPU occupancy, memory occupancy, disk occupancy, and input/output efficiency, covering the software's occupancy of hardware resources both in the idle state and during business processing.
The concurrency stress test scenario requires reaching 50 concurrent users; provided the service indicators and the concurrent users meet the system requirements, the load is then increased continuously to find the maximum number of concurrent users, until the system requirements are no longer met.
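The load-versus-stress procedure above (start at the required 50 concurrent users and keep raising the load until the service indicator is no longer met) can be sketched as follows; the simulated transaction model and the response-time threshold are illustrative assumptions, not values fixed by the patent.

```python
import statistics
from concurrent.futures import ThreadPoolExecutor

def simulated_transaction(concurrent_users):
    # Stand-in for a real business operation: in this toy model the
    # response time grows linearly with the load on the server.
    return 0.02 * concurrent_users

def run_scenario(users, threshold=2.1):
    # Execute one scenario with `users` concurrent virtual users and
    # check the average response time against the agreed indicator.
    with ThreadPoolExecutor(max_workers=users) as pool:
        times = list(pool.map(simulated_transaction, [users] * users))
    return statistics.mean(times) <= threshold

def find_max_concurrent_users(start=50, step=10, limit=500):
    # Load test starts at the required 50 users; the stress search then
    # raises the load until the system requirement is no longer met,
    # returning the last load level that still passed.
    users, last_passing = start, None
    while users <= limit and run_scenario(users):
        last_passing = users
        users += step
    return last_passing
```

With the toy linear model the search passes up to 100 users and fails at 110; a real run would replace `simulated_transaction` with scripted business operations driven by a tool such as LoadRunner.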
Table 1 System performance test: partial scenarios (the table is reproduced as an image in the original publication)
The service performance indicators include:
Transaction results (Load Test Summary): report of transaction execution results.
Response Time: the server's processing time for each application request, in seconds; this indicator reflects the system's transaction-processing performance and specifically includes the following parameters: Minimum (minimum server response time), Average (average server response time), Maximum (maximum server response time), Std (standard deviation of the server's transaction response, where a larger value means a larger deviation), and 90% (server response time for 90% of transactions).
Number of virtual concurrent users (Total Virtual Users): the number of concurrent users simulated by the testing tool.
Transaction Rate: the number of successfully completed transactions per minute under different loads.
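The response-time parameters listed above (Minimum, Average, Maximum, Std, 90%) and the transaction rate can be computed from raw measurements roughly as follows; the sample data and the 90% calculation (a simple percentile over the sorted samples, which tools implement in slightly different ways) are illustrative assumptions.

```python
import statistics

def response_time_summary(times_s):
    # times_s: per-request server response times, in seconds.
    ordered = sorted(times_s)
    # "90%" value: the response time within which 90% of transactions
    # complete, taken as a simple percentile over the sorted samples.
    idx = max(0, int(len(ordered) * 0.9) - 1)
    return {
        "minimum": ordered[0],
        "average": statistics.mean(ordered),
        "maximum": ordered[-1],
        "std": statistics.pstdev(ordered),
        "90%": ordered[idx],
    }

def transaction_rate(successes, duration_s):
    # Transaction Rate: successfully completed transactions per minute.
    return successes * 60.0 / duration_s

sample = [0.8, 1.0, 1.2, 1.1, 0.9, 1.5, 2.0, 1.0, 1.3, 0.7]
summary = response_time_summary(sample)
```

For the invented sample above, the minimum is 0.7 s, the maximum 2.0 s, the average 1.15 s, and the 90% value 1.5 s.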
The resource monitoring indexes comprise an operating system, a middleware Weblogic resource use index, a middleware IIS resource use index and a database.
TABLE 2 Operating system (Windows) resource monitoring indicators
Measurement: Description
Average load: average number of processes simultaneously in the ready state during the last minute
CPU utilization: percentage of time the CPU is in use
Disk rate: rate of disk transfers
Interrupt rate: number of device interrupts per second
Page-in rate: number of pages read into physical memory per second
Page-out rate: number of pages written to the page file and removed from physical memory per second
Paging rate: number of pages read into physical memory or written to the page file per second
System mode CPU utilization: percentage of time the CPU is used in system mode
User mode CPU utilization: percentage of time the CPU is used in user mode
Disk rate: rate of disk transfers
Incoming packets rate: number of Ethernet packets received per second
Outgoing packets rate: number of Ethernet packets sent per second
TABLE 3 Middleware (WebLogic) resource usage indicators (the table is reproduced as images in the original publication)
TABLE 4 Middleware (IIS) resource usage indicators
Measurement: Description
#Busy Servers: number of servers in the busy state
#Idle Servers: number of servers in the idle state
IIS CPU usage: percentage of time the IIS server uses the CPU
Hits/sec: HTTP request rate
Kbytes Sent/sec: rate at which data bytes are sent from the Web server
TABLE 5 Database (Oracle) performance indicators (the table is reproduced as images in the original publication)
The safety and reliability tests mainly examine the security of access to the application system and of the application software, and also test the continuous stability of the system during operation, including its fault tolerance and its ability to protect data.
The main contents of the safety and reliability test comprise:
and (3) user authority limitation: and examining the limit conditions of the authority of different users.
The function of the left mark: whether the system has an operation log or not, the comprehensiveness and accuracy of the operation condition recorded by the operation log, whether main elements such as an operator, operation date, a use module and the like are included or not.
And (3) shielding user operation errors: and (4) observing prompt and shielding conditions of common misoperation of a user.
Input data validity checking: the system checks the validity of the data entry.
Accuracy of error prompt: and prompting the accuracy degree for the error of the user.
Whether the error caused a system exception exit: and (4) whether the system is abnormally exited due to operation errors or not is determined.
Influence of abnormal conditions: and performing a power failure or network disconnection test in the program running process, inspecting the influence degree of data and the system, and if the data and the system are damaged, providing a remedial tool or not, and determining how to remedy the condition.
Database backup and recovery testing: whether the system provides a data backup and recovery means or not and whether the database backup and recovery can be carried out or not.
And (3) data security testing: storing the key data in a database to be encrypted; the transmission of important data is encrypted; the method adopts practical and reliable measures to prevent malicious attacks and data leakage; the file access requires authority control to prevent malicious downloading.
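As an illustrative sketch of the "key data stored encrypted" check above, sensitive values such as passwords can be stored as salted key-derivation hashes rather than in plaintext. This is a generic example using Python's standard library, not the storage scheme of any particular system under test.

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # assumed work factor for this illustration

def hash_secret(secret, salt=None):
    """Derive a salted PBKDF2-HMAC-SHA256 digest suitable for storage."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_secret(secret, salt, stored_digest):
    """Re-derive the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", secret.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored_digest)

salt, stored = hash_secret("s3cret-value")
```

A tester can use such a reference implementation to confirm that the database never contains the recoverable plaintext of key data.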
Further, the determining of the requirements for networking data of the test networking environment and the test contents further includes:
the usability test examines, from the perspective of an end user, aspects such as the consistency, friendliness, and ease of use of the system interface style;
the usability test includes ease of installation: how difficult installation is and whether it follows a conventional installation procedure;
user interface friendliness: how simple the interface is and how well it matches the business process;
ease of learning: how difficult the system is to learn and use for ordinary operators, and what requirements it places on operators;
ease of operation: how difficult operations are, and whether shortcuts are provided for the main or frequently used functions;
consistency of interface style across modules: whether the interface style and operations are consistent across modules;
richness of online help: the accuracy and comprehensiveness of the online help, and how conveniently it can be consulted during key operations.
The compatibility test refers to the degree to which the software product is compatible with the relevant test environments;
the compatibility test includes:
hardware compatibility: compatibility of the system with the test environment hardware.
Platform compatibility: compatibility of the system with the operating system and the database system.
Software compatibility: compatibility of the system with other application software, such as antivirus software.
Data compatibility: whether the system meets the data standards in the user manual and, on the premise of guaranteeing system and data security, can conveniently exchange and share data with other systems and supports the data formats specified in the user manual.
The extensibility test refers to the ability to extend system functions, including at least adaptability to changing user requirements;
the extensibility test includes:
software extension capability: whether the system supports parameterized configuration of business processes and recombination and updating of business functions, so that new business can be added flexibly without affecting the original business processes.
Business scale extension capability: whether the system can handle more institutions and more users.
System upgrading: whether the system provides a means of upgrading.
System maintenance: whether the system provides effective means of maintenance.
The user document inspection focuses on the completeness of the submitted documents and their conformity with the actual system.
Documents, as an important part of software project management, should be an important item of project acceptance. The technical documents provided by the system integrator and the software developer should be consistent with the delivered software and hardware, and should describe the whole process of system construction, such as system design and development and the installation, use, and maintenance of software and hardware equipment, comprehensively, completely, and in detail. Inspection of the design documents is a key point in accepting the technical architecture, functional structure, and other aspects of the system.
In system document acceptance, the documents are tested mainly in terms of basic requirements, completeness, consistency, traceability, and comprehensibility.
Further, the selecting of a test method and the designing of test code and test cases further includes:
the functional test methods include the equivalence class partitioning method, the boundary value analysis method, and the error guessing method;
the equivalence class partitioning method partitions the business processes into equivalence classes; the test cases are applied to a minimal set covering the main business process and its main branches, so that all decision branches are covered and the equivalent functional test is completed while the processes are covered.
The equivalence class partitioning design method divides all possible input data, i.e. the input domain of the program, into several parts (subsets), and then selects a small amount of representative data from each subset as test cases. An equivalence class is a subset of the input domain in which each input datum is equivalent for revealing an error in the program, under the reasonable assumption that testing one representative value of an equivalence class is equivalent to testing the other values of that class.
Equivalence classes fall into two different kinds: valid equivalence classes and invalid equivalence classes. Both kinds are considered in the design, because the software must not only accept reasonable data but also withstand unexpected input. A valid equivalence class is a set of meaningful input data that is reasonable with respect to the program specification; it is used to verify that the program implements the functions and capabilities specified in the specification. An invalid equivalence class is defined in opposition to a valid equivalence class.
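A minimal sketch of equivalence class partitioning follows. The input field and its valid range are hypothetical illustrations, not taken from the system under test: for a field accepting integer ages in [0, 120], one representative value is chosen from the valid class and from each invalid class.

```python
def validate_age(age):
    """Stand-in for the system under test: accepts integer ages in [0, 120]."""
    return isinstance(age, int) and 0 <= age <= 120

# One representative value per equivalence class.
valid_class = [35]                 # valid class: integers in [0, 120]
invalid_classes = [-5, 200, "ab"]  # invalid classes: below range, above range, non-integer

for value in valid_class:
    assert validate_age(value)     # valid representatives must be accepted
for value in invalid_classes:
    assert not validate_age(value) # invalid representatives must be rejected
```

Four cases thus stand in for the entire input domain, which is the point of the method: one representative per class instead of exhaustive input enumeration.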
The boundary value analysis method designs tests for the boundary and extreme values of the input and output domains in the functional specification during functional testing.
When designing test cases with boundary value analysis, first determine the boundary conditions; usually the boundaries of the input and output equivalence classes are the boundary conditions that should be tested most heavily. Second, choose values exactly on, just above, or just below the boundary as test data, rather than typical or arbitrary values within the equivalence class.
The principles for selecting test cases based on boundary value analysis are as follows:
1) If an input condition specifies a range of values, take values exactly on the boundary of the range and values just beyond the boundary as test input data.
2) If an input condition specifies a number of values, take the maximum number, the minimum number, one less than the minimum, and one more than the maximum as test input data.
3) Apply rule 1 above to each output condition specified by the specification.
4) Apply rule 2 above to each output condition specified by the specification.
5) If an input or output domain given by the program specification is an ordered set, select the first and last elements of the set as test case data.
6) If an internal data structure is used in the program, select the values on the boundaries of this internal data structure as test cases.
7) Analyze the specification to find other possible boundary conditions.
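Rules 1 and 2 above can be sketched as a small generator of boundary test inputs. This is a generic illustration; the example range [1, 100] and count limits are hypothetical, not taken from any specification in this document.

```python
def boundary_values(lo, hi, step=1):
    """Rule 1: for a range [lo, hi], test values on the boundary
    and just beyond it on each side."""
    return [lo - step, lo, lo + step, hi - step, hi, hi + step]

def count_boundaries(min_n, max_n):
    """Rule 2: for 'a number of values', test the minimum, the maximum,
    one less than the minimum, and one more than the maximum."""
    return [min_n - 1, min_n, max_n, max_n + 1]

# Hypothetical field accepting values 1..100, and a list of 1..8 items.
range_inputs = boundary_values(1, 100)   # [0, 1, 2, 99, 100, 101]
count_inputs = count_boundaries(1, 8)    # [0, 1, 8, 9]
```

Each generated value becomes one test input; per rules 3 and 4, the same generators apply to output conditions.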
The error guessing method uses reverse thinking, combining previous testing experience with intuition about the various errors that may exist in the software's functions and flows, to design targeted test cases for fault tolerance testing. All possible errors in the program and the special cases prone to error are enumerated, and test cases are selected accordingly. For example: input data or output data being 0.
The performance test methods include single-user performance testing and concurrent performance testing;
single-user performance testing mainly measures the efficiency of common operations during normal single-user operation and focuses on obtaining the real response time of those operations.
Concurrent performance testing examines the performance of the system in a realistic environment; it is a load test of the system under test and is the most basic content of performance testing.
Obtaining the maximum load the system can bear: the system stress test gradually increases the system load to observe how system performance changes, finally determines under what load the system reaches a failure state, and records the maximum number of concurrent users the system can bear at that point.
The main purpose of the system stress test is to detect the maximum load the system can bear. Before performing the stress test, the failure state of the system must first be defined, for example the maximum response time the user can tolerate and the usage limits of system resources, such as the CPU usage limit.
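The ramp-up procedure described above can be sketched as follows. This is a generic simulation, not LoadRunner and not the platform under test: a hypothetical service whose response time grows with concurrency is driven by an increasing number of simulated users until the predefined failure condition (maximum tolerable response time) is exceeded, and the last passing load level is reported.

```python
from concurrent.futures import ThreadPoolExecutor
import time

MAX_TOLERABLE_RESPONSE_S = 0.05  # the predefined failure condition

def fake_service(concurrent_users):
    """Hypothetical system under test: latency grows linearly with load."""
    latency = 0.001 * concurrent_users
    time.sleep(latency)
    return latency  # the programmed latency, used as the 'measured' response time

def max_supported_users(start=1, step=5, limit=100):
    """Increase concurrent users until the average response time exceeds
    the failure threshold; return the last load level that still passed."""
    last_ok = 0
    for users in range(start, limit + 1, step):
        with ThreadPoolExecutor(max_workers=users) as pool:
            latencies = list(pool.map(fake_service, [users] * users))
        if sum(latencies) / len(latencies) > MAX_TOLERABLE_RESPONSE_S:
            return last_ok
        last_ok = users
    return last_ok

capacity = max_supported_users()  # 46 for this simulated service
```

With the simulated latency of 1 ms per concurrent user, the 0.05 s threshold is first exceeded at 51 users, so the recorded maximum is the previous level, 46.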
Fault diagnosis is a further check performed when the overall performance of the system is poor. When the system response time is too long, it must be determined which factors cause the problem: whether the database server responds slowly or the application server has a problem, and, if the database server responds slowly, whether this is caused directly by SQL statements or by improper configuration of database parameters. Such deeper questions are the problems fault diagnosis is meant to solve.
The security and reliability test method is combined with the functional test; testing of the data backup and recovery means, the audit trail function, and system security can be completed in combination with the functional test;
the test methods for user document inspection include: document standard compliance checking, in which the test group checks the standardization of the documents against national standards, industry conventions, and the owner's requirements;
expert evaluation: relevant experts are organized to examine the documents and to evaluate their content and organization;
test verification: the documents are verified through testing to check whether they are consistent with the software.
Further, the selecting of a test method and the designing of test code and test cases further includes:
according to the test method, case design can be divided into case design for black-box testing and case design for white-box testing. Black-box testing is adopted in this test, and the methods applied in test case design include: equivalence class partitioning (valid and invalid equivalence classes), cause-effect graphing, boundary value analysis, and the error guessing method.
Equivalence class partitioning: an equivalence class is a set within the input domain in which each input is equivalent for revealing a defect. The input domain of the program is divided into several parts, and a few representative data are then selected from each part as test cases; this is the equivalence class partitioning method. It is the basic method of functional testing. Equivalence classes fall into two different kinds:
valid equivalence classes: sets of meaningful input data that are reasonable with respect to the program specification.
Invalid equivalence classes: sets of meaningless input data that are unreasonable with respect to the program specification.
When designing test cases, both valid and invalid equivalence classes are considered.
Cause-effect graph: a cause-effect graph is a formal language translated from a specification written in natural language; it is in fact a digital logic diagram using simplified notation. The cause-effect graph method helps one systematically select a set of efficient test cases and, in addition, can point out incompleteness and ambiguity in the program specification. The cause-effect graph method ultimately generates a decision table and is suitable for checking the various combinations of program input conditions.
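The decision table that a cause-effect graph ultimately produces can be sketched directly in code. The causes and effects below are hypothetical examples (a login check), not drawn from the system under test: every combination of input conditions is enumerated, each is mapped to its expected action, and each row of the resulting table becomes one test case.

```python
from itertools import product

def expected_action(valid_user, valid_password, account_locked):
    """Hypothetical effect: the action the system should take
    for a given combination of causes."""
    if account_locked:
        return "reject: locked"
    if valid_user and valid_password:
        return "grant access"
    return "reject: bad credentials"

# Decision table: one row per combination of the three causes.
decision_table = [
    (causes, expected_action(*causes))
    for causes in product([True, False], repeat=3)
]
# Each entry pairs the inputs with the expected output, i.e. one test case.
```

Enumerating all 2^3 = 8 combinations is exactly the "various combinations of program input conditions" check the method is suited for; in practice, impossible combinations identified in the graph are pruned from the table.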
Boundary value analysis: practice shows that errors tend to occur near the boundaries of the input and output domains of software, and boundary value analysis is a functional test method that selects test cases with boundary conditions in mind. Boundary conditions are the states just on, just above, and just below the boundaries of the input and output equivalence classes. Boundary value analysis is an effective complement to equivalence class partitioning.
Error guessing method: by inferring, from experience and intuition, the various errors that may exist in the program, test cases that check for those errors can be written in a targeted way. This is the error guessing method. Its basic idea is to enumerate all possible errors in the program and the special cases prone to error, and to select test cases accordingly.
In the case design process, the test items are divided by function; every function of the system must be covered, and the case level must be determined during case design.
Use case level definition:
the level indicates the importance of a use case. The importance of a use case does not correspond to the severity of the consequence it may reveal but to how fundamental the use case is; a use case that may cause a crash is not necessarily high level, since its trigger conditions may be quite uncommon. Test cases are divided into 5 levels:
level 1: basically. And when the test system is used for performing basic functional test on the system, the number of the use cases must be controlled. The system pretest cases, i.e. cases of this level, which are required to be done before the system test, must all pass before the complete system test is performed. The number of level 1 use cases should be controlled.
The method specifically comprises the following steps: basic function, main business process (main case of main path), search result … … without any input of search condition
Such as: basic addition, deletion, modification and printing (all items are input, only necessary items are input and all normal values are obtained); user rights (correct user name with password login, operation within user rights range)
Level 2: important.
Specifically: main auxiliary test cases related to business processes, test cases of branch paths, search results when each individual search condition (normal value) is entered (pure search functions), user permissions (logging in with a wrong user name or password), etc.
Level 3: detailed (1).
Specifically: interface checks (length, special symbols, required items empty or blank, etc.), user permissions (specially crafted user name or password input), fault tolerance, search results when each individual search condition (normal value) is entered (non-pure search functions), search results for each search condition with special values (blank, special symbols, wrong values), page turning (search functions), etc.
Level 4: detailed (2).
Specifically: performance testing (e.g. page response time, database performance), stress testing (e.g. access volume, multiple data transfers), limits (large capacity), database transactions, line interruption (e.g. connections between databases, web servers, and other systems), concurrency, long-duration operation, fault tolerance, cross-platform operation (e.g. operating systems, IE versions), security (e.g. user-level access control, page-level security control), configuration and installation, system data initialization, etc.
Level 5: rarely used. These cases correspond to rarely used precondition and data settings (for example, some conditions require modifying data manually and rarely occur under normal circumstances). Even though some such test cases find rather serious errors, their trigger conditions are very special, so they should still be placed at level 5. Tests related to user interface optimization and the like can also be classified as level 5 cases (such as operability, page layout, ease of use, polish, and textual errors).
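The five levels above can be represented as simple data so that, for example, the level-1 pretest subset is selected automatically before the complete system test. This is a generic sketch; the case names below are hypothetical, not the actual cases of the platform under test.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    level: int  # 1 = basic ... 5 = rarely used

cases = [
    TestCase("login with correct user name and password", 1),
    TestCase("search with one condition (normal value)", 2),
    TestCase("interface length / special symbol check", 3),
    TestCase("page response time under load", 4),
    TestCase("manually modified data precondition", 5),
]

def pretest_suite(all_cases):
    """Level-1 cases must all pass before the complete system test runs."""
    return [c for c in all_cases if c.level == 1]
```

Run order then follows the levels: the `pretest_suite` selection first, then levels 2 through 5 as the test schedule allows.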
Further, the evaluating of the test results and issuing of the test report of the workflow-based distributed heterogeneous processing system further includes: the problem severity of the test results is divided into three levels, specifically serious problems, general problems, and suggested improvements.
TABLE 6 Classification criteria for problem severity
(Table provided as an image in the original publication; content not reproduced.)
Further, the evaluating of the test results and issuing of the test report of the workflow-based distributed heterogeneous processing system further includes:
the test report indicates whether the test passed, substantially passed, or failed.
Further, the workflow management comprises at least: a workflow definition service, interfaces between the workflow system and application systems, and workflow management and monitoring.
TABLE 7 Test report evaluation table
(Table provided as an image in the original publication; content not reproduced.)
The test results of the system include at least the test results of the system homepage, the collaborative office system, the educational administration system, the student administration system, the scientific research administration system, the logistics administration system, the researcher administration system, the general service infrastructure administration system, and the short message platform.
TABLE 8 Partial test results of the student management system
(Table provided as an image in the original publication; content not reproduced.)
For the system under test, the performance test is divided into scenarios, including a scenario summary, response time tests, and system resource tests.
TABLE 9 Overview of test scenarios for the student management system
(Table provided as an image in the original publication; content not reproduced.)
TABLE 10 Student management system test response times
(Table provided as an image in the original publication; content not reproduced.)
Referring to figs. 3a-3f, the usage of application server hardware resources at the given numbers of concurrent users of the smart campus platform project in western Jiangxi province is shown schematically, including: application server CPU resource consumption, application server physical memory resource consumption, application server network resource consumption, database server CPU resource consumption, database server physical memory resource consumption, and database server network resource consumption.
It should be noted that, in the apparatus embodiments of the present application, each unit/module is a logical unit/module; physically, a logical unit/module may be one physical unit/module, a part of one physical unit/module, or a combination of several physical units/modules. The physical implementation of the logical units/modules themselves is not what matters most; the combination of functions implemented by these logical units/modules is the key to solving the technical problem addressed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above apparatus embodiments do not introduce units/modules that are less closely related to solving the technical problem addressed by the present application, which does not mean that no other units/modules exist in the above apparatus embodiments.
It is noted that, in the examples and descriptions of the present application, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, defining an element with "comprises a" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (10)

1. A method for testing a workflow-based distributed heterogeneous processing system, characterized by comprising the following steps:
building a test environment and selecting a test tool;
defining the requirements and test contents for networking data of a test networking environment;
selecting a test method, and designing a test code and a test case;
evaluating a test result, and issuing a test report of the distributed heterogeneous processing system based on the workflow;
the test contents comprise system function, performance, security and reliability, usability, compatibility, extensibility, resource occupancy, and user documents.
2. The method of claim 1, wherein the workflow-based distributed heterogeneous processing system comprises:
an SOA system architecture is adopted, wherein the distributed heterogeneous processing system comprises at least a display layer, a supporting platform layer, an application layer, a data layer, a framework layer, a data interface layer, and a network layer;
the supporting platform layer comprises at least an application service layer, a basic service layer, and a business development layer;
the data interface layer fully opens all platform management interfaces and application interfaces, reserves interfaces for interconnection with existing service systems, can implement a service approval process, and comprises new function modules for public development, including at least a web site, an application program, and a web script;
the distributed heterogeneous processing system adopts component management and workflow management to realize graphical flow definition and monitoring functions, a front-end task management function, and a workflow execution service;
the distributed heterogeneous processing system adopts a workflow engine supporting at least sequential, parallel, selective, and repeated-execution workflow modes, realizes flexible configuration of various flows, can graphically maintain workflow control data and workflow-related data, realizes monitoring and management of various workflows, and meets special document-processing requirements;
the data interface layer is built on current mainstream servers and is used to install and deploy a mainstream operating system, database, and middleware;
the data layer comprises a database designed as a standard SQL database, using a database system that supports high concurrency and large data volumes.
3. The method for testing the workflow-based distributed heterogeneous processing system according to claim 1, wherein the building of the test environment and the selection of the test tool specifically include:
the test group communicates detailed test requirements, determines the specific test requirements, deploys the test machines, installs the test tools and software, and configures or confirms the test environment;
the environment requirements are as follows: application server: Windows Server 2008 SP1 Enterprise edition or above;
database server: ORACLE 11G;
client: an IE browser;
user login is prepared by manually modifying the database, so the relevant test data must be prepared in advance;
data import requires a prepared data source, so the test data must be prepared in advance;
non-functional tests, especially performance tests, must be performed after the functional tests are completed;
the test tool is LoadRunner, which executes the performance tests: response speed under a predetermined environment and load, especially under heavy load and high concurrency;
if other test tools are needed, they can be adjusted and added according to the test situation.
4. The method of claim 1,
the determining of the requirements for networking data of the test networking environment and the test contents further comprises:
the functional test tests the functions of the basic application platform mainly against the system's requirement specification and user manual, and tests the interfaces between the basic application platform and other service systems;
the performance test mainly tests the system against the system's requirement specification, user manual, and the agreed main performance indicators, and examines the system's resource usage while the performance test is performed;
the security and reliability test mainly examines the access security of the application system and the security of the application software, and also tests the continuous stability of the system during operation, including its fault tolerance and its ability to protect data.
5. The method of claim 1,
the determining of the requirements for networking data of the test networking environment and the test contents further comprises:
the usability test examines, from the perspective of an end user, aspects such as the consistency, friendliness, and ease of use of the system interface style;
the compatibility test refers to the degree to which the software product is compatible with the relevant test environments;
the extensibility test refers to the ability to extend system functions, including at least adaptability to changing user requirements;
the user document inspection focuses on the completeness of the submitted documents and their conformity with the actual system.
6. The method of claim 1,
the selecting of a test method and the designing of test code and test cases further comprises:
the functional test methods comprise the equivalence class partitioning method, the boundary value analysis method, and the error guessing method;
the performance test methods comprise single-user performance testing and concurrent performance testing;
the security and reliability test method is combined with the functional test; testing of the data backup and recovery means, the audit trail function, and system security can be completed in combination with the functional test;
the test methods for user document inspection comprise: document standard compliance checking, in which the test group checks the standardization of the documents against national standards, industry conventions, and the owner's requirements;
expert evaluation: relevant experts are organized to examine the documents and to evaluate their content and organization;
test verification: the documents are verified through testing to check whether they are consistent with the software.
7. The method of claim 1,
the selecting of a test method and the designing of test code and test cases further comprises:
according to the test method, case design can be divided into case design for black-box testing and case design for white-box testing; black-box testing is adopted in this test, and the methods applied in test case design comprise: equivalence class partitioning (valid and invalid equivalence classes), cause-effect graphing, boundary value analysis, and the error guessing method.
8. The method of claim 1,
the evaluating of the test results and issuing of the test report of the workflow-based distributed heterogeneous processing system further comprises:
the problem severity of the test results is divided into three levels, specifically serious problems, general problems, and suggested improvements.
9. The method of claim 1,
the evaluating of the test results and issuing of the test report of the workflow-based distributed heterogeneous processing system further comprises:
the test report indicates whether the test passed, substantially passed, or failed.
10. The method of claim 1,
the workflow management comprises at least: a workflow definition service, interfaces between the workflow system and application systems, and workflow management and monitoring.
CN202110878589.5A 2021-08-02 2021-08-02 Workflow-based testing method for distributed heterogeneous processing system Pending CN113835999A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110878589.5A CN113835999A (en) 2021-08-02 2021-08-02 Workflow-based testing method for distributed heterogeneous processing system


Publications (1)

Publication Number Publication Date
CN113835999A true CN113835999A (en) 2021-12-24

Family

ID=78963125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110878589.5A Pending CN113835999A (en) 2021-08-02 2021-08-02 Workflow-based testing method for distributed heterogeneous processing system

Country Status (1)

Country Link
CN (1) CN113835999A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114691503A (en) * 2022-03-22 2022-07-01 航天中认软件测评科技(北京)有限责任公司 Test-oriented management method, device, equipment and medium
CN114691503B (en) * 2022-03-22 2022-09-13 航天中认软件测评科技(北京)有限责任公司 Test-oriented management method, device, equipment and medium

Similar Documents

Publication Publication Date Title
Attariyan et al. Automating configuration troubleshooting with dynamic information flow analysis
Manadhata et al. An attack surface metric
Theisen et al. Approximating attack surfaces with stack traces
US8904319B2 (en) Method and apparatus for merging EDA coverage logs of coverage data
CN111563016B (en) Log collection and analysis method and device, computer system and readable storage medium
Shatnawi Deriving metrics thresholds using log transformation
TW201405306A (en) System and method for automatically generating software test cases
Vieira et al. Resilience benchmarking
CN104657255A (en) Computer-implemented method and system for monitoring information technology systems
US10140403B2 (en) Managing model checks of sequential designs
US11636028B2 (en) Stress test impact isolation and mapping
Liao et al. Using black-box performance models to detect performance regressions under varying workloads: an empirical study
Liu Research of performance test technology for big data applications
Syer et al. Identifying performance deviations in thread pools
US20210286710A1 (en) System testing infrastructure for analyzing soft failures in active environment
CN111966587A (en) Data acquisition method, device and equipment
CN102014163B (en) Cloud storage test method and system based on transaction driving
US9430595B2 (en) Managing model checks of sequential designs
CN113835999A (en) Workflow-based testing method for distributed heterogeneous processing system
US11609842B2 (en) System testing infrastructure for analyzing and preventing soft failure in active environment
US20210286713A1 (en) System testing infrastructure using combinatorics
CN113127445A (en) Domestic substitution migration method of foreign technology application system
Xu et al. Real-Time Diagnosis of Configuration Errors for Software of AI Server Infrastructure
US11656974B2 (en) Enhanced performance diagnosis in a network computing environment
CN111782557B (en) Method and system for testing web application permission

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination