WO2010118472A1 - System and method for automated skills assessment - Google Patents

System and method for automated skills assessment

Info

Publication number
WO2010118472A1
Authority
WO
WIPO (PCT)
Prior art keywords
software
tester
defects
assessment
testing
Prior art date
Application number
PCT/AU2010/000425
Other languages
French (fr)
Inventor
Sophie Hiotis
Original Assignee
Stsa Australia Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Stsa Australia Pty Ltd filed Critical Stsa Australia Pty Ltd
Publication of WO2010118472A1 publication Critical patent/WO2010118472A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3664Environments for testing or debugging software
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Abstract

A system is provided to assess a software tester. The system comprises a central device to store at least one assessment software module containing known defects, and a test platform to enable a software tester to test the assessment software module, input defects identified by the software tester, and communicate said inputs to the central device. The central device is further operable to compare the identified defects to the known defects, and determine from the comparison a measure of the skill of the software tester. A method to assess the testing skills of a software tester, and a computer readable medium which when operated by a computer is operable to facilitate assessment of a software tester, are further described.

Description

"System and method for automated skills assessment"
Cross-Reference to Related Applications
The present application claims priority from United States Provisional Patent Application US 61/170,478, the content of which is incorporated herein by reference.
Technical Field
The present invention relates to software testing, and in particular relates to a system and method for measuring and assessing the skills, efficiency and effectiveness of individuals involved in software testing.
Background of the Invention
In 1996 it was estimated that software testing cost organizations approximately US$23 billion worldwide, while during the same period defects in implemented software cost organizations US$58 billion annually. In 2002 the US National Institute of Standards and Technology estimated that software errors cost the US economy alone US$59.9 billion.
Software testing comprises three fundamental elements: people, processes and tools. Processes have evolved with the implementation of "fit for purpose" methodologies and the introduction of new processes and testing tools, such as end-to-end test management tools and automation. The introduction of certification for testers has enabled individuals to be assessed with regard to their understanding of testing processes and terminology; however, such certification does not assess the practical application of testing techniques. Similarly, ISO 9126 is an international standard for the evaluation of software quality, but it provides no means of evaluating the ability of a software tester. Thus, while certification programs exist to support the professional aspirations of software testers and quality assurance specialists, no certification program actually requires the applicant to demonstrate the ability to test software. Moreover, no certification is based on a widely accepted body of knowledge. Certification itself therefore cannot measure an individual's productivity, skill, or practical knowledge as a tester.
Any discussion of documents, acts, materials, devices, articles or the like which has been included in the present specification is solely for the purpose of providing a context for the present invention. It is not to be taken as an admission that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present invention as it existed before the priority date of each claim of this application.
Throughout this specification the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not the exclusion of any other element, integer or step, or group of elements, integers or steps.
Summary of the Invention
According to a first aspect the present invention provides a method for assessing testing skills of a software tester, the method comprising: presenting an assessment software module to the software tester, the assessment software module containing known defects; receiving input from the software tester comprising identified defects identified by the software tester; and comparing the identified defects to the known defects and determining from the comparison a measure of the skill of the software tester.
According to a second aspect the present invention provides a computer readable medium which when operated by a computer is operable to facilitate assessment of a software tester, the computer readable medium comprising: a record of an assessment software module containing known defects; computer program code means for presenting the assessment software module to a software tester for testing; computer program code means for receiving input from the software tester comprising identified defects identified by the software tester; and computer program code means for comparing the identified defects to the known defects, and for determining from the comparison a measure of the skill of the software tester.
According to a third aspect the present invention provides a system for assessment of a software tester, the system comprising: a central device storing at least one assessment software module containing known defects; a test platform enabling a software tester to test the assessment software module and input identified defects identified by the software tester and communicate said inputs to the central device; wherein the central device is further operable to compare the identified defects to the known defects, and determine from the comparison a measure of the skill of the software tester.
According to a fourth aspect the present invention provides a computer program product comprising computer program code means to make a computer execute a procedure for assessing testing skills of a software tester, the computer program product comprising: computer program code means for presenting an assessment software module to the software tester, the assessment software module containing known defects; computer program code means for receiving input from the software tester comprising identified defects identified by the software tester; and computer program code means for comparing the identified defects to the known defects and determining from the comparison a measure of the skill of the software tester.
According to a fifth aspect the present invention provides a method of allocating human testing resources to a software testing project, the method comprising: assessing each software tester by a method in accordance with the first aspect of the invention; and fulfilling resource requirements of the software testing project by reference to the determined measure of the skill of each software tester.
The assessment software module in accordance with any one of the above aspects may be a stand-alone application, or may be for use as part of a broader software application. Preferably, a plurality of assessment software modules are prepared and stored ready for use in measuring the skill of a software tester, each assessment software module containing a unique set of known defects, in order to minimise the chance of an assessed software tester forewarning subsequent candidates of the known defects. The plurality of assessment software modules may be derived from a single perfected module by introducing unique defects to produce each assessment software module. Alternatively, each assessment software module may be derived from a wholly different perfected software module. Preferably, the known defects comprise both critical defects and non-critical defects. The measure of skill determined for the software tester is preferably more heavily influenced by the ability of the tester to identify critical defects. For example, the tester may receive two points for each critical defect identified and one point for each non-critical defect identified. In further preferred embodiments, a time taken for the tester to identify each defect is noted and the measure of skill is increased for more rapid defect identification. That is, the measure of skill may comprise a measure of efficiency and/or accuracy and/or other testing skill. The measure of accuracy may be calculated according to the combination of correctly identified defects and incorrectly identified defects (false positives), with the latter reducing the measure of accuracy. Additionally or alternatively, a second measure indicative of the efficiency of the tester may be derived from the elapsed time.
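Purely as an illustration of the scoring scheme described above, such a measure might be sketched as follows. The two-point/one-point weighting comes from the text; the one-point false-positive penalty, the time-bonus formula and all names (skill_measure, KnownDefect, the time limit) are assumptions for this sketch, not the patent's prescribed implementation.

```python
from dataclasses import dataclass

@dataclass
class KnownDefect:
    item_id: int
    critical: bool  # True for critical defects, False for non-critical

def skill_measure(known, identified, elapsed_minutes, time_limit_minutes=120):
    """Hypothetical scoring: 2 points per critical defect found, 1 per
    non-critical, minus 1 per false positive, plus a time-based bonus."""
    known_by_id = {d.item_id: d for d in known}
    points = 0
    for item_id in identified:
        defect = known_by_id.get(item_id)
        if defect is None:
            points -= 1            # false positive reduces the accuracy measure
        elif defect.critical:
            points += 2            # critical defects weighted more heavily
        else:
            points += 1
    # Separate efficiency measure derived from elapsed time, as the text suggests.
    efficiency = max(0.0, 1.0 - elapsed_minutes / time_limit_minutes)
    return points, efficiency

known = [KnownDefect(8, critical=True), KnownDefect(35, critical=False)]
print(skill_measure(known, identified={8, 25, 35}, elapsed_minutes=90))
# -> (2, 0.25): items 8 and 35 score 2+1, item 25 is a false positive (-1)
```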
The known defects may comprise defects identifiable by any or all of a range of testing methods, such as equivalence partitioning, boundary value analysis, all-pairs testing, fuzz testing, model-based testing, traceability matrix testing, exploratory testing, specification-based testing, API testing, code coverage, fault injection methods, mutation testing methods and static testing.
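To illustrate how a seeded defect might target one of these techniques, consider a hypothetical age-validation routine whose requirement is to accept values 18 to 65 inclusive, but which contains an off-by-one defect that boundary value analysis exposes. The routine, the requirement and the seeded defect below are invented for illustration; the patent does not disclose any particular assessment module content.

```python
def accept_age(age: int) -> bool:
    # Requirement: accept ages 18 to 65 inclusive.
    # Seeded defect: '>' should be '>=', so the lower boundary is rejected.
    return age > 18 and age <= 65

# Boundary value analysis tests values at and around each partition edge.
boundary_cases = [17, 18, 19, 64, 65, 66]
expected = [False, True, True, True, True, False]
for age, want in zip(boundary_cases, expected):
    got = accept_age(age)
    if got != want:
        print(f"defect found at boundary {age}: expected {want}, got {got}")
# -> defect found at boundary 18: expected True, got False
```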
The central device may be further operable to output or display the measure of skill of the software tester. The system may comprise a display device, coupled to said central device, which displays data indicative of the measure of the skill of the software tester.
In accordance with any one of the above aspects, or embodiments thereof, the software tester may be provided with an assessment requirements document with reference to which the assessment software module is to be tested. The assessment requirements document may comprise assessment business requirements and/or assessment technical requirements.
Preferably, any given software tester is again tested at a subsequent time in order to objectively track the skills development of that tester over time.
It is to be appreciated that a "tester" as referred to herein may include a Quality Assurance officer, a Business Analyst, a software developer, a software tester, or any person performing test analysis activities, inclusive of execution and defect reporting.
Brief Description of the Drawings
An example of the invention will now be described with reference to the accompanying drawings, in which:
Figure 1 schematically illustrates testing of a software tester in accordance with one embodiment of the invention;
Figures 2a and 2b illustrate a schedule which accompanies an assessment software module;
Figure 3 schematically illustrates a table displayed on a user interface into which the software tester is able to indicate identified defects;
Figure 4 schematically illustrates testing resource allocation in accordance with the described embodiment of the invention; and
Figure 5 illustrates a general-purpose computing device that may be used in an exemplary system for implementing the invention.
Description of the Preferred Embodiments
Figure 1 schematically illustrates testing of a software tester in accordance with one embodiment of the invention. A plurality of assessment software modules 102 are held ready for testing, each containing known defects. The defects are known to the assessment database 106 but not to the software tester. When testing a software tester, one of the assessment software modules is delivered to a results interface and testing platform 104 for the software tester to undertake testing upon the assessment software module. A schedule (otherwise referred to as assessment business requirements), a portion of which is illustrated in Figures 2a and 2b, accompanies a specified assessment software module. The software tester is assessed as to their ability to identify defects in the application with regard to the requirements itemised in the schedule. Defects discovered by the software tester during the test are entered into the results interface and passed to an assessment database 106. Figure 3 schematically illustrates a table displayed on a user interface into which the software tester is able to indicate defects which correspond to items in the itemised schedule. As illustrated, in this hypothetical example, the software tester has indicated that item 8: "The Password field is displayed as 'Password' in the Login screen", item 25: "When the user successfully logs into the STSA application, the Add Customer screen is displayed" and item 35: "Field validation occurs when user selects 'Add Customer'" each contain defects 305.
The identified defects 305 input by the software tester are compared to the known defects stored in the assessment database 106. Based on this comparison, a measure of the skill of the software tester is generated in an automated manner by the assessment database. In this example, the software tester has correctly identified items 8 and 35 as defects, but has incorrectly identified item 25 as a defect; incorrectly identifying item 25 reduces the generated measure of skill for that software tester. The assessment database further compares the results input by the software tester against baseline data from other previously tested individuals who share a number of common variables, such as years of experience and time taken to complete the test. A report can also be generated at 108 from the defects identified by the tested software tester, and may for example provide an individual ranking within the testing team, the ranking being based on skill set, time taken and number of years performing test analysis. Preferably, any given software tester is tested again at a subsequent time in order to objectively track the skills development of that tester over time.
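The baseline comparison described above could, purely as a sketch, look like the following: the tester's score is ranked against previously assessed individuals sharing common variables such as years of experience. The record structure, the percentile calculation and the experience tolerance are all assumptions for illustration, not the method the patent specifies.

```python
def percentile_rank(score, baseline_records, years_experience, tolerance=1):
    """Rank a tester's score against prior candidates with similar
    experience (within `tolerance` years). Returns a 0-100 percentile."""
    cohort = [r["score"] for r in baseline_records
              if abs(r["years_experience"] - years_experience) <= tolerance]
    if not cohort:
        return None  # no comparable baseline data available
    at_or_below = sum(1 for s in cohort if s <= score)
    return 100.0 * at_or_below / len(cohort)

baseline = [
    {"score": 2, "years_experience": 3},
    {"score": 5, "years_experience": 4},
    {"score": 7, "years_experience": 3},
]
print(percentile_rank(score=5, baseline_records=baseline, years_experience=3))
# -> 66.66...: the tester scored at or above two of the three comparable candidates
```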
The individual test results of a tester within a single team in turn may be considered together with test results of other software testers within the team in order to report on the testing skill of the team. Comparative reports between teams may also be generated. In turn the results of a plurality of testing teams may be combined in order to report on the testing skill of a whole department or organisation.
The present embodiment of the invention is thus advantageous in enabling the assessment of the practical application of test techniques. Moreover, by objectively testing the practical skill of a tester, the present invention facilitates the identification of future development/training requirements of the individual in order to improve skills which are shown to be lacking by the test results. Still further, the present invention is of value in informing testing coordinators of potential failings in the skill sets of software testers on a testing project, enabling the coordinator to recruit other testers to import such skills and/or otherwise mitigate the risks of a software project in which known skills failings are present. The present embodiment of the invention is further advantageous in permitting largely automated testing of the skill of software testers, thus facilitating testing of a large number of software testers in an efficient, reliable and benchmarked manner.
The present invention recognises that currently there is no standardised measure of the efficiency or accuracy of a testing team or an individual software tester, nor is there any way of determining at an organizational level what return on investment is yielded when allocating a given testing resource to a software testing project. The present invention further recognises that there has previously been no way to identify and mitigate project test risks for teams based on skill.
Notably, the assessment test of this embodiment has been developed within a controlled environment whereby components such as business domain knowledge and technology domain knowledge are excluded or of minimal relevance. Variables such as impacts of test environments and test data are also divorced from the testing assessment of this embodiment. This embodiment thus beneficially provides a test that solely or at least primarily tests the ability of an individual to apply various testing techniques to identify pre-defined defects.
The assessment software modules 102 are developed so as to contain carefully selected known defects, representing each of the testing techniques discussed in the following and thus testing the tester's ability to apply such techniques. Software requirements provided to the tester as part of the assessment process are also deliberately developed with controlled defects for this purpose. The known defects therefore preferably include defects which can only all be identified by performing each of the following techniques:
White Box Testing:
Basis Path Testing
Flow Control Testing
Conditions Testing
Data Flow Testing
Input Validation Testing
Loop Testing (Decision/Junction)
Black Box Testing:
Equivalence Partitioning
Boundary Value Analysis
Cause and Effect Technique
Range Testing
Equivalence Testing
Risk Based Testing
Input Validation
Syntax Testing
The present embodiment further provides for the known defects in the assessment software modules 102 and in the accompanying assessment requirements document to be of pre-defined severity. The software tester's ability to categorise identified defects is therefore also tested in this embodiment. To this end an assessment functional risk assessment document will be provided as part of the test. In this embodiment the software tester's ability to identify and categorise critical defects is ranked more highly than the ability to identify and categorise defects of lower importance, in the manner indicated by the following table.
[Table of defect severity weightings (image not reproduced in text)]
It is noted that in alternative embodiments the software tester may merely be required to identify defects, without categorising their severity. The party carrying out the assessment may in such embodiments simply give the candidate greater credit for identifying defects of greater severity. This embodiment further provides for reports to be automatically generated at 108, and such reports can be customized to meet individual or organization requirements.
Figure 4 schematically illustrates testing resource allocation in accordance with the described embodiment of the invention. This reporting/planning tool of the present embodiment is an application that enables organizations to manage and plan ongoing testing resource allocation based on internally nominated costs, project risk, resource availability and assessed skill set data, the latter being derived in the manner described previously herein. The resource reporting/planning tool provides a testing organization with team-based models for the purpose of project and departmental planning of resources performing testing activities. This tool beneficially provides the ability for an organization to select the most effective project test team based on availability, cost and skill set, and further provides the ability for an organization to assess the cost of the testing resource allocation against the assessed skill set of the teams to be allocated. An example of such resource allocation follows. In this example a given department having a team of eight software testers is required to resource two projects. Using the method of the present invention the skills of each tester, including effectiveness and efficiency, are obtained. Table 1 sets out the measured skills of the eight testers in the team.
[Table 1: measured skills of the eight testers (image not reproduced in text)]
Table 2 sets out further information relevant to resource allocation planning.
[Table 2: information relevant to resource allocation planning (image not reproduced in text)]
Table 3 sets out characteristics of the two projects requiring resourcing.
[Table 3: characteristics of the two projects requiring resourcing (image not reproduced in text)]
It is to be appreciated that the skills measures set out in Table 1 enable effective resource allocation to the respective projects. Suitable allocation of resources may lead to a proposal of multiple resourcing options for each project in the manner set out in Table 4.
[Table 4: proposed resourcing options for each project (image not reproduced in text)]
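A resource-allocation step of the kind Table 4 summarises might, as a rough sketch, be expressed as a greedy selection over assessed testers. The tester attributes, project requirements and selection rule below are invented for illustration, since the actual tables are reproduced only as images and the patent does not prescribe an allocation algorithm.

```python
def propose_team(testers, required_skill, team_size, daily_budget):
    """Greedy sketch: pick available testers meeting the skill floor,
    cheapest first, up to the team size and daily budget."""
    candidates = [t for t in testers
                  if t["available"] and t["skill"] >= required_skill]
    candidates.sort(key=lambda t: t["daily_rate"])
    team, cost = [], 0
    for t in candidates:
        if len(team) == team_size:
            break
        if cost + t["daily_rate"] <= daily_budget:
            team.append(t["name"])
            cost += t["daily_rate"]
    return team, cost

testers = [
    {"name": "Tester A", "skill": 8, "daily_rate": 700, "available": True},
    {"name": "Tester B", "skill": 6, "daily_rate": 500, "available": True},
    {"name": "Tester C", "skill": 9, "daily_rate": 900, "available": False},
    {"name": "Tester D", "skill": 7, "daily_rate": 600, "available": True},
]
print(propose_team(testers, required_skill=7, team_size=2, daily_budget=1400))
# -> (['Tester D', 'Tester A'], 1300)
```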
As mentioned in the preceding, an organizational rating system is also provided in this embodiment, which rates the skills of all the testers within the organization and compares them to a wider set of data held by the database 106. That is, database 106 is operated by a testing assessment body involved in assessing individuals from multiple organisations, and uses the results from previously assessed organisations for comparison to the currently assessed organisation. The organizational rating system of this embodiment also enables third party organizations to utilize the rating system for commercial purposes, and further provides organizations with the ability to quantify and validate return on investment for third party test teams.
The present embodiment of the invention thus provides a means to measure the performance of an individual's application of testing techniques in a controlled environment. This embodiment further enables an individual's performance to be compared against a wide set of measures obtained for a large number of other testers, enabling finer comparisons to be made which account for individual characteristics such as time taken for the test, career experience, education, etc. This embodiment further enables the testing performance of a team, and even of an organisation, to be measured, enabling organizations to identify and mitigate project test risks through the application of team models.
The present embodiment thus provides a set of automated processes and integrated tools that enable the practical assessment of the skill set of individuals performing test analysis activities against a baseline obtained from a wide group, for the purposes of identifying individual development needs and determining the testing personnel ROI for organizations.
Some portions of this detailed description are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
As such, it will be understood that such acts and operations, which are at times referred to as being computer-executed, include the manipulation by the processing unit of the computer of electrical signals representing data in a structured form. This manipulation transforms the data or maintains it at locations in the memory system of the computer, which reconfigures or otherwise alters the operation of the computer in a manner well understood by those skilled in the art. The data structures where data is maintained are physical locations of the memory that have particular properties defined by the format of the data. However, while the invention is described in the foregoing context, it is not meant to be limiting as those of skill in the art will appreciate that various of the acts and operations described may also be implemented in hardware.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the description, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
The present invention also relates to apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
A machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine-readable medium includes read only memory ("ROM"); random access memory ("RAM"); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); etc.
Turning to Figure 5, the invention is illustrated as being implemented in a suitable computing environment. Although not required, the invention will be described in the general context of computer-executable instructions, such as program modules, being executed by a personal computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. The invention may be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
In Figure 5 a general purpose computing device is shown in the form of a conventional personal computer 20, including a processing unit 21, a system memory 22, and a system bus 23 that couples various system components including the system memory to the processing unit 21. The system bus 23 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory includes read only memory (ROM) 24 and random access memory (RAM) 25. A basic input/output system (BIOS) 26, containing the basic routines that help to transfer information between elements within the personal computer 20, such as during start-up, is stored in ROM 24. The personal computer 20 further includes a hard disk drive 27 for reading from and writing to a hard disk 60, a magnetic disk drive 28 for reading from or writing to a removable magnetic disk 29, and an optical disk drive 30 for reading from or writing to a removable optical disk 31 such as a CD ROM or other optical media.
The hard disk drive 27, magnetic disk drive 28, and optical disk drive 30 are connected to the system bus 23 by a hard disk drive interface 32, a magnetic disk drive interface 33, and an optical disk drive interface 34, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 20. Although the exemplary environment shown employs a hard disk 60, a removable magnetic disk 29, and a removable optical disk 31, it will be appreciated by those skilled in the art that other types of computer readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories, read only memories, storage area networks, and the like may also be used in the exemplary operating environment.
A number of program modules may be stored on the hard disk 60, magnetic disk 29, optical disk 31, ROM 24 or RAM 25, including an operating system 35, one or more applications programs 36, other program modules 37, and program data 38. A user may enter commands and information into the personal computer 20 through input devices such as a keyboard 40 and a pointing device 42. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 21 through a serial port interface 46 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB) or a network interface card. A monitor 47 or other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48. In addition to the monitor, personal computers typically include other peripheral output devices, not shown, such as speakers and printers.
The personal computer 20 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 49. The remote computer 49 may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the personal computer 20, although only a memory storage device 50 has been illustrated. The logical connections depicted include a local area network (LAN) 51 and a wide area network (WAN) 52. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and, inter alia, the Internet.
When used in a LAN networking environment, the personal computer 20 is connected to the local network 51 through a network interface or adapter 53. When used in a WAN networking environment, the personal computer 20 typically includes a modem 54 or other means for establishing communications over the WAN 52. The modem 54, which may be internal or external, is connected to the system bus 23 via the serial port interface 46. In a networked environment, program modules depicted relative to the personal computer 20, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
It will be appreciated by persons skilled in the art that numerous variations and/or modifications may be made to the invention as shown in the specific embodiments without departing from the scope of the invention as broadly described. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive.

Claims

CLAIMS:
1. A system for assessment of a software tester, the system comprising: a central device storing at least one assessment software module containing known defects; a test platform enabling a software tester to test the assessment software module and input identified defects identified by the software tester and communicate said inputs to the central device; wherein the central device is further operable to compare the identified defects to the known defects, and determine from the comparison a measure of the skill of the software tester.
2. A system according to claim 1 wherein the assessment software module is a stand alone application.
3. A system according to claim 1 wherein the assessment software module is incorporated as a part of a broader software application.
4. A system according to any one of the preceding claims wherein the central device stores a plurality of assessment software modules for use in measuring the skill of a software tester and each assessment software module contains a unique set of known defects.
5. A system according to claim 4 wherein the plurality of assessment software modules are derivable from a single perfected module by introducing unique defects to produce each assessment software module.
6. A system according to claim 4 wherein each assessment software module is derivable from a wholly different perfected software module.
7. A system according to any one of the preceding claims wherein the known defects comprise critical defects and non-critical defects.
8. A system according to claim 7, wherein the determined measure of skill for the software tester is more heavily influenced by the ability of said tester to identify critical defects.
9. A method for assessing testing skills of a software tester, the method comprising: presenting an assessment software module to the software tester, the assessment software module containing known defects; receiving input from the software tester comprising identified defects identified by the software tester; and comparing the identified defects to the known defects and determining from the comparison a measure of the skill of the software tester.
10. The method according to claim 9 further comprising producing an assessment module and introducing a unique set of defects into said assessment module.
11. A method according to claim 9 or 10 wherein the known defects comprise critical defects and non-critical defects.
12. A method according to claim 11 further comprising influencing the measure of skill determined for the software tester by the ability of the tester to identify critical defects.
13. A method according to any one of claims 9 to 12 further comprising recording a time taken for the tester to identify each defect and increasing the measure of skill upon a determination of more rapid defect identification.
14. A method according to any one of claims 9 to 13 wherein the measure of skill comprises at least one of a measure of efficiency and a measure of accuracy.
15. A method according to claim 13 or claim 14 when dependent on claim 13 further comprising deriving a second measure indicative of an efficiency of the tester from the recorded time taken.
16. A method according to any one of claims 9 to 15 wherein the known defects comprise defects identifiable by one or more of testing methods including equivalence partitioning, boundary value analysis, all-pairs testing, fuzz testing, model-based testing, traceability matrix testing, exploratory testing, specification-based testing, API testing, code coverage, fault injection methods, mutation testing methods and static testing.
17. A method according to any one of claims 9 to 16 further comprising providing the software tester with an assessment requirements document with reference to which the assessment software module is to be tested.
18. A method according to claim 17 wherein the assessment requirements document comprises assessment business requirements and/or assessment technical requirements.
20. A computer readable medium which when operated by a computer is operable to facilitate assessment of a software tester, the computer readable medium comprising: a record of an assessment software module containing known defects; computer program code means for presenting the assessment software module to a software tester for testing; computer program code means for receiving input from the software tester comprising identified defects identified by the software tester; and computer program code means for comparing the identified defects to the known defects, and for determining from the comparison a measure of the skill of the software tester.
21. A computer program product comprising computer program code means to make a computer execute a procedure for assessing testing skills of a software tester, the computer program product comprising: computer program code means for presenting an assessment software module to the software tester, the assessment software module containing known defects; computer program code means for receiving input from the software tester comprising identified defects identified by the software tester; and computer program code means for comparing the identified defects to the known defects and determining from the comparison a measure of the skill of the software tester.
22. A method of allocating human testing resources to a software testing project, the method comprising: assessing each software tester by a method in accordance with any one of claims 9 to 18; and fulfilling resource requirements of the software testing project by reference to the determined measure of the skill of each software tester.
PCT/AU2010/000425 2009-04-17 2010-04-16 System and method for automated skills assessment WO2010118472A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17047809P 2009-04-17 2009-04-17
US61/170,478 2009-04-17

Publications (1)

Publication Number Publication Date
WO2010118472A1 true WO2010118472A1 (en) 2010-10-21

Family

ID=42982056

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2010/000425 WO2010118472A1 (en) 2009-04-17 2010-04-16 System and method for automated skills assessment

Country Status (1)

Country Link
WO (1) WO2010118472A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001025910A1 (en) * 1999-10-05 2001-04-12 Asynchrony.Com, Inc. System and method for collaborative product development
US20020132656A1 (en) * 2001-01-09 2002-09-19 Michael Lydon Systems and methods for coding competitions
US20030192029A1 (en) * 2002-04-08 2003-10-09 Hughes John M. System and method for software development
US20080092107A1 (en) * 2006-09-27 2008-04-17 Mcwilliam Joshua Software Development and Sales Life-Cycle Services

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017052603A1 (en) * 2015-09-25 2017-03-30 Hewlett Packard Enterprise Development Lp Defect assessment
CN112131108A (en) * 2020-09-18 2020-12-25 电信科学技术第十研究所有限公司 Test strategy adjusting method and device based on characteristic attributes
CN112131108B (en) * 2020-09-18 2024-04-02 电信科学技术第十研究所有限公司 Feature attribute-based test strategy adjustment method and device
CN112905475A (en) * 2021-03-11 2021-06-04 湖南化工职业技术学院(湖南工业高级技工学校) Software testing platform based on computer
CN112905475B (en) * 2021-03-11 2022-09-06 湖南化工职业技术学院(湖南工业高级技工学校) Software testing platform based on computer


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10763981

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10763981

Country of ref document: EP

Kind code of ref document: A1