WO2019008600A1 - Task based estimator and tracker - Google Patents

Task based estimator and tracker

Info

Publication number
WO2019008600A1
WO2019008600A1 PCT/IN2018/050434
Authority
WO
WIPO (PCT)
Prior art keywords
task
proxy
user
software
project
Prior art date
Application number
PCT/IN2018/050434
Other languages
French (fr)
Inventor
Jaya MAKAM VENKATARATHNAM
Bharathi VASANTHAKRISHNA
Original Assignee
Kornerstone Analytics Pvt. Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kornerstone Analytics Pvt. Ltd.
Publication of WO2019008600A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0631 Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311 Scheduling, planning or task assignment for a person or group
    • G06Q10/063112 Skill-based matching of a person or a group to a task

Definitions

  • the present invention generally relates to effort estimation of a task in a software project and, more particularly, to a method and system for estimating effort required by a user to complete a task in the software project.
  • the software project could be a software development project, software testing project, a services/maintenance project, or a combination thereof.
  • the effort estimate may include, but is not limited to, the number of users/engineers, the hours, and/or the hardware required to complete the software project.
  • business managers make an estimate of the person-hours that will be required to complete the software project. This allows the business managers to set staffing levels and develop budgets that are consistent with their companies' strategic and financial plans.
  • the business managers estimate the effort by identifying and/or analyzing: objectives of the software project, complexity level, software project attributes, expertise/skills required for the project, skills of their current employees, size of the software project, history of effort required for similar projects in the past, and the like.
  • the effort estimation process is a high-level effort estimation of the software project. Therefore, the users/engineers associated with the software project are then given a time frame to complete their assigned work according to the high-level effort estimation.
  • the software projects routinely fail to be completed within the estimated time frame even after considering a risk analysis approach to estimate the duration of the project. This failure is typically attributed to the uncertainty inherent in human activity, the lack of inputs received by the users/engineers for effort estimation, and the like. The failure also occurs due to the non-availability of data related to task-based efforts expended by the user.
  • a method for completing a task of a software project assigned to a user possessing one or more software key skills includes receiving, by a processor, one or more project attributes and one or more task attributes. The method then enables selection of a proxy task from a plurality of proxy tasks based on a similarity index of a scope of the task and a definition of the proxy task. The plurality of proxy tasks is associated with a plurality of task definitions. Further, the method enables selection of a complexity of the task based on one or more complexity factors associated with the task.
  • the method thereafter identifies a prior task from a plurality of prior tasks based on the one or more attributes associated with the software project, the one or more attributes associated with the task, the selected proxy task, the selected complexity of the task, and the one or more software key skills possessed by the user. Further, the method determines an effort for completion of the task based on a prior effort required for completing the prior task.
  • an effort estimation system for completing a task of a software project assigned to a user possessing one or more software key skills.
  • the system includes a memory and a processor.
  • the memory stores instructions for effort estimation.
  • the memory also includes a plurality of proxy tasks, a plurality of prior tasks, one or more project attributes, one or more task attributes and a plurality of proxy task definitions.
  • the processor is operatively coupled with the memory to fetch instructions from the memory for effort estimation.
  • the processor is configured to receive one or more attributes associated with the software project and one or more attributes associated with the task.
  • the processor then enables selection of a proxy task from a plurality of proxy tasks based on a similarity index of a scope of the task and a definition of the proxy task.
  • the processor also enables selection of a complexity of the task based on one or more complexity factors associated with the task. Thereafter, the processor identifies a prior task from a plurality of prior tasks based on the one or more attributes associated with the software project, the one or more attributes associated with the task, the selected proxy task, the selected complexity of the task, and the one or more software key skills possessed by the user. The processor further determines an effort for completion of the task based on a prior effort required for completing the prior task.
  • an effort estimation computing device for completing a task of a software project by a user possessing one or more software key skills.
  • the computing device includes an Input and Output (I/O) device and a processor.
  • the I/O device receives one or more software project attributes, one or more task attributes, a proxy task and a complexity of the task.
  • the processor identifies a prior task from a plurality of prior tasks based on the proxy task, the complexity of the task, the one or more attributes associated with the task, and the one or more skills associated with the user.
  • the processor determines an effort for completion of the task based on a prior effort required for completion of the prior task.
  • a non-transitory, computer-readable storage medium storing computer-executable program instructions to implement a method for completing a task of a software project by a user possessing one or more software key skills.
  • the computer implemented method accesses the task assigned to the user to obtain a scope of the task, one or more project attributes, one or more task attributes, and one or more complexity factors associated with the task.
  • the computer implemented method then identifies a proxy task from a plurality of proxy tasks based on a similarity index of the scope of the task and a definition of the proxy task.
  • the plurality of proxy tasks is associated with a plurality of task definitions.
  • the computer implemented method further determines a complexity of the task based on the complexity factors associated with the task.
  • the computer implemented method identifies a prior task from a plurality of prior tasks based on the one or more attributes associated with the software project, the one or more attributes associated with the task, the selected proxy task, the selected complexity of the task, and the one or more software key skills possessed by the user. Further, the computer implemented method determines an effort for completion of the task by the user based on a prior effort required for completing the prior task.
  • FIG. 1 illustrates an exemplary block diagram depicting an effort estimation and performance analysis method, in accordance with an exemplary embodiment of the present invention
  • FIG. 2 illustrates a block diagram depicting association of different tasks with different users, in accordance with an exemplary embodiment of the present invention
  • FIG. 3 illustrates a simplified representation of project attributes assignment, in accordance with an exemplary embodiment of the present invention
  • FIG. 4 illustrates a simplified representation of task attributes assignment, in accordance with an exemplary embodiment of the present invention
  • FIG. 5 illustrates a simplified representation of proxy selection, in accordance with an exemplary embodiment of the present invention
  • FIG. 6 illustrates a simplified representation of complexity assignment, in accordance with an exemplary embodiment of the present invention
  • FIGS. 7A and 7B illustrate screenshots for project attributes assignment, task attributes assignment, and capabilities assignment, in accordance with an exemplary embodiment of the present invention
  • FIG. 8 illustrates a block diagram depicting efforts estimation for a task assigned to a user, in accordance with an exemplary embodiment of the present invention
  • FIG. 9 illustrates a block diagram depicting effort estimation and performance analysis method, in accordance with an exemplary embodiment of the present invention.
  • FIG. 10 illustrates an environment, in accordance with an exemplary embodiment of the present invention
  • FIG. 11 illustrates a system for effort estimation, in accordance with an exemplary embodiment of the present invention
  • FIG. 12 illustrates a method for effort estimation, in accordance with an exemplary embodiment of the present invention
  • FIG. 13 illustrates a method for effort estimation and performance analysis, in accordance with an exemplary embodiment of the present invention
  • FIG. 14 illustrates an effort estimation system, in accordance with an exemplary embodiment of the present invention
  • FIG. 15 illustrates an architecture for performing methods as described herein, in accordance with an exemplary embodiment of the present invention
  • FIGS. 16A and 16B illustrate screenshots for personal estimator and tracker, in accordance with an exemplary embodiment of the present invention.
  • FIG. 17 illustrates a system for performing methods as described herein, in accordance with an exemplary embodiment of the present invention.
  • FIG. 1 illustrates an exemplary block diagram 100 depicting an effort estimation and performance analysis method, in accordance with an exemplary embodiment of the present invention.
  • the effort estimation and performance analysis method may be performed either by a user device, by a server, or by a combination of both.
  • An application that supports various workflows may be present on the user device or the server or on both. Examples of the user device include mobile phone, smartphone, tablet, laptop etc.
  • a task is associated with a user.
  • the task corresponds to a part or a portion of a software project 202 as shown in FIG. 2.
  • FIG. 2 illustrates a block diagram 200 depicting association of different tasks with different users, in accordance with an exemplary embodiment of the present invention.
  • the software project 202 includes one or more tasks 206.
  • the task includes, or can be, one or more units of work of the software project 202.
  • the tasks 206 can be created by a project owner or a project manager or any other person authorized to do so.
  • the tasks 206 are then assigned to one or more users 216.
  • the tasks 206 are granularized to such an extent that one task is assigned to one user.
  • (T1) 208 (hereinafter alternatively referred to as task 208) is assigned to a user (U1) 218 (hereinafter alternatively referred to as user 218)
  • task (T2) 210 is assigned to a user (U2) 220
  • successively, task (Tn) 214 is assigned to a user (Un) 224.
  • the remaining disclosure is explained using the task 208 as an example, and similar steps or functionalities are performed for the other tasks too, either simultaneously, one after the other, or as appropriate.
  • mapping of many tasks to many users may also exist.
  • effort estimation for the task 208 is carried out and effort required to complete the task 208 is estimated.
  • the effort estimation includes comparing one or more attributes of the task 208 with that of tasks stored in a knowledge database.
  • the stored task having a required or predetermined number of attributes matching the attributes of the task 208 is identified as the stored task similar to the task 208.
  • The effort, including time taken, spent in performing the stored task is then used as the estimate of the effort needed for completion of the task 208. A similar process of effort estimation is performed for all tasks, one after the other, simultaneously, or in any other way possible.
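The matching-and-reuse step described above can be sketched in Python. This is an illustrative sketch only, not the patented algorithm: the dictionary-based task representation, the attribute names, and the `min_matches` threshold are assumptions introduced for the example.

```python
# Illustrative sketch: identify a stored task whose attributes match the
# new task, and reuse its recorded effort as the estimate.
# The dict-based task shape and the min_matches threshold are assumptions.

def count_matching_attributes(task_a, task_b):
    """Count attributes that both tasks share with equal values."""
    return sum(1 for key, value in task_a.items()
               if task_b.get(key) == value)

def estimate_effort(new_task, stored_tasks, min_matches=3):
    """Return the effort (hours) of the best-matching stored task,
    or None when no stored task matches enough attributes."""
    best_task, best_score = None, 0
    for stored in stored_tasks:
        score = count_matching_attributes(new_task, stored["attributes"])
        if score >= min_matches and score > best_score:
            best_task, best_score = stored, score
    return best_task["effort_hours"] if best_task else None

stored = [
    {"attributes": {"proxy": "Form", "complexity": "High",
                    "language": "Java", "capability": "Low"},
     "effort_hours": 16},
    {"attributes": {"proxy": "Screen", "complexity": "Low",
                    "language": "Java", "capability": "High"},
     "effort_hours": 4},
]
new_task = {"proxy": "Form", "complexity": "High",
            "language": "Java", "capability": "Low"}
print(estimate_effort(new_task, stored))  # 16
```

A knowledge base with richer attributes would simply extend the dictionaries; the matching logic stays the same.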
  • the effort estimation includes various steps and is explained in detail later.
  • a footprint is created.
  • the footprint creation includes tracking actual progress of the user 218 on the task 208.
  • the footprint creation, the performance progress, or both include comparing the tracked progress with the effort estimates to determine performance.
  • FIG. 3 illustrates a simplified representation 300 of project attributes assignment, in accordance with an exemplary embodiment of the present invention.
  • the task 208 is associated with the user 218.
  • the associations can be performed by the project manager based on one or more key skills of the user 218 known to the project manager.
  • other tasks are associated with other users.
  • the task assignor may call the users for a meeting to fill in relevant details in the application.
  • the users may be present at remote places and may fill the details in the application.
  • a screenshot 304 shows the project attributes on a user device associated with the user 218.
  • the attributes are assigned by the user 218.
  • the attributes assigning includes providing project name 306, defining role 308, assigning a platform 310, selecting an application 312, and mentioning lifecycle type 314. Examples of lifecycle type 314 include the waterfall type and the spiral type.
  • the project attributes include a number of project components such as project name, role, platform etc.
  • a company project manager representing a team of 20 employees and managing a project of 'Customer Accounts' (CA) shall identify a team member, for example, the user 218, to develop an online client accounting screen.
  • developing the online client accounting screen for the customer may be considered as the task 208.
  • the onus of developing the task 208 is now upon the user 218.
  • the user 218 shall install the application, or may use an already installed application, on his personal electronic device, for example, a user device, to estimate the effort for developing the task 208 assigned by the project manager.
  • the user 218 provides one or more inputs once the user 218 has logged into the application. For example, details like role, name and location might be prompted by the application for the user 218 to address.
  • the application might also prompt questions to the user 218 for more details regarding the work environment. For example, operating environment, acquisition type, application type and language.
  • FIG. 4 illustrates a simplified representation 400 of task attributes assignment, in accordance with an exemplary embodiment of the present invention.
  • the user 218 assigns task attributes 402 to the task 208 as shown in a screenshot 404.
  • the task attributes 402 can be auto-populated based on task definitions in standard project management systems, and the user 218 can then validate them.
  • the task attributes 402 include a number of task components.
  • the task attributes 402 include a task name 406, an activity type 408, a proxy 410 (also referred to as proxy task), a task complexity 412 (also referred to as complexity of the task), a software language 414, a capability 416, a start date 418, and a task description 420.
  • the proxy 410 indicates functionality that needs to be performed and is explained in detail in FIG. 5.
  • the project herein is C A which may stand for 'Customer Accounts' .
  • the proxy 410 may be a screen that displays the details of the client online account with no calculated data.
  • the proxy 410 may also be a form wherein the form is an application screen wherein an end user of the software project 202 provides data input.
  • the proxy 410 may also be an interface, i.e., a boundary across which two independent systems meet and act on or communicate with each other.
  • the screen, form and interface are various proxies that may be associated with the project CA.
  • the task complexity 412 indicates the complexity of the task and can have values like "Low", "Nominal", "High", "Very High", "Small", "Medium", "Huge", "Easy", "Tough", "Complex", "Simple", and various other values.
  • the capability 416 indicates expertise or capability of the user 218 for performing the task 208.
  • the task name 406 is 'Test'
  • the activity type 408 is 'No Knowledge'
  • proxy 410 is 'Form'
  • the task complexity 412 is 'High'
  • the software language 414 is 'Java'
  • the capability 416 of the user 218 is 'Low'
  • the start date 418 of the task is 'August 10, 2017' .
  • the task complexity 412 of the task 208 is selected by the user 218 (by providing a user input) and is then enabled by the processor in response to the user input.
  • the task complexity 412 is selected based on various complexity factors.
  • the complexity factors include at least one of a number of data elements, a number of tables, a number of function points, a number of interfaces, a number of software classes, a number of data structures, a number of inputs required, a number of outputs, a number of objects, a number of algorithms, a number of items, a number of test cases and a number of transactions.
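As an illustration of how the listed complexity factors could drive selection of the task complexity 412, the sketch below totals the counted factors and maps the total to a label. The factor names mirror the list above, but the uniform weighting and the Low/Nominal/High cut-offs are assumptions, since the specification leaves the mapping to the user or project manager.

```python
# Illustrative sketch: derive a task complexity from counted complexity
# factors. The cut-offs (low_max, nominal_max) are assumed values, not
# taken from the specification.

COMPLEXITY_FACTORS = (
    "data_elements", "tables", "function_points", "interfaces",
    "software_classes", "data_structures", "inputs", "outputs",
    "objects", "algorithms", "items", "test_cases", "transactions",
)

def select_complexity(factor_counts, low_max=10, nominal_max=25):
    """Map the total factor count to a complexity label."""
    total = sum(factor_counts.get(name, 0) for name in COMPLEXITY_FACTORS)
    if total <= low_max:
        return "Low"
    if total <= nominal_max:
        return "Nominal"
    return "High"

print(select_complexity({"data_elements": 4, "tables": 2}))  # Low
print(select_complexity({"inputs": 20, "outputs": 10}))      # High
```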
  • FIG. 5 illustrates a simplified representation 500 of proxy selection, in accordance with an exemplary embodiment of the present invention.
  • Table 502 shows an example of the proxy 410 and corresponding definition.
  • option 504 shows a proxy named 'API' (Application Program Interface); its corresponding definition is shown in the next column of table 502.
  • other proxy option names along with their definitions, such as "Batch" (see, 506), "Defect" (see, 508), "Form" (see, 510), "Screen" (see, 512), "Use case points" (see, 514), and "User workflow" (see, 516), are shown in Table 502.
  • the task 208 refers to work such as writing software code, creating a software design, creating test data, etc. Any technical work that a software engineer, developer, or tester performs in order to produce a deliverable is defined as the task 208. Further, the task 208 is closely associated with the proxy 410 and represents one type of work. Most tasks are defined to be small, representing no more than a few hours to a few days of work. Also, each of the tasks 206 is a component which is a part of a business function, is approved by the project manager, and is a part of a standard list of components.
  • the one or more proxies are used as an approximation or indirect measure of the task 208 at hand.
  • the one or more proxies are closely related to the task type and are available at the beginning of the task 208. For example, if the task 208 is to build a form which accepts inputs, then the user 218 must choose the proxy 410 which is closest to this definition.
  • the selection of the proxy 410 is enabled based on a similarity index of the scope of the task 208 and the associated definition of the proxy 410.
  • the similarity index is a measure of similarity between the scope of the task 208 and the associated definition of the proxy 410. The enabling is performed in response to receipt of a user input selecting the proxy 410.
  • the user 218 knows the scope of the task 208 and the user 218 based on his knowledge selects the proxy 410 whose associated definition is similar to the scope.
  • the one or more proxies can help in creating mental pictures and can provide a level of abstraction. When engineers estimate task effort, they feel more secure about their estimates and associated commitments due to the one or more proxies.
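One way to realize the similarity index described above is a simple word-overlap (Jaccard) measure between the task's scope text and each proxy definition. The sketch below is an illustration under stated assumptions: the paraphrased definitions and the choice of Jaccard similarity are not prescribed by the specification, which leaves the judgment to the user.

```python
# Illustrative sketch: select the proxy whose definition is most similar
# to the task's scope, using a word-overlap (Jaccard) similarity index.
# The definitions below are paraphrased from FIG. 5 for illustration.

def similarity_index(scope, definition):
    """Jaccard similarity between the word sets of two descriptions."""
    a, b = set(scope.lower().split()), set(definition.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

PROXY_DEFINITIONS = {
    "Form": "an application screen where the end user provides data input",
    "Screen": "a screen that displays details with no calculated data",
    "API": "a boundary across which two independent systems communicate",
}

def select_proxy(task_scope):
    """Return the proxy name whose definition best matches the scope."""
    return max(PROXY_DEFINITIONS,
               key=lambda name: similarity_index(task_scope,
                                                 PROXY_DEFINITIONS[name]))

print(select_proxy("build a form where the user provides data input"))  # Form
```

In practice a human selects the proxy; an automated similarity measure like this could pre-rank the options for the user.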
  • FIG. 6 illustrates a simplified representation 600 of complexity assignment, in accordance with an exemplary embodiment of the present invention.
  • the task complexity 412 is shown in Table 602.
  • Table 602 shows three complexities, i.e. Low complexity 616, Nominal complexity 618, and High complexity 620, as an example.
  • Various rows of the table 602 show mapping of the task complexity 412 to the proxy 410.
  • row 604 shows the "API" proxy (see, 504 of FIG. 5), for which the user 218 needs to assign Low complexity 616 if the members are less than 5 and the functions are less than 2, assign Nominal complexity 618 if the members are between 5 and 12 and the functions are 3 or 4, or assign High complexity 620 if the members are greater than 12 and the functions are greater than 4.
  • other rows show different proxies and corresponding criteria for the task complexity 412.
  • the task complexity 412 can be based on various factors, for example, Data structures; Size of Task; Control structures; and Reusable code.
  • Row 606 and row 608 respectively show proxies "Batch" (see, 506 of FIG. 5) and "Defect" (see, 508 of FIG. 5), for which the user 218 needs to assign Low complexity 616 if the data elements are less than 5 and the tables/interfaces are less than 2, assign Nominal complexity 618 if the data elements are between 5 and 12 and the tables/interfaces are 3 or 4, or assign High complexity 620 if the data elements are greater than 12 and the tables/interfaces are greater than 4.
  • row 612 shows the proxy "Screen" (see, 512 of FIG. 5), for which the user 218 needs to assign Low complexity 616 if the data elements are less than 5, assign Nominal complexity 618 if the data elements are between 5 and 12, or assign High complexity 620 if the data elements are greater than 12.
  • guidelines for classifying the task complexity 412 and the skills of the task 208 into low, nominal, or high can be defined by the user 218 or the project manager based on inputs, outputs, interfaces, etc.
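The threshold rules of Table 602, as described for rows 604, 606/608, and 612 above, can be expressed directly in code. The thresholds below are taken from those rows; how to resolve counts that fall between two bands (for example, members greater than 12 but functions of only 3) is not fully specified, so this sketch defaults such cases to Nominal.

```python
# Illustrative sketch of the complexity mapping in Table 602: each proxy
# has Low/Nominal/High thresholds on its counted elements. Defaulting
# in-between cases to Nominal is an assumption of this sketch.

def api_complexity(members, functions):
    """Row 604: 'API' proxy thresholds."""
    if members < 5 and functions < 2:
        return "Low"
    if members > 12 and functions > 4:
        return "High"
    return "Nominal"

def batch_or_defect_complexity(data_elements, tables_or_interfaces):
    """Rows 606/608: 'Batch' and 'Defect' proxy thresholds."""
    if data_elements < 5 and tables_or_interfaces < 2:
        return "Low"
    if data_elements > 12 and tables_or_interfaces > 4:
        return "High"
    return "Nominal"

def screen_complexity(data_elements):
    """Row 612: 'Screen' proxy thresholds (data elements only)."""
    if data_elements < 5:
        return "Low"
    if data_elements > 12:
        return "High"
    return "Nominal"

print(api_complexity(members=3, functions=1))  # Low
print(screen_complexity(data_elements=8))      # Nominal
print(batch_or_defect_complexity(20, 6))       # High
```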
  • This data can be stored in the knowledge base 804 (as explained later with reference to FIG. 8) or any other database.
  • FIGS. 7A and 7B illustrate screenshots for project attributes assignment, task attributes assignment, and capabilities assignment, in accordance with an exemplary embodiment.
  • FIG. 7A illustrates the screenshot 304 for project attributes assignment and the screenshot 404 for task attributes assignment.
  • the project attributes assigned by the user 218 include providing project name 306, defining role 308, assigning a platform 310, selecting an application 312, and mentioning lifecycle type 314.
  • the task attributes 402 include the task name 406, the activity type 408, the proxy 410, the task complexity 412, the software language 414, the capability 416, the start date 418, and the task description 420.
  • FIG. 7A further shows the user/my capabilities assignment at 702 (also shown as 416 in FIG. 4).
  • FIG. 7B shows a screenshot 752 for task attributes assignment, a screenshot 754 for my capabilities assignment, and a screenshot 756 for my capabilities assignment.
  • the application also considers collecting skill related parameters associated with the user 218 in order to perform the task 208.
  • the skill level may vary based on the work experience of the user 218. For example, if a senior programmer working on the task 208 takes leave, and a junior programmer takes his place, the actual amount of time for programming the task may be 50% higher than the original estimation. The senior programmer may have had the expertise and past experience in performing the task.
  • the factors taken into consideration when looking at the capability 416 may be language proficiency, domain proficiency, experience of performing similar tasks in the past and educational experience.
  • As another example of assessment of capability or skills, the user 218 knows how to code in Java but does not know how to develop the product in the financial services domain; hence, the user 218 marks his capability as 'Low' as shown in the screenshot 756.
  • FIG. 8 illustrates block diagram 800 depicting efforts estimation for a task assigned to a user, in accordance with an exemplary embodiment of the present invention.
  • an algorithm such as a fuzzy logic based algorithm triggers at 802.
  • the algorithm fetches historical data from a knowledge base 804 and determines a stored task that has a predetermined number of attributes matching with that of the task 208.
  • the knowledge base 804 can be created by collating data from various online or offline resources. Online resources include the inputs provided by the users or professionals via the application. Offline resources include the inputs collected manually from different people via paper surveys etc. and then fed into the knowledge base 804.
  • the knowledge base 804 can be created using any other possible means.
  • the knowledge base 804 can be managed by an entity other than the one managing the application or can be managed by the same entity.
  • the knowledge base 804 can also be provided as a separate offering.
  • the knowledge base 804 can be present at a location near the application or remote from the application.
  • the knowledge base 804 can be hosted in an intra net environment or internet environment or a combination of the both.
  • the knowledge base 804 can include data for the user 218 and also for other users.
  • the algorithm 802 can be a machine learning algorithm that considers data related to the user 218 for determining similar stored task.
  • algorithm 802 can consider data related to any user. Different weightages can be assigned to tasks done by different users, with the same user's tasks getting higher weightages, as those represent a more accurate estimation.
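The weighting idea above can be sketched as a weighted average over similar prior tasks. The 2.0 same-user weight and the weighted-average combination are illustrative assumptions; the specification states only that the same user's tasks receive higher weightage.

```python
# Illustrative sketch: weight historical tasks by who performed them, so
# that the estimating user's own prior tasks dominate the estimate.
# The same_user_weight value and the averaging scheme are assumptions.

def weighted_effort_estimate(user_id, similar_tasks, same_user_weight=2.0):
    """Weighted average of prior efforts, favoring the same user's tasks."""
    total_weight = 0.0
    weighted_sum = 0.0
    for prior in similar_tasks:
        weight = same_user_weight if prior["user_id"] == user_id else 1.0
        total_weight += weight
        weighted_sum += weight * prior["effort_hours"]
    return weighted_sum / total_weight if total_weight else None

history = [
    {"user_id": "U1", "effort_hours": 10},
    {"user_id": "U2", "effort_hours": 16},
]
print(weighted_effort_estimate("U1", history))  # (2*10 + 16) / 3 = 12.0
```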
  • FIG. 9 illustrates a block diagram 900 depicting effort estimation and performance analysis method, in accordance with an exemplary embodiment of the present invention.
  • a project attributes module 902 assigns project attributes 302 for the task 208.
  • a task attributes module 904 assigns task attributes 402 for the task 208. The modules assign the attributes in response to the inputs received from the user 218.
  • a proxy association module 906 associates the proxy 410 to the task 208.
  • a complexity module 908 assigns the task complexity 412 to the task 208.
  • a user /my capability assignment module 910 assigns the capability 416 to the task 208.
  • each module can be implemented using a separate processor or a combination of processors or by a single processor.
  • the proxy association module 906, the complexity module 908, and the user capability assignment module 910 can be a part of the task attributes module 904.
  • an algorithm module 912 determines a similar stored task that matches a predetermined number of attributes of the task 208. In some embodiments, the matching can be performed for certain selected attributes such as proxy etc., while in other embodiments all or some of the attributes can be used for matching. The user 218 or the project manager can drive the criteria for matching. The determining of a similar stored task is performed using data accessible via a database module 914 and an archive module 916. The database module 914 and the archive module 916 have access to the knowledge base 804.
  • Prior task assigning effort module 918 assigns the efforts of the similar stored task to the task 208.
  • the efforts assigned to the task 208 use the prior efforts as an input to determine the efforts to be assigned to the task 208; hence, the determined efforts may not be exactly the same as the prior efforts but are based on them.
  • the assignment of similar stored task includes assigning the efforts, i.e. time etc., to the task 208 and hence, the user 218.
  • Actual effort tracking module 920 tracks actual efforts spent by the user 218.
  • the actual efforts can be inputted by the user 218 or can be fetched from a human resources application or any other employer application that tracks time and efforts of the user 218.
  • Performance analysis module 922 then compares the determined efforts with the actual efforts and based on the comparison, the performance analysis module 922 provides assessment for the user 218 and for the task 208.
  • the assessment can be in the form of a performance metric which is analyzed. For example, whether the task 208 is on time or not, and whether the user 218 is productive or not, can be obtained as output from the performance analysis module 922.
  • the performance analysis module 922 can also provide comparative analysis for the user 218 against other users who performed a similar stored task, or against team members of the user 218, etc. It is to be appreciated that any other comparison or output can be provided.
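The comparison performed by the performance analysis module 922 can be sketched as follows. The 10% on-time tolerance and the estimated-to-actual productivity ratio are illustrative assumptions; the specification states only that the estimated and actual efforts are compared to produce an assessment of the user and the task.

```python
# Illustrative sketch: compare estimated effort with tracked actuals to
# produce simple performance metrics. The tolerance value and the
# productivity ratio definition are assumptions of this sketch.

def analyze_performance(estimated_hours, actual_hours, tolerance=0.1):
    """Return whether the task was on time and a productivity ratio
    (> 1.0 means the user beat the estimate)."""
    on_time = actual_hours <= estimated_hours * (1 + tolerance)
    productivity = estimated_hours / actual_hours if actual_hours else None
    return {"on_time": on_time, "productivity": productivity}

result = analyze_performance(estimated_hours=16, actual_hours=14)
print(result["on_time"])  # True
```

The same metrics computed per user enable the comparative analysis mentioned above, e.g. ranking users who performed the same similar stored task.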
  • FIG. 10 illustrates an environment 1000, in accordance with an exemplary embodiment of the present invention.
  • the environment 1000 includes one or more users 1002.
  • Each user has a computing device 1004 including an input/output device 1006 and a processor 1008.
  • the processor 1008 runs the application as described herein.
  • the users 1002 access an engine 1010 that includes association of attributes to the tasks 206 (as exemplarily depicted to be attributes El and All to Aln associated to Task 1, En and Aln to Ann associated to Task n etc.) and corresponding processing to identify effort estimates for the tasks 206 using a storage device 1012 including historical data.
  • the "effort estimates for the users" 1014 to perform the tasks 206 are then provided to respective users.
  • analysis 1016 is performed and made available to the users 1002 to indicate their performances.
  • FIG. 11 illustrates a system 1100 for effort estimation, in accordance with an exemplary embodiment of the present invention.
  • the project attributes 1102 (such as the project attributes 302 of FIG. 3) are used as an input for an analogy based estimation 1104.
  • the analogy based estimation 1104 of efforts also uses information from a knowledge base 1106.
  • the knowledge base 1106 receives data or time spent actuals 1110 for the user 218.
  • the actuals 1110 include the actual time spent by the user 218 on the task 208.
  • the information from the knowledge base 1106 is then used for analytics 1108.
  • the analytics 1108 include performance analysis for the user 218.
  • FIG. 12 illustrates a method 1200 for effort estimation, in accordance with an exemplary embodiment of the present invention.
  • at step 1202, the method starts.
  • the method receives one or more project attributes 302 and one or more task attributes 402 at step 1204.
  • the attributes are received as an input from the user 218 using his/her user device.
  • the method enables selection of a proxy 410 for the task 208 and other tasks.
  • a task complexity 412 is also selected at step 1206.
  • a prior task (similar to stored task) is identified by matching attributes of the stored tasks with that of the task 208.
  • the stored task having similar attributes as the task 208 or having at least a predetermined number or type of similar attributes is identified as the prior task.
  • an effort for completion of the task 208 is then determined.
  • the effort is determined as the effort corresponding to the prior task. Since the prior task has attributes similar to those of the task 208, the user 218 is likely to spend a similar effort in performing the task 208.
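The prior-task lookup and effort reuse described above could be sketched roughly as follows. The attribute names, the dictionary layout of the knowledge base, and the minimum-match threshold are assumptions for illustration only; the specification does not fix them.

```python
# Hypothetical sketch of analogy-based effort estimation: find the
# stored task sharing the most attributes with the new task, require a
# minimum number of matching attributes, and reuse its recorded effort.
def matching_score(task_attrs: dict, stored_attrs: dict) -> int:
    """Count attributes that have identical values in both tasks."""
    return sum(1 for k, v in task_attrs.items() if stored_attrs.get(k) == v)

def estimate_effort(task_attrs: dict, knowledge_base: list[dict],
                    min_matches: int = 3):
    best = max(knowledge_base,
               key=lambda s: matching_score(task_attrs, s["attributes"]))
    if matching_score(task_attrs, best["attributes"]) < min_matches:
        return None  # no sufficiently similar prior task found
    return best["actual_hours"]  # prior effort becomes the estimate

kb = [
    {"attributes": {"proxy": "screen", "complexity": "nominal",
                    "language": "Python"}, "actual_hours": 24},
    {"attributes": {"proxy": "report", "complexity": "high",
                    "language": "Java"}, "actual_hours": 40},
]
new_task = {"proxy": "screen", "complexity": "nominal", "language": "Python"}
print(estimate_effort(new_task, kb))  # 24
```

In practice the matching could be restricted to selected attributes such as the proxy, or weighted, rather than the plain count used here.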
  • FIG. 13 illustrates a method 1300 for effort estimation and performance analysis, in accordance with an exemplary embodiment of the present invention.
  • the method starts at step 1302.
  • the project attributes 302 inputted by the user 218 or auto-populated by the application or received by some other means are accepted.
  • the screenshot 304 indicates the project attributes 302.
  • at step 1306, the task attributes 402 inputted by the user 218 or auto-populated by the application or received by some other means are accepted.
  • the screenshot 404 indicates the task attributes 402.
  • the proxy 410 is associated with the task 208.
  • the proxy 410 indicates functionality that needs to be performed.
  • the proxy 410 may be a screen that displays the details of the client online account with no calculated data.
  • the proxy 410 may also be a form wherein the form is an application screen wherein an end user of the software project 202 provides data input.
  • the proxy 410 may also be an interface, i.e., a boundary across which two independent systems meet and act on or communicate with each other.
  • the screen, form and interface are various proxies that may be associated with the respective project.
  • the task complexity 412 is classified.
  • the task complexity 412 can be low complexity 1312, nominal complexity 1314, or high complexity 1316. Other classifications are also possible.
  • user capability/my capability 416 or 702 is classified.
  • the user capability/my capability 416 or 702 can be low capability 1328, nominal capability 1330, or high capability 1332.
  • fuzzy logic based classification is performed to identify effort estimates and assign the same to the task 208.
  • the algorithm module 912 determines a similar stored task that matches a predetermined number of attributes of the task 208. In some embodiments, the matching can be performed for certain selected attributes such as proxy etc., while in other embodiments all or some of the attributes can be used for matching.
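A minimal fuzzy-style classification of the kind named above might look like the sketch below. The membership functions and the 0-10 raw score scale are invented for illustration; the specification does not define how the fuzzy classification is computed.

```python
# Illustrative fuzzy-style classification of a raw complexity (or
# capability) score into the low / nominal / high classes. The
# triangular membership functions below are assumptions.
def memberships(score: float) -> dict:
    """Map a raw score in [0, 10] to fuzzy class memberships."""
    low = max(0.0, (4 - score) / 4)       # full at 0, zero at 4
    high = max(0.0, (score - 6) / 4)      # zero at 6, full at 10
    nominal = max(0.0, 1.0 - low - high)  # memberships sum to 1
    return {"low": low, "nominal": nominal, "high": high}

def classify(score: float) -> str:
    """Pick the class with the strongest membership."""
    m = memberships(score)
    return max(m, key=m.get)

print(classify(1))  # low
print(classify(5))  # nominal
print(classify(9))  # high
```

The same shape of function could serve both the complexity classes 1312-1316 and the capability classes 1328-1332.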
  • at step 1322, actuals of the efforts spent by the user 218 are captured.
  • the actual effort tracking module 920 tracks actual efforts spent by the user 218.
  • the actual efforts can be inputted by the user 218 or can be fetched from a human resources application or any other employer application that tracks time and efforts of the user 218.
  • the analytics and performance analysis are performed.
  • the productivity of the user 218 is determined as part of the analysis and outputted.
  • the knowledge base 1106 receives actuals for the user 218.
  • the actuals include the actual time spent by the user 218 on the task 208.
  • the information from the knowledge base 1106 is then used for analytics and performance analysis for the user 218.
  • FIG. 14 illustrates an effort estimation system 1402, in accordance with an exemplary embodiment of the present invention.
  • the effort estimation system 1402 includes a memory 1404 for storing instructions for performing methods (e.g., methods 1200 and 1300) described herein.
  • the effort estimation system 1402 also includes a processor 1406 which reads instructions from the memory 1404 for performing the methods (e.g., methods 1200 and 1300) described herein.
  • FIG. 15 illustrates architecture 1500 for performing methods (e.g., methods 1200 and 1300) as described herein, in accordance with an exemplary embodiment of the present invention.
  • the architecture 1500 shows the devices, for example, 1502, 1504, and 1506, of the users including the application.
  • the application communicates via a gateway 1508 with the server including the functions for performing the methods described herein.
  • the functions include compute functions 1510, query functions 1514, store functions 1518 and analytics functions 1522.
  • the compute functions 1510 have access to a storage device 1512 (see S3) and perform computations using the data from the storage device 1512 for computing the attributes.
  • the query functions 1514 access data from Dynamo 1516 and from Mongo 1520 to query complexity and other attributes.
  • the store functions 1518 store data into the Dynamo 1516 and the Mongo 1520.
  • the analytics functions 1522 perform the analysis using the data from the Mongo 1520.
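The split of server-side responsibilities into compute, query, store, and analytics functions behind the gateway could be modeled as a minimal dispatch table, as sketched below. This is purely illustrative; the function bodies, names, and routing scheme are assumptions, not the actual architecture 1500.

```python
# Hypothetical sketch: routing gateway requests to the four function
# groups of architecture 1500 (compute 1510, query 1514, store 1518,
# analytics 1522).
def compute(payload):    # computes attributes from stored data
    return {"computed": payload}

def query(payload):      # queries complexity and other attributes
    return {"found": payload}

def store(payload):      # persists data into the data stores
    return {"stored": payload}

def analytics(payload):  # performs analysis on stored data
    return {"report": payload}

GATEWAY_ROUTES = {"compute": compute, "query": query,
                  "store": store, "analytics": analytics}

def handle(route: str, payload):
    """Dispatch a gateway request to the matching function group."""
    return GATEWAY_ROUTES[route](payload)

print(handle("query", {"task": "T1"}))  # {'found': {'task': 'T1'}}
```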
  • FIGS. 16A and 16B illustrate screenshots 1600A and 1600B for personal estimator and tracker, in accordance with an exemplary embodiment of the present invention.
  • the screenshot 1600A displays the performance trend associated with the user 218 using the mobile phone for the proxy 410 'Screen'. Using this trend, the user 218 can check his/her progress by studying the graph of task screen vs. actual effort.
  • the graph of the screenshot 1600A depicts that building screen 1 consumed 24 hours of effort time of the user 218, i.e., the user 218 put in only 24 hours of work time.
  • for building screen 2 the user 218 took 15 hours, for screen 3 the user 218 took 18 hours, for screen 4 the user 218 / the employee took 32 hours, and for building screen 5 the user 218 took 12 hours.
  • the screenshot 1600A thus displays to the user 218 that building screen 5 took only 12 hours, whereas building screen 4 took 32 hours. The maximum and minimum hours of effort can easily be seen using the estimator and tracker records in the tool on any digital screen onto which the tool is downloaded.
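The minimum and maximum effort readings discussed above can be derived from the tracked per-screen actuals along these lines. This is a hypothetical sketch; the record layout is assumed, with the hour values taken from the graph described for screenshot 1600A.

```python
# Actual effort (hours) tracked per screen, as in the graph of
# screenshot 1600A.
actual_effort = {"screen 1": 24, "screen 2": 15, "screen 3": 18,
                 "screen 4": 32, "screen 5": 12}

# The fastest and slowest builds are simply the min/max of the records.
fastest = min(actual_effort, key=actual_effort.get)
slowest = max(actual_effort, key=actual_effort.get)
print(fastest, actual_effort[fastest])  # screen 5 12
print(slowest, actual_effort[slowest])  # screen 4 32
```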
  • FIG. 16B is the screenshot 1600B wherein the user 218 can check the different types of proxies and the languages he has worked on by specifying a time period.
  • a part of the graph depicts that the user 218 has worked on 15 Screens, 32 Test Scripts, 9 Workflows, 22 Reports, and 8 Interfaces in the language Python in the last 14 months.
  • FIG. 17 illustrates a system 1700 (hereinafter alternatively referred to as machine 1700 / computer system 1700) for performing methods as described herein, in accordance with an exemplary embodiment of the present invention.
  • FIG. 17 is a block diagram of a machine 1700, i.e. the user device or the server, in the example form of a computer system within which instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
  • the machine 1700 operates as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine 1700 may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine 1700 may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the example computer system 1700 includes a processor 1702 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 1704, and a static memory 1706, which communicate with each other via a bus 1708.
  • the computer system 1700 may further include a video display unit 1710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
  • the computer system 1700 also includes an alphanumeric input device 1712 (e.g., a keyboard), a user interface (UI) navigation device 1714 (e.g., a mouse), a disk drive unit 1716, a signal generation device 1718 (e.g., a speaker), and a network interface device 1720.
  • the computer system 1700 may also include an environmental input device 1726 that may provide a number of inputs describing the environment in which the computer system 1700 or another device exists, including, but not limited to, any of a Global Positioning System (GPS) receiver, a temperature sensor, a light sensor, a still photo or video camera, an audio sensor (e.g., a microphone), a velocity sensor, a gyroscope, an accelerometer, and a compass.
  • the disk drive unit 1716 includes a machine-readable medium 1722 on which is stored one or more sets of data structures and instructions 1724 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein.
  • the instructions 1724 may also reside, completely or at least partially, within the main memory 1704, the static memory 1706 and/or within the processor 1702 during execution thereof by the computer system 1700, the main memory 1704 and the processor 1702 also constituting machine-readable media.
  • while the machine-readable medium 1722 is shown in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 1724 or data structures.
  • the term “non-transitory machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present subject matter, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such instructions.
  • non-transitory machine-readable medium shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media.
  • Specific examples of non-transitory machine-readable media include, but are not limited to, non-volatile memory, including by way of example, semiconductor memory devices (e.g., Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices), magnetic disks such as internal hard disks and removable disks, magneto-optical disks, and CD-ROM and DVD-ROM disks.
  • the instructions 1724 may further be transmitted or received over a computer network 1750 using a transmission medium.
  • the instructions 1724 may be transmitted using the network interface device 1720 and any one of a number of well-known transfer protocols (e.g., HTTP).
  • Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, Plain Old Telephone Service (POTS) networks, and wireless data networks (e.g., WiFi and WiMAX networks).
  • the term "transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
  • computer software products can be written in any of various suitable programming languages, such as C, C++, C#, Pascal, Fortran, Perl, Matlab (from MathWorks), SAS, SPSS, JavaScript, AJAX, and Java.
  • the computer software product can be an independent application with data input and data display modules.
  • the computer software products can be classes that can be instantiated as distributed objects.
  • the computer software products can also be component software, for example Java Beans or Enterprise JavaBeans. Much functionality described herein can be implemented in computer software, computer hardware, or a combination.
  • a computer that is running the previously mentioned computer software can be connected to a network and can interface to other computers using the network.
  • the network can be an intranet, internet, or the Internet, among others.
  • the network can be a wired network (for example, using copper), telephone network, packet network, an optical network (for example, using optical fiber), or a wireless network, or a combination of such networks.
  • data and other information can be passed between the computer and components (or steps) of a system using a wireless network based on a protocol, for example Wi-Fi (IEEE standard 802.11 including its sub-standards a, b, e, g, h, i, n, et al.).
  • signals from the computer can be transferred, at least in part, wirelessly to components or other computers.
  • each illustrated component represents a collection of functionalities which can be implemented as software, hardware, firmware or any combination of these.
  • where a component is implemented as software, it can be implemented as a standalone program, but can also be implemented in other ways, for example as part of a larger program, as a plurality of separate programs, as a kernel loadable module, as one or more device drivers or as one or more statically or dynamically linked libraries.
  • FIG. 17 shows an exemplary structure and it could vary.
  • the environmental input device 1726, the user interface navigation device 1714, the disk drive unit 1716, and the signal generation device 1718 are optional and may not be present.

Landscapes

  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A method, a system and a device for completing a task of a software project assigned to a user possessing one or more software key skills is disclosed. One or more project attributes and one or more task attributes are received. A selection of a proxy task from a plurality of proxy tasks is enabled based on a similarity index of a scope of the task and a definition of the proxy task. A selection of a complexity of the task is enabled based on a plurality of complexity factors. A prior task is identified based on the one or more attributes associated with the software project. An effort for completion of the task is determined based on a prior effort required for completing the prior task.

Description

TASK BASED ESTIMATOR AND TRACKER
TECHNICAL FIELD
[0001] The present invention generally relates to effort estimation of a task in a software project and, more particularly, to a method and system for estimating effort required by a user to complete a task in the software project.
BACKGROUND
[0002] Today, organizations providing software services and developing software products need to estimate the effort required to complete a specific software project before initiation of such a project. The software project could be a software development project, a software testing project, a services/maintenance project, or a combination thereof. The effort estimate may include, but is not limited to, the number of users/engineers, hours and/or hardware required to complete the software project.
[0003] For example, in order to get an effort estimate for the completion of the software project, business managers make an estimate of the person-hours that will be required to complete the software project. This allows the business managers to set staffing levels and develop budgets that are consistent with their companies' strategic and financial plans. The business managers estimate the effort by identifying and/or analyzing: objectives of the software project, complexity level, software project attributes, expertise/skills required for the project, skills of their current employees, size of the software project, history of effort required for similar projects in the past, and the like.
[0004] Hence, the current effort estimation process is a high-level effort estimation of the software project. The users/engineers associated with the software project are then given a time frame to complete their assigned work according to the high-level effort estimation. However, it has been observed that software projects routinely fail to be completed within the estimated time frame even after considering a risk analysis approach to estimate the duration of the project. This failure is typically attributed to the uncertainty inherent in human activity, the lack of inputs received from the users/engineers for effort estimation, and the like. The failure also occurs due to the non-availability of data related to task-based efforts expended by the user.
[0005] Such failures impact the reputation of the organization and also the users' individual performance appraisal processes when they are evaluated on their assigned work. Hence, there exists a strong need for a reliable approach to estimating the effort required to complete tasks in the software project.
SUMMARY
[0006] Various methods, systems and devices for completing a task of a software project assigned to a user possessing one or more software key skills are disclosed.
[0007] In an embodiment, a method for completing a task of a software project assigned to a user possessing one or more software key skills is disclosed. The method includes receiving, by a processor, one or more project attributes and one or more task attributes. The method then enables selection of a proxy task from a plurality of proxy tasks based on a similarity index of a scope of the task and a definition of the proxy task. The plurality of proxy tasks is associated with a plurality of task definitions. Further, the method enables selection of a complexity of the task based on complexity factors associated with the task. The method thereafter identifies a prior task from a plurality of prior tasks based on the one or more attributes associated with the software project, the one or more attributes associated with the task, the selected proxy task, the selected complexity of the task, and the one or more software key skills possessed by the user. Further, the method determines an effort for completion of the task based on a prior effort required for completing the prior task.
[0008] In another embodiment, an effort estimation system for completing a task of a software project assigned to a user possessing one or more software key skills is disclosed. The system includes a memory and a processor. The memory stores instructions for effort estimation. The memory also includes a plurality of proxy tasks, a plurality of prior tasks, one or more project attributes, one or more task attributes and a plurality of proxy task definitions. The processor is operatively coupled with the memory to fetch instructions from the memory for effort estimation. The processor is configured to receive one or more attributes associated with the software project and one or more attributes associated with the task. The processor then enables selection of a proxy task from a plurality of proxy tasks based on a similarity index of a scope of the task and a definition of the proxy task. The processor also enables selection of a complexity of the task based on complexity factors associated with the task. Thereafter, the processor identifies a prior task from a plurality of prior tasks based on the one or more attributes associated with the software project, the one or more attributes associated with the task, the selected proxy task, the selected complexity of the task, and the one or more software key skills possessed by the user. The processor further determines an effort for completion of the task based on a prior effort required for completing the prior task.
[0009] In yet another embodiment, an effort estimation computing device for completing a task of a software project by a user possessing one or more software key skills is disclosed. The computing device includes an Input and Output (I/O) device and a processor. The I/O device receives one or more software project attributes, one or more task attributes, a proxy task and a complexity of the task. The processor identifies a prior task from a plurality of tasks based on the proxy task, the complexity of the task, the one or more attributes associated with the task and the one or more skills associated with the user. The processor then determines an effort for completion of the task based on a prior effort required for completion of the prior task.
[0010] In still another embodiment, a non-transitory, computer-readable storage medium storing computer-executable program instructions to implement a method for completing a task of a software project by a user possessing one or more software key skills is provided. The computer implemented method accesses the task assigned to the user to obtain a scope of the task, one or more project attributes, one or more task attributes and complexity factors associated with the task. The computer implemented method then identifies a proxy task from a plurality of proxy tasks based on a similarity index of the scope of the task and a definition of the proxy task. The plurality of proxy tasks is associated with a plurality of task definitions. The computer implemented method further determines a complexity of the task based on the complexity factors associated with the task. Thereafter, the computer implemented method identifies a prior task from a plurality of prior tasks based on the one or more attributes associated with the software project, the one or more attributes associated with the task, the selected proxy task, the selected complexity of the task, and the one or more software key skills possessed by the user. Further, the computer implemented method determines an effort for completion of the task by the user based on a prior effort required for completing the prior task.
[0011] Other aspects and example embodiments are provided in the drawings and the detailed description that follows.
BRIEF DESCRIPTION OF THE FIGURES
[0012] For a more complete understanding of example embodiments of the present technology, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
[0013] FIG. 1 illustrates an exemplary block diagram depicting an effort estimation and performance analysis method, in accordance with an exemplary embodiment of the present invention;
[0014] FIG. 2 illustrates a block diagram depicting association of different tasks with different users, in accordance with an exemplary embodiment of the present invention;
[0015] FIG. 3 illustrates a simplified representation of project attributes assignment, in accordance with an exemplary embodiment of the present invention;
[0016] FIG. 4 illustrates a simplified representation of task attributes assignment, in accordance with an exemplary embodiment of the present invention;
[0017] FIG. 5 illustrates a simplified representation of proxy selection, in accordance with an exemplary embodiment of the present invention;
[0018] FIG. 6 illustrates a simplified representation of complexity assignment, in accordance with an exemplary embodiment of the present invention;
[0019] FIGS. 7A and 7B illustrate screenshots for project attributes assignment, task attributes assignment, and capabilities assignment, in accordance with an exemplary embodiment of the present invention;
[0020] FIG. 8 illustrates a block diagram depicting efforts estimation for a task assigned to a user, in accordance with an exemplary embodiment of the present invention;
[0021] FIG. 9 illustrates a block diagram depicting effort estimation and performance analysis method, in accordance with an exemplary embodiment of the present invention;
[0022] FIG. 10 illustrates an environment, in accordance with an exemplary embodiment of the present invention;
[0023] FIG. 11 illustrates a system for effort estimation, in accordance with an exemplary embodiment of the present invention;
[0024] FIG. 12 illustrates a method for effort estimation, in accordance with an exemplary embodiment of the present invention;
[0025] FIG. 13 illustrates a method for effort estimation and performance analysis, in accordance with an exemplary embodiment of the present invention;
[0026] FIG. 14 illustrates an effort estimation system, in accordance with an exemplary embodiment of the present invention;
[0027] FIG. 15 illustrates an architecture for performing methods as described herein, in accordance with an exemplary embodiment of the present invention;
[0028] FIGS. 16A and 16B illustrate screenshots for personal estimator and tracker, in accordance with an exemplary embodiment of the present invention; and
[0029] FIG. 17 illustrates a system for performing methods as described herein, in accordance with an exemplary embodiment of the present invention.
[0030] The drawings referred to in this description are not to be understood as being drawn to scale except if specifically noted, and such drawings are only exemplary in nature.
DETAILED DESCRIPTION
[0031] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure can be practiced without these specific details. In other instances, apparatuses and methods are shown in block diagram form only in order to avoid obscuring the present disclosure.
[0032] Reference in this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.
[0033] Moreover, although the following description contains many specifics for the purposes of illustration, anyone skilled in the art will appreciate that many variations and/or alterations to said details are within the scope of the present disclosure. Similarly, although many of the features of the present disclosure are described in terms of each other, or in conjunction with each other, one skilled in the art will appreciate that many of these features can be provided independently of other features. Accordingly, this description of the present disclosure is set forth without any loss of generality to, and without imposing limitations upon, the present disclosure.
[0034] FIG. 1 illustrates an exemplary block diagram 100 depicting an effort estimation and performance analysis method, in accordance with an exemplary embodiment of the present invention. The effort estimation and performance analysis method may be performed either by a user device, or by a server, or by a combination thereof. An application that supports various workflows may be present on the user device or the server or on both. Examples of the user device include a mobile phone, smartphone, tablet, laptop, etc. [0035] At block 102, a task is associated with a user. The task corresponds to a part or a portion of a software project 202 as shown in FIG. 2. FIG. 2 illustrates a block diagram 200 depicting association of different tasks with different users, in accordance with an exemplary embodiment of the present invention. The software project 202 includes one or more tasks 206. A task includes or can be one or more units of work of the software project 202. The tasks 206 can be created by a project owner or a project manager or any other person authorized to do so. The tasks 206 are then assigned to one or more users 216. In one embodiment, the tasks 206 are granularized to such an extent that one task is assigned to one user. For example, task (T1) 208 (hereinafter alternatively referred to as task 208) is assigned to a user (U1) 218 (hereinafter alternatively referred to as user 218), task (T2) 210 is assigned to a user (U2) 220, and successively, task (Tn) 214 is assigned to a user (Un) 224. The remaining disclosure is explained using the task 208 as an example, and similar steps or functionalities are performed for the other tasks too, either simultaneously or one after the other or as appropriate. In another embodiment, a mapping of many tasks to many users may also exist.
[0036] At block 104, effort estimation for the task 208 is carried out and the effort required to complete the task 208 is estimated. The effort estimation includes comparing one or more attributes of the task 208 with those of tasks stored in a knowledge database. The stored task having the required or a predetermined number of attributes matching the attributes of the task 208 is identified as the stored task similar to the task 208. The effort, including the time taken, spent in performing the stored task is then estimated as the effort needed for completion of the task 208. A similar process of effort estimation is performed for all tasks, one after the other, simultaneously, or in any other way possible. The effort estimation includes various steps and is explained in detail later.
[0037] At block 106, a footprint is created. The footprint creation includes tracking actual progress of the user 218 on the task 208.
[0038] At block 108, the performance progress is viewed.
[0039] The footprint creation or the performance progress or the both includes comparing the tracked progress with the effort estimates to determine performance.
[0040] FIG. 3 illustrates a simplified representation 300 of project attributes assignment, in accordance with an exemplary embodiment of the present invention. As explained with reference to FIGS. 1 and 2, the task 208 is associated with the user 218. The associations can be performed by the project manager based on one or more key skills of the user 218 known to the project manager. Similarly, other tasks are associated with other users. In one embodiment, the task assignor may call the users for a meeting to fill in relevant details in the application. In another embodiment, the users may be present at remote places and may fill the details in the application.
[0041] At 302, project attributes are assigned for the task 208. A screenshot 304 shows the project attributes on a user device associated with the user 218. The attributes are assigned by the user 218. Assigning the attributes includes providing a project name 306, defining a role 308, assigning a platform 310, selecting an application 312, and mentioning a lifecycle type 314. Examples of the lifecycle type 314 include waterfall type and spiral type. The project attributes include a number of project components such as project name, role, platform, etc.
[0042] Consider a scenario wherein a company project manager, representing a team of 20 employees and managing a 'Customer Accounts' (CA) project, identifies a team member, for example, the user 218, to develop an online client accounting screen. Thus, developing the online client accounting screen for the customer may be considered as the task 208. The onus of developing the task 208 is now upon the user 218. The user 218 installs the application, or uses an already installed application, on his personal electronic device, for example, a user device, to estimate efforts for developing the task 208 assigned by the project manager. The user 218 provides one or more inputs once the user 218 has logged into the application. For example, details like role, name, and location might be prompted by the application for the user 218 to address. The application might also prompt the user 218 with questions for more details regarding the work environment, for example, operating environment, acquisition type, application type, and language.
[0043] FIG. 4 illustrates a simplified representation 400 of task attributes assignment, in accordance with an exemplary embodiment of the present invention. In one embodiment, the user 218 assigns task attributes 402 to the task 208 as shown in a screenshot 404. In another embodiment, the task attributes 402 can be auto-populated based on task definitions in standard project management systems, and the user 218 can then validate them. The task attributes 402 include a number of task components. For example, the task attributes 402 include a task name 406, an activity type 408, a proxy 410 (also referred to as proxy task), a task complexity 412 (also referred to as complexity of the task), a software language 414, a capability 416, a start date 418, and a task description 420.
[0044] The proxy 410 indicates functionality that needs to be performed and is explained in detail with reference to FIG. 5. The project herein is CA, which stands for 'Customer Accounts'. For example, the proxy 410 may be a screen that displays the details of the client online account with no calculated data. In another example, the proxy 410 may also be a form, wherein the form is an application screen on which an end user of the software project 202 provides data input. In yet another example, the proxy 410 may also be an interface, i.e., a boundary across which two independent systems meet and act on or communicate with each other. Thus, the screen, the form, and the interface are various proxies that may be associated with the project CA.
[0045] The task complexity 412 indicates complexity of the task and can have values like "Low", "Nominal", "High", "Very High", "Small", "Medium", "Huge", "Easy", "Tough", "Complex", "Simple", and various other values. The capability 416 indicates expertise or capability of the user 218 for performing the task 208. As exemplarily shown, the task name 406 is 'Test', the activity type 408 is 'No Knowledge', the proxy 410 is 'Form', the task complexity 412 is 'High', the software language 414 is 'Java', the capability 416 of the user 218 is 'Low', and the start date 418 of the task is 'August 10, 2017'. The task complexity 412 of the task 208 is selected by the user 218 (by providing a user input) and the selection is then enabled by the processor in response to the user input. The task complexity 412 is selected based on various complexity factors. The complexity factors include at least one of a number of data elements, a number of tables, a number of function points, a number of interfaces, a number of software classes, a number of data structures, a number of inputs required, a number of outputs, a number of objects, a number of algorithms, a number of items, a number of test cases, and a number of transactions.
[0046] FIG. 5 illustrates a simplified representation 500 of proxy selection, in accordance with an exemplary embodiment of the present invention. Various options for selecting the proxy 410 are available to the user 218. Table 502 shows examples of the proxy 410 and corresponding definitions. For example, option 504 names the proxy "API" (Application Program Interface) and its corresponding definition is shown in the next column of table 502. Similarly, other proxy options along with their definitions, such as "Batch" (see, 506), "Defect" (see, 508), "Form" (see, 510), "Screen" (see, 512), "Use case points" (see, 514), and "User workflow" (see, 516), are shown in Table 502.
[0047] In continuation of the scenario described with reference to FIG. 3 and FIG. 4, developing the user interface for password encryption for login functionality for the client account can be considered as the task 208. Thus, the task 208 may be writing software code, creating a software design, creating test data, etc. Any technical work that a software engineer, developer, or tester performs in order to produce a deliverable is defined as the task 208. Further, the task 208 is closely associated with the proxy 410 and represents one type of work. Most tasks are defined to be small, representing no more than a few hours to a few days of work. Also, each of the tasks 206 is a component which is a part of a business function, is approved by the project manager, and is a part of a standard list of components.
[0048] In some embodiments, the one or more proxies are used as an approximation or indirect measure of the task 208 at hand. The one or more proxies are closely related to the task type and are available at the beginning of the task 208. For example, if the task 208 is to build a form which accepts inputs, then the user 218 must choose the proxy 410 which is closest to this definition. The selection of the proxy 410 is enabled based on a similarity index of the scope of the proxy 410 and the associated definition of the proxy 410. The similarity index is a measure of similarity between the scope of the proxy 410 and the associated definition. The enabling is performed in response to receipt of a user input selecting the proxy 410. The user 218 knows the scope of the task 208 and, based on his knowledge, selects the proxy 410 whose associated definition is similar to the scope. In some embodiments, the one or more proxies can help in creating mental pictures and can provide a level of abstraction. When engineers estimate task effort, they feel more secure about their estimates and associated commitments due to the one or more proxies.
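One way to realize the similarity index between a task's scope and the proxy definitions is a token-overlap (Jaccard) score; the scoring method and the abbreviated definitions below are assumptions for illustration only.

```python
# Hypothetical similarity index: Jaccard overlap between the words of the
# task's scope description and each proxy definition. The proxy definitions
# are abbreviated paraphrases of table 502, not the patent's exact text.

def similarity_index(scope, definition):
    """Jaccard similarity between two short texts, in [0, 1]."""
    a, b = set(scope.lower().split()), set(definition.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

PROXY_DEFINITIONS = {
    "Form": "an application screen where an end user provides data input",
    "Screen": "a screen that displays details with no calculated data",
    "API": "an application program interface called by other programs",
}

def suggest_proxy(scope):
    """Pick the proxy whose definition best matches the task scope."""
    return max(PROXY_DEFINITIONS,
               key=lambda p: similarity_index(scope, PROXY_DEFINITIONS[p]))

print(suggest_proxy("build a form that accepts data input from an end user"))
# Form
```

In practice a more robust text-similarity measure could be substituted; the point is only that the proxy with the highest similarity index is offered to the user.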
[0049] FIG. 6 illustrates a simplified representation 600 of complexity assignment, in accordance with an exemplary embodiment of the present invention. The task complexity 412 is shown in Table 602. Table 602 shows three complexities, i.e. Low complexity 616, Nominal complexity 618, and High complexity 620, as an example. Various rows of the table 602 show mapping of the task complexity 412 to the proxy 410. For example, row 604 shows the "API" proxy (see, 504 of FIG. 5) for which the user 218 needs to assign Low complexity 616 if members are less than 5 and functions are less than 2, assign Nominal complexity 618 if members are between 5 and 12 and functions are 3 or 4, or assign High complexity 620 if the members are greater than 12 and functions are greater than 4. Similarly, other rows show different proxies and corresponding criteria for the task complexity 412. The task complexity 412 can be based on various factors, for example, data structures, size of task, control structures, and reusable code.
[0050] Row 606 and row 608 respectively show the proxies "Batch" (see, 506 of FIG. 5) and "Defect" (see, 508 of FIG. 5) for which the user 218 needs to assign Low complexity 616 if data elements are less than 5 and tables/interfaces are less than 2, assign Nominal complexity 618 if data elements are between 5 and 12 and tables/interfaces are 3 or 4, or assign High complexity 620 if the data elements are greater than 12 and tables/interfaces are greater than 4. Similarly, row 612 shows the proxy "Screen" (see, 512 of FIG. 5) for which the user 218 needs to assign Low complexity 616 if data elements are less than 5, assign Nominal complexity 618 if data elements are between 5 and 12, or assign High complexity 620 if the data elements are greater than 12.
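The threshold rules of rows 604 and 612 can be expressed directly in code; the handling of boundary values that the table leaves ambiguous (e.g., an API with exactly 2 functions) is an assumption here.

```python
# Threshold rules following rows 604 ("API") and 612 ("Screen") of table 602.
# Boundary cases not explicitly covered by the table are resolved by
# assumption: values between the Low and High bands fall into Nominal.

def screen_complexity(data_elements):
    """Row 612: complexity of a 'Screen' proxy from its data-element count."""
    if data_elements < 5:
        return "Low"
    if data_elements <= 12:
        return "Nominal"
    return "High"

def api_complexity(members, functions):
    """Row 604: complexity of an 'API' proxy from members and functions."""
    if members < 5 and functions < 2:
        return "Low"
    if members <= 12 and functions <= 4:
        return "Nominal"
    return "High"

print(screen_complexity(4), screen_complexity(8), screen_complexity(13))
# Low Nominal High
```

The "Batch" and "Defect" rows (606, 608) would follow the same pattern with data elements and tables/interfaces as the inputs.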
[0051] Further, classifying the task complexity 412 and the skills needed for the task 208 into low, nominal, or high can be performed with reference to guidelines made by the user 218 or the project manager, which define the task complexity 412 based on inputs, outputs, interfaces, etc. This data can be stored in the knowledge base 804 (as explained later with reference to FIG. 8) or any other database.
[0052] FIGS. 7A and 7B illustrate screenshots for project attributes assignment, task attributes assignment, and capabilities assignment, in accordance with an exemplary embodiment. FIG. 7A illustrates the screenshot 304 for project attributes assignment and the screenshot 404 for task attributes assignment. As explained with reference to FIG. 3, the project attributes assigned by the user 218 include providing project name 306, defining role 308, assigning a platform 310, selecting an application 312, and mentioning lifecycle type 314. Similarly, as explained with reference to FIG. 4, the task attributes 402 include the task name 406, the activity type 408, the proxy 410, the task complexity 412, the software language 414, the capability 416, the start date 418, and the task description 420. FIG. 7A further shows the user/my capabilities assignment at 702 (also shown as 416 in FIG. 4).
[0053] FIG. 7B shows a screenshot 752 for task attributes assignment, and screenshots 754 and 756 for my capabilities assignment. In one embodiment, the application also collects skill related parameters associated with the user 218 in order to perform the task 208. The skill level may vary based on the work experience of the user 218. For example, if a senior programmer working on the task 208 takes leave and a junior programmer takes his place, the actual amount of time for programming the task may be 50% higher than the original estimation, as the senior programmer may have had the expertise and past experience in performing the task. As shown in the screenshot 754, the factors taken into consideration when looking at the capability 416 may be language proficiency, domain proficiency, experience of performing similar tasks in the past, and educational experience. As another example of capability or skills assessment, the user 218 may know how to code in Java but may not know how to develop the product in the financial services domain; hence, the user 218 marks his capability as 'Low' as shown in the screenshot 756.
[0054] FIG. 8 illustrates a block diagram 800 depicting effort estimation for a task assigned to a user, in accordance with an exemplary embodiment of the present invention. After the user 218 assigns the task attributes at 402 for the task 208, an algorithm, such as a fuzzy logic based algorithm, is triggered at 802. The algorithm fetches historical data from a knowledge base 804 and determines a stored task that has a predetermined number of attributes matching those of the task 208. The knowledge base 804 can be created by collating data from various online or offline resources. Online resources include the inputs provided by the users or professionals via the application. Offline resources include the inputs collected manually from different people via paper surveys etc. and then fed into the knowledge base 804. It is to be appreciated that the knowledge base 804 can be created using any other possible means. The knowledge base 804 can be managed by an entity other than the one managing the application or by the same entity. The knowledge base 804 can also be provided as a separate offering. The knowledge base 804 can be present at a location near the application or remote from the application. The knowledge base 804 can be hosted in an intranet environment, an internet environment, or a combination of both. After the stored task (prior task) is determined, the prior efforts (time etc.) spent on the stored task are used to determine the efforts required for the task 208 and are assigned to the task 208 as the determined efforts.
[0055] The knowledge base 804 can include data for the user 218 and also for other users. In one embodiment, the algorithm 802 can be a machine learning algorithm that considers data related to the user 218 for determining the similar stored task. In another embodiment, the algorithm 802 can consider data related to any user. Different weightages can be assigned to tasks done by different users, with the same user's tasks getting higher weightages, as they represent a more accurate estimation.
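The weighting idea of this paragraph, i.e. favoring the same user's prior tasks over other users' tasks, can be sketched as a weighted average; the weight values used below are illustrative assumptions.

```python
# Sketch of the weighting described in paragraph [0055]: prior tasks by the
# same user contribute more to the estimate than tasks by other users.
# The weights (3.0 vs 1.0) are illustrative, not values from the disclosure.

def weighted_estimate(prior_tasks, current_user,
                      same_user_weight=3.0, other_weight=1.0):
    """Weighted average of prior efforts, favoring the same user's history."""
    num = den = 0.0
    for t in prior_tasks:
        w = same_user_weight if t["user"] == current_user else other_weight
        num += w * t["effort_hours"]
        den += w
    return num / den if den else None

priors = [
    {"user": "U1", "effort_hours": 20},  # same user's prior task
    {"user": "U2", "effort_hours": 40},  # another user's prior task
]
print(weighted_estimate(priors, "U1"))  # 25.0
```

With equal weights the estimate would be 30 hours; weighting U1's own history three times as heavily pulls it down to 25, reflecting that the user's own past is the more accurate predictor.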
[0056] FIG. 9 illustrates a block diagram 900 depicting an effort estimation and performance analysis method, in accordance with an exemplary embodiment of the present invention. A project attributes module 902 assigns project attributes 302 for the task 208. A task attributes module 904 assigns task attributes 402 for the task 208. The modules assign the attributes in response to the inputs received from the user 218. A proxy association module 906 associates the proxy 410 to the task 208. A complexity module 908 assigns the task complexity 412 to the task 208. A user/my capability assignment module 910 assigns the capability 416 to the task 208. In some embodiments, each module can be implemented using a separate processor, a combination of processors, or a single processor. Also, the proxy association module 906, the complexity module 908, and the user capability assignment module 910 can be a part of the task attributes module 904.
[0057] After the attribute allocation, an algorithm module 912 determines a similar stored task that matches a predetermined number of attributes of the task 208. In some embodiments, the matching can be performed for certain selected attributes, such as the proxy, while in other embodiments all or some of the attributes can be used for matching. The user 218 or the project manager can drive the criteria for matching. The determining of the similar stored task is performed using data accessible through a database module 914 and an archive module 916. The database module 914 and the archive module 916 have access to the knowledge base 804.
[0058] A prior task effort assigning module 918 assigns the efforts of the similar stored task to the task 208. In some embodiments, the efforts assigned to the task 208 use the prior efforts as an input to determine the efforts to be assigned to the task 208; hence, the determined efforts may not be exactly the same as the prior efforts but are based on them. The assignment of the similar stored task includes assigning the efforts, i.e. time etc., to the task 208 and hence to the user 218.
[0059] An actual effort tracking module 920 tracks the actual efforts spent by the user 218. The actual efforts can be inputted by the user 218 or can be fetched from a human resources application or any other employer application that tracks the time and efforts of the user 218. A performance analysis module 922 then compares the determined efforts with the actual efforts and, based on the comparison, provides an assessment for the user 218 and for the task 208. The assessment can be in the form of a performance metric which is analyzed. For example, whether the task 208 is on time or not, and whether the user 218 is productive or not, can be obtained as output from the performance analysis module 922. In some embodiments, the performance analysis module 922 can also provide a comparative analysis for the user 218 against other users who performed the similar stored task, against team members of the user 218, etc. It is to be appreciated that any other comparison or output can be provided.
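The comparison performed by the performance analysis module 922 can be sketched as follows; the specific metric (estimated divided by actual effort) and the verdict strings are assumptions, not the module's actual definitions.

```python
# Minimal sketch of the estimated-versus-actual comparison of module 922.
# The ratio metric and verdict strings are illustrative assumptions.

def performance_metric(estimated_hours, actual_hours):
    """Ratio > 1.0 means the task finished faster than estimated."""
    return estimated_hours / actual_hours

def assess(estimated_hours, actual_hours):
    """Turn the ratio into a simple on-time verdict."""
    ratio = performance_metric(estimated_hours, actual_hours)
    return "on time or early" if ratio >= 1.0 else "over estimate"

print(assess(24, 20))  # on time or early
print(assess(24, 32))  # over estimate
```

A real implementation would track such ratios per task and per user, feeding the trend views shown later in FIGS. 16A and 16B.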
[0060] Consider that the user 218 has tracked his performance with the help of the application over the past 6 months and has observed a steady increase in productivity in developing tasks of the proxy 410 type 'Screen' (see, 512 of FIG. 5). The user 218 observes that his performance has benefitted the team's overall execution of the project CA. The user 218 now wants to compare his productivity with other teams in different locations for the proxy 410 'Screen' (see, 512 of FIG. 5) and the software language 414 'Java'. The application thus provides a facility wherein the user 218 can join communities of users of this application and compare his productivity with his peers, friends, or globally. The application consolidates all the individual productivity and displays it electronically to the user 218, showing how his performance is rated compared to his peers. For example, the application displays that within the 'Team' community the user 218 is within the top 40 percentile, and when compared against the 'India' community the user 218 is in the top 20 percentile. This result displayed by the application shows the assessment from which the user 218 can benefit due to objective feedback. [0061] FIG. 10 illustrates an environment 1000, in accordance with an exemplary embodiment of the present invention. The environment 1000 includes one or more users 1002. Each user has a computing device 1004 including an input/output device 1006 and a processor 1008. The processor 1008 runs the application as described herein. The users 1002 access an engine 1010 that includes association of attributes to the tasks 206 (as exemplarily depicted by attributes E1 and A11 to A1n associated with Task 1, En and An1 to Ann associated with Task n, etc.) and corresponding processing to identify effort estimates for the tasks 206 using a storage device 1012 including historical data.
The "effort estimates for the users" 1014 to perform the tasks 206 are then provided to the respective users. Also, analysis 1016 is performed and made available to the users 1002 to indicate their performances.
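The community percentile comparison described in paragraph [0060] can be sketched as a simple ranking; the productivity figures and the percentile convention ("top N percentile" as the fraction of peers at or above the user) are illustrative assumptions.

```python
# Sketch of the community comparison in paragraph [0060]: rank a user's
# productivity against a community and report the "top N percentile".
# The productivity values and the percentile convention are assumptions.

def top_percentile(user_value, community_values):
    """Percentage of the community at or above the user's productivity."""
    better_or_equal = sum(1 for v in community_values if v >= user_value)
    return 100 * better_or_equal / len(community_values)

team = [1.4, 1.2, 1.1, 0.9, 0.8]  # hypothetical productivity scores
print(top_percentile(1.1, team))  # 60.0 -> within the top 60 percentile
```

Running the same computation against a larger 'India' community list would yield the second figure quoted in the example.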
[0062] FIG. 11 illustrates a system 1100 for effort estimation, in accordance with an exemplary embodiment of the present invention. The project attributes 1102 (such as the project attributes 302 of FIG. 3) are used as an input for an analogy based estimation 1104. The analogy based estimation 1104 of efforts also uses information from a knowledge base 1106. The knowledge base 1106 receives data or time spent actuals 1110 for the user 218. The actuals 1110 include the actual time spent by the user 218 on the task 208. The information from the knowledge base 1106 is then used for analytics 1108. The analytics 1108 include performance analysis for the user 218.
[0063] FIG. 12 illustrates a method 1200 for effort estimation, in accordance with an exemplary embodiment of the present invention.
[0064] At step 1202, the method starts.
[0065] The method receives one or more project attributes 302 and one or more task attributes 402 at step 1204. The attributes are received as an input from the user 218 using his/her user device.
[0066] At step 1206, the method enables selection of a proxy 410 for the task 208 and other tasks. A task complexity 412 is also selected at step 1206.
[0067] At step 1208, a prior task (a similar stored task) is identified by matching the attributes of the stored tasks with those of the task 208. The stored task having similar attributes to the task 208, or having at least a predetermined number or type of similar attributes, is identified as the prior task.
[0068] At step 1210, the effort for completion of the task 208 is then determined. The effort is determined as the effort corresponding to the prior task. Since the prior task has attributes similar to those of the task 208, the user 218 is likely to spend similar effort in performing the task 208.
[0069] The method stops at step 1212.
[0070] FIG. 13 illustrates a method 1300 for effort estimation and performance analysis, in accordance with an exemplary embodiment of the present invention.
[0071] The method starts at step 1302.
[0072] At step 1304, the project attributes 302 inputted by the user 218, auto-populated by the application, or received by some other means are accepted. The screenshot 304 indicates the project attributes 302.
[0073] At step 1306, the task attributes 402 inputted by the user 218, auto-populated by the application, or received by some other means are accepted. The screenshot 404 indicates the task attributes 402.
[0074] At step 1308, the proxy 410 is associated with the task 208. As explained with reference to FIGS. 4 and 5, the proxy 410 indicates functionality that needs to be performed. For example, the proxy 410 may be a screen that displays the details of the client online account with no calculated data. In another example, the proxy 410 may also be a form, wherein the form is an application screen on which an end user of the software project 202 provides data input. In yet another example, the proxy 410 may also be an interface, i.e., a boundary across which two independent systems meet and act on or communicate with each other. Thus, the screen, the form, and the interface are various proxies that may be associated with the respective project.
[0075] At step 1310, the task complexity 412 is classified. The task complexity 412 can be low complexity 1312, nominal complexity 1314, or high complexity 1316. Other possibilities also exist.
[0076] At step 1318, the user capability/my capability 416 or 702 is classified. The user capability/my capability 416 or 702 can be low capability 1328, nominal capability 1330, or high capability 1332.
[0077] At step 1320, fuzzy logic based classification is performed to identify effort estimates and assign the same to the task 208. As explained with reference to FIG. 9, the algorithm module 912 determines a similar stored task that matches a predetermined number of attributes of the task 208. In some embodiments, the matching can be performed for certain selected attributes such as proxy etc., while in other embodiments all or some of the attributes can be used for matching.
[0078] At step 1322, actuals of the efforts spent by the user 218 are captured. As explained with reference to FIG. 9, the actual effort tracking module 920 tracks the actual efforts spent by the user 218. The actual efforts can be inputted by the user 218 or can be fetched from a human resources application or any other employer application that tracks the time and efforts of the user 218.
[0079] At step 1324, the analytics and performance analysis are performed. The productivity of the user 218 is determined as part of the analysis and outputted. As explained with reference to FIG. 11, the knowledge base 1106 receives actuals for the user 218. The actuals include the actual time spent by the user 218 on the task 208. The information from the knowledge base 1106 is then used for analytics and performance analysis for the user 218.
[0080] The method stops at step 1326.
[0081] FIG. 14 illustrates an effort estimation system 1402, in accordance with an exemplary embodiment of the present invention. The effort estimation system 1402 includes a memory 1404 for storing instructions for performing methods (e.g., methods 1200 and 1300) described herein. The effort estimation system 1402 also includes a processor 1406 which reads instructions from the memory 1404 for performing the methods (e.g., methods 1200 and 1300) described herein.
[0082] FIG. 15 illustrates an architecture 1500 for performing methods (e.g., methods 1200 and 1300) as described herein, in accordance with an exemplary embodiment of the present invention. The architecture 1500 shows the devices, for example, 1502, 1504, and 1506, of the users including the application. The application communicates via a gateway 1508 with the server including the functions for performing the methods described herein. The functions include compute functions 1510, query functions 1514, store functions 1518, and analytics functions 1522. The compute functions 1510 have access to a storage device (see, S3) 1512 and perform computations using the data from the storage device 1512 for computing the attributes. The query functions 1514 access data from Dynamo 1516 and from Mongo 1520 to query complexity and other attributes. The store functions 1518 store data into the Dynamo 1516 and the Mongo 1520. The analytics functions 1522 perform the analysis using the data from the Mongo 1520.
[0083] FIGS. 16A and 16B illustrate screenshots 1600A and 1600B for a personal estimator and tracker, in accordance with an exemplary embodiment of the present invention. The screenshot 1600A displays the performance trend associated with the user 218, using the mobile phone, for the proxy 410 'Screen'. Using this trend, the user 218 can check his/her progress by studying the graph of task screen vs. actual effort.
[0084] For example, considering the task for the proxy 410 'Screen', the graph 1 of the screenshot 1600A depicts that building screen 1 consumed 24 hours of effort time of the user 218. Alternatively, it can be said that to build screen 1 the user 218 put in only 24 hours of work time. Similarly, to build screen 2 the user 218 took 15 hours. For building screen 3 the user 218 took 18 hours. For building screen 4, the user 218 / the employee took 32 hours. And finally, for building screen 5, the user 218 took 12 hours.
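The per-screen efforts recited above can be summarized programmatically, which is essentially what the tracker's maximum/minimum view shows; the dictionary keys below are illustrative labels for the five screens.

```python
# The tracked efforts plotted in screenshot 1600A, summarized to the
# fastest and slowest builds. Hours are those given in paragraph [0084].

screen_hours = {"screen 1": 24, "screen 2": 15, "screen 3": 18,
                "screen 4": 32, "screen 5": 12}

fastest = min(screen_hours, key=screen_hours.get)
slowest = max(screen_hours, key=screen_hours.get)
print(fastest, screen_hours[fastest])  # screen 5 12
print(slowest, screen_hours[slowest])  # screen 4 32
```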
[0085] Thus, the screenshot 1600A displays to the user 218 that to build screen 5 he took only 12 hours, whereas to build screen 4 he took 32 hours. The maximum and minimum hours of effort can easily be seen using the estimator and tracker records in the tool on any digital screen onto which the tool is downloaded.
[0086] Similarly, FIG. 16B is the screenshot 1600B wherein the user 218 can check the different types of proxies and the languages he has worked on by specifying a time period. A part of the graph depicts that the user 218 has worked on 15 Screens, 32 Test Scripts, 9 Workflows, 22 Reports, and 8 Interfaces in the language Python in the last 14 months.
[0087] FIG. 17 illustrates a system 1700 (hereinafter alternatively referred to as machine 1700 / computer system 1700) for performing methods as described herein, in accordance with an exemplary embodiment of the present invention. FIG. 17 is a block diagram of a machine 1700, i.e. the user device or the server, in the example form of a computer system within which instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In alternative embodiments, the machine 1700 operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1700 may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 1700 may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
[0088] The example computer system 1700 includes a processor 1702 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 1704, and a static memory 1706, which communicate with each other via a bus 1708. The computer system 1700 may further include a video display unit 1710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 1700 also includes an alphanumeric input device 1712 (e.g., a keyboard), a user interface (UI) navigation device 1714 (e.g., a mouse), a disk drive unit 1716, a signal generation device 1718 (e.g., a speaker), and a network interface device 1720. The computer system 1700 may also include an environmental input device 1726 that may provide a number of inputs describing the environment in which the computer system 1700 or another device exists, including, but not limited to, any of a Global Positioning Sensing (GPS) receiver, a temperature sensor, a light sensor, a still photo or video camera, an audio sensor (e.g., microphone), a velocity sensor, a gyroscope, an accelerometer, and a compass.
Machine-Readable Medium
[0089] The disk drive unit 1716 includes a machine-readable medium 1722 on which is stored one or more sets of data structures and instructions 1724 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 1724 may also reside, completely or at least partially, within the main memory 1704, the static memory 1706, and/or within the processor 1702 during execution thereof by the computer system 1700, the main memory 1704 and the processor 1702 also constituting machine-readable media.
[0090] While the machine-readable medium 1722 is shown in an example embodiment to be a single medium, the term "machine-readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 1724 or data structures. The term "non-transitory machine-readable medium" shall also be taken to include any tangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present subject matter, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such instructions. The term "non-transitory machine-readable medium" shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of non-transitory machine-readable media include, but are not limited to, non-volatile memory, including by way of example, semiconductor memory devices (e.g., Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices), magnetic disks such as internal hard disks and removable disks, magneto-optical disks, and CD-ROM and DVD-ROM disks.
Transmission Medium
[0091] The instructions 1724 may further be transmitted or received over a computer network 1750 using a transmission medium. The instructions 1724 may be transmitted using the network interface device 1720 and any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, Plain Old Telephone Service (POTS) networks, and wireless data networks (e.g., WiFi and WiMAX networks). The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
[0092] As described herein, computer software products can be written in any of various suitable programming languages, such as C, C++, C#, Pascal, Fortran, Perl, Matlab (from MathWorks), SAS, SPSS, JavaScript, AJAX, and Java. The computer software product can be an independent application with data input and data display modules. Alternatively, the computer software products can be classes that can be instantiated as distributed objects. The computer software products can also be component software, for example Java Beans or Enterprise JavaBeans. Much functionality described herein can be implemented in computer software, computer hardware, or a combination.
[0093] Furthermore, a computer that is running the previously mentioned computer software can be connected to a network and can interface to other computers using the network. The network can be an intranet, internet, or the Internet, among others. The network can be a wired network (for example, using copper), telephone network, packet network, an optical network (for example, using optical fiber), or a wireless network, or a combination of such networks. For example, data and other information can be passed between the computer and components (or steps) of a system using a wireless network based on a protocol, for example Wi-Fi (IEEE standard 802.11 including its sub-standards a, b, e, g, h, i, n, et al.). In one example, signals from the computer can be transferred, at least in part, wirelessly to components or other computers.
[0094] It is to be understood that although various components are illustrated herein as separate entities, each illustrated component represents a collection of functionalities which can be implemented as software, hardware, firmware or any combination of these. Where a component is implemented as software, it can be implemented as a standalone program, but can also be implemented in other ways, for example as part of a larger program, as a plurality of separate programs, as a kernel loadable module, as one or more device drivers or as one or more statically or dynamically linked libraries.
[0095] It is to be appreciated that FIG. 17 shows an exemplary structure and it could vary. For example, in one embodiment the environment input devices 1720, the user interface navigation device 1714, the disk drive unit 1716, and the signal generation device 1718 are optional and may not be present.
[0096] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the present disclosure and its practical application, to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but such are intended to cover the application or implementation without departing from the spirit or scope of the claims of the present disclosure.

Claims

CLAIMS

I Claim:
1. An effort estimation method for completing a task of a software project assigned to a user possessing one or more software key skills, the method comprising: receiving, by a processor, one or more project attributes and one or more task attributes;
enabling, by the processor, selection of:
a proxy task from a plurality of proxy tasks based on a similarity index of a scope of the task and a definition of the proxy task, the plurality of proxy tasks associated with a plurality of task definitions; and
a complexity of the task based on a plurality of complexity factors associated with the task;
identifying, by the processor, a prior task from a plurality of prior tasks based on:
the one or more attributes associated with the software project, the one or more attributes associated with the task,
the selected proxy task,
the selected complexity of the task, and
the one or more software key skills possessed by the user; and determining, by the processor, an effort for completion of the task based on a prior effort required for completing the prior task.
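The core of claim 1 — selecting a proxy task by a similarity index between the task's scope and the proxy definitions, then estimating effort from a matching prior task — can be sketched as follows. The Jaccard word-overlap similarity, the flat dictionary records, and all names (`PROXY_TASKS`, `effort_hours`, and so on) are illustrative assumptions made for this sketch; the claim does not prescribe any particular similarity index or data model.

```python
# Illustrative sketch of the claim 1 pipeline: pick the proxy task whose
# definition best matches the task scope, then estimate effort from prior
# tasks matching the proxy, complexity, and user skills. All field names
# and the similarity measure are assumptions, not part of the claim.

def similarity(scope: str, definition: str) -> float:
    """Jaccard word-overlap between a task scope and a proxy definition."""
    a, b = set(scope.lower().split()), set(definition.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def select_proxy(scope, proxy_tasks):
    """Return the proxy task with the highest similarity index to the scope."""
    return max(proxy_tasks, key=lambda p: similarity(scope, p["definition"]))

def estimate_effort(task, proxy, prior_tasks):
    """Average the effort of prior tasks matching proxy, complexity, skills."""
    matches = [p for p in prior_tasks
               if p["proxy"] == proxy["name"]
               and p["complexity"] == task["complexity"]
               and set(task["skills"]) & set(p["skills"])]
    candidates = matches or prior_tasks   # fall back to all prior tasks
    return sum(p["effort_hours"] for p in candidates) / len(candidates)

PROXY_TASKS = [
    {"name": "build-report", "definition": "build a summary report screen"},
    {"name": "data-import",  "definition": "import external data into tables"},
]
PRIOR_TASKS = [
    {"proxy": "data-import", "complexity": "high", "skills": ["java"], "effort_hours": 40},
    {"proxy": "data-import", "complexity": "high", "skills": ["java"], "effort_hours": 32},
]

task = {"scope": "import customer data into staging tables",
        "complexity": "high", "skills": ["java", "sql"]}
proxy = select_proxy(task["scope"], PROXY_TASKS)
print(proxy["name"])                              # → data-import
print(estimate_effort(task, proxy, PRIOR_TASKS))  # → 36.0
```

Averaging over the matching prior tasks is one of many reasonable ways to derive "an effort ... based on a prior effort"; a production system could equally take the most recent or most similar single prior task.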
2. The method of claim 1, further comprising:
monitoring, by the processor, actual effort of the user for completion of the task; and
analyzing, by the processor, a performance metric of the user based on the actual effort and the determined effort for completion of the task.
3. The method of claim 1, wherein the selection of the complexity of the task comprises assigning a predefined category to the task, the predefined category comprises at least one of low, nominal, high, very high, small, medium, huge, easy and tough, and complex and simple.
4. The method of claim 1, wherein the complexity factors associated with the task include at least one of a number of data elements, a number of tables, a number of function points, a number of interfaces, a number of software classes, a number of data structures, a number of inputs required, a number of outputs, a number of objects, a number of algorithms, a number of items, a number of test cases and a number of transactions.
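Claims 3 and 4 together describe mapping counted complexity factors onto a predefined category. A minimal sketch of one such mapping is below; the naive sum and the threshold values are illustrative assumptions — the claims name only the factors and the categories, not an aggregation rule.

```python
# Hypothetical sketch of claims 3-4: aggregate counted complexity factors
# and assign one of the predefined categories. The unweighted sum and the
# thresholds are illustrative assumptions.

def task_complexity(factors: dict) -> str:
    """Assign a predefined category from counted complexity factors."""
    score = sum(factors.values())  # naive aggregate of all factor counts
    if score < 10:
        return "low"
    if score < 25:
        return "nominal"
    if score < 50:
        return "high"
    return "very high"

print(task_complexity({"data_elements": 4, "tables": 2, "interfaces": 1}))  # → low
print(task_complexity({"function_points": 30, "test_cases": 15}))           # → high
```

A weighted sum (e.g. interfaces counting more than data elements) would fit the claim language equally well.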
5. The method of claim 1, wherein the project attributes associated with the task include at least one of a name of the software project, a role of the user, a software platform used for the project, an application type and a software lifecycle model.
6. The method of claim 1, wherein the task attributes associated with the task include at least one of a task name, a proxy name, a task complexity, a software language, an acquisition type, a task start date, and a capability assignment.
7. The method of claim 1, wherein the task is one or more units of work in the software project.
8. The method of claim 1, wherein identifying the prior task comprises applying a machine learning algorithm, by the processor, to select the prior task from the plurality of prior tasks, wherein the machine learning algorithm is a fuzzy logic based algorithm.
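Claim 8 specifies that the prior task is selected by a fuzzy-logic-based algorithm. The sketch below illustrates the idea: each attribute comparison yields a membership degree in [0, 1] rather than a hard yes/no, and the prior task with the greatest aggregate degree is selected. The particular membership functions and the min-aggregation rule are illustrative assumptions, not details taken from the claim.

```python
# Sketch of claim 8's fuzzy-logic matching: per-attribute membership degrees
# in [0, 1], aggregated with min (a standard fuzzy AND), highest match wins.
# The membership functions and the min-rule are illustrative assumptions.

def degree_skill(task_skills, prior_skills):
    """Fraction of the task's key skills exercised by the prior task."""
    return len(set(task_skills) & set(prior_skills)) / len(set(task_skills))

def degree_complexity(a: str, b: str) -> float:
    """Graded closeness of two complexity categories on an ordered scale."""
    scale = ["low", "nominal", "high", "very high"]
    return max(0.0, 1.0 - abs(scale.index(a) - scale.index(b)) / (len(scale) - 1))

def select_prior(task, prior_tasks):
    """Return the prior task with the highest fuzzy match to the task."""
    def match(prior):
        return min(degree_skill(task["skills"], prior["skills"]),
                   degree_complexity(task["complexity"], prior["complexity"]),
                   1.0 if prior["proxy"] == task["proxy"] else 0.5)
    return max(prior_tasks, key=match)

new_task = {"skills": ["java", "sql"], "complexity": "high", "proxy": "data-import"}
priors = [
    {"proxy": "data-import", "complexity": "nominal", "skills": ["java"], "effort_hours": 24},
    {"proxy": "data-import", "complexity": "high", "skills": ["java", "sql"], "effort_hours": 40},
]
print(select_prior(new_task, priors)["effort_hours"])  # → 40
```

The benefit of the fuzzy formulation is graceful degradation: a prior task with a near-miss on complexity still scores well, instead of being discarded by an exact-match filter.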
9. The method of claim 1, further comprising:
enabling the user to select at least one of: the proxy task, the complexity of the task, a number of project components and a number of task components.
10. An effort estimation system for completing a task of a software project assigned to a user possessing one or more software key skills, the system comprising: at least one memory to store instructions for effort estimation, a plurality of proxy tasks, a plurality of prior tasks, one or more project attributes, one or more task attributes and a plurality of proxy task definitions; and
a processor coupled with the at least one memory, the processor is configured to:
receive one or more attributes associated with the software project and one or more attributes associated with the task;
enable selection of:
a proxy task from a plurality of proxy tasks based on a similarity index of a scope of the task and a definition of the proxy task, the plurality of proxy tasks is associated with a plurality of task definitions; and
a complexity of the task based on a plurality of complexity factors associated with the task;
identify a prior task from a plurality of prior tasks based on:
the one or more attributes associated with the software project, the one or more attributes associated with the task,
the selected proxy task,
the selected complexity of the task, and
the one or more software key skills possessed by the user; and determine an effort for completion of the task based on a prior effort required for completing the prior task.
11. The system of claim 10, wherein the processor is further configured to:
monitor actual effort of the user for completion of the task; and analyze a performance metric of the user based on the actual effort and the determined effort for completion of the task.
12. The system of claim 10, wherein the processor is further configured to assign a predefined category to the task, the predefined category includes at least one of low, nominal, high, very high, small, medium, huge, easy and tough, and complex and simple.
13. The system of claim 10, wherein the complexity factors associated with the task include at least one of a number of data elements, a number of tables, a number of function points, a number of interfaces, a number of software classes, a number of data structures, a number of inputs required, a number of outputs, a number of objects, a number of algorithms, a number of items, a number of test cases and a number of transactions.
14. The system of claim 10, wherein the project attributes associated with the task include at least one of a name of the software project, a role of the user, a software platform used for the project, an application type and a software lifecycle model.
15. The system of claim 10, wherein the task attributes associated with the task include at least one of a task name, a proxy name, a task complexity, a software language, an acquisition type, a task start date, and a capability assignment.
16. The system of claim 10, wherein the task is one or more units of work in the software project.
17. The system of claim 10, wherein the processor is further configured to apply a machine learning algorithm to select the prior task from the plurality of prior tasks, wherein the machine learning algorithm is a fuzzy logic based algorithm.
18. The system of claim 10, wherein the processor is further configured to enable the user to select at least one of: the proxy task, the complexity of the task, a number of project components and a number of task components.
19. An effort estimation computing device for completing a task of a software project by a user possessing one or more software key skills, the effort estimation computing device comprising:
an input and output (I/O) device to:
receive one or more software project attributes, one or more task attributes, a proxy task and a complexity of the task; and
a processor to:
identify a prior task from a plurality of prior tasks based on the proxy task, the complexity of the task, the one or more software project attributes associated with the task and the one or more software key skills associated with the user; and
determine an effort for completion of the task based on a prior effort required for completion of the prior task.
20. A non-transitory, computer-readable storage medium storing computer-executable program instructions to implement a method for completing a task of a software project by a user possessing one or more software key skills, the method comprising:
accessing the task to obtain:
a scope of the task,
one or more project attributes,
one or more task attributes, and
complexity factors associated with the task,
identifying a proxy task from a plurality of proxy tasks based on a similarity index of the scope of the task and a definition of the proxy task, the plurality of proxy tasks are associated with a plurality of task definitions;
determining a complexity of the task based on the complexity factors associated with the task;
identifying a prior task from a plurality of prior tasks based on:
the proxy task,
the one or more task attributes associated with the task, the one or more project attributes associated with the task, the complexity of the task, and
the one or more software key skills possessed by the user, and determining an effort for completion of the task by the user based on a prior effort required for completing the prior task.
PCT/IN2018/050434 2017-07-03 2018-07-03 Task based estimator and tracker WO2019008600A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201741023404 2017-07-03
IN201741023404 2017-07-03

Publications (1)

Publication Number Publication Date
WO2019008600A1 (en) 2019-01-10

Family

ID=64949863

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2018/050434 WO2019008600A1 (en) 2017-07-03 2018-07-03 Task based estimator and tracker

Country Status (1)

Country Link
WO (1) WO2019008600A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11321644B2 (en) 2020-01-22 2022-05-03 International Business Machines Corporation Software developer assignment utilizing contribution based mastery metrics
WO2023206589A1 (en) * 2022-04-30 2023-11-02 Citrix Systems, Inc. Intelligent task management

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005024580A2 (en) * 2003-09-03 2005-03-17 Electronic Data Systems Corporation Computer program for estimating software development effort
US20050114828A1 (en) * 2003-11-26 2005-05-26 International Business Machines Corporation Method and structure for efficient assessment and planning of software project efforts involving existing software

Similar Documents

Publication Publication Date Title
US11790180B2 (en) Omnichannel data communications system using artificial intelligence (AI) based machine learning and predictive analysis
CN106445652B (en) Method and system for intelligent cloud planning and decommissioning
US9767436B2 (en) System for managing formal mentoring programs
US11748422B2 (en) Digital content security and communications system using artificial intelligence (AI) based machine learning and predictive analysis
AU2019202437A1 (en) Generating a set of user interfaces
US20220292999A1 (en) Real time training
US20140108073A1 (en) System and method for populating assets to a maintenance management system
US20180218330A1 (en) Recommending future career paths based on historic employee data
US20160026347A1 (en) Method, system and device for aggregating data to provide a display in a user interface
US20220270021A1 (en) User-centric system for dynamic scheduling of personalised work plans
US20200134568A1 (en) Cognitive assessment recommendation and evaluation
US20210365856A1 (en) Machine Learning Platform for Dynamic Resource Management
US20160104260A1 (en) Practitioner Career Management Assessment Interviewer Method and Tool
US20210065049A1 (en) Automated data processing based on machine learning
WO2019008600A1 (en) Task based estimator and tracker
WO2016176377A1 (en) Opportunity surfacing machine learning framework
US20220036314A1 (en) System and method for providing employee driven mobility
US10984361B1 (en) Providing a set of social communication channels to a set of client devices
US10789559B2 (en) Virtually assisted task generation
US20120226529A1 (en) Resource availability and applicability mechanism
US11593104B2 (en) Methods and systems for monitoring contributors to software platform development
US20230058543A1 (en) Systems and methods relating to evaluating and measuring an experience using an experience index
US20180218306A1 (en) System, method and computer program product for a cognitive project manager engine
US20110276694A1 (en) Information technology resource management
US20200097870A1 (en) Work task commitment manager

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18827990

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18827990

Country of ref document: EP

Kind code of ref document: A1