EP1226523A4 - Method and estimator for providing operations maturity model assessment - Google Patents

Method and estimator for providing operations maturity model assessment

Info

Publication number
EP1226523A4
Authority
EP
European Patent Office
Prior art keywords
omm
assessment
function
recited
estimating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP00973433A
Other languages
German (de)
French (fr)
Other versions
EP1226523A1 (en)
Inventor
Nancy Skoyles-Greenberg
William C Bond
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Accenture LLP
Original Assignee
Accenture LLP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Accenture LLP filed Critical Accenture LLP
Publication of EP1226523A1 publication Critical patent/EP1226523A1/en
Publication of EP1226523A4 publication Critical patent/EP1226523A4/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/80Management or planning

Definitions

  • IT Information Technology
  • Businesses need to balance technological capability with enterprise capability in order to become, or stay, a modern organization that has a chance of survival.
  • the IT framework must be a single framework for describing such IT management.
  • the IT framework should encompass all functions: a complete checklist of all relevant activities performed in an IT enterprise.
  • a single IT Framework should represent all functions operative in an IT enterprise.
  • OMM Operations Maturity Model
  • OMM assessment is a key function of an operations maturity model. Therefore, to remain competitive, there is a need for improved methods for providing operations maturity model assessment and an estimator for doing so.
  • one embodiment of the invention is a method for providing operations maturity model (OMM) assessment that includes planning, performing, and reporting an OMM assessment function for an IT organization.
  • OMM operations maturity model
  • the providing includes defining capability requirements, assessing current capabilities, identifying and resolving gaps, and developing OM capability blueprints for the OMM assessment function.
  • the defining step may include defining scope and objectives; executing and evaluating OMM questionnaires; obtaining work products and documentation; and creating and conducting OMM kickoff presentations.
  • the assessing step may include scheduling and conducting function interviews; analyzing work products and documentation; following up to solidify data; categorizing data by function; rating base practices; rating generic practices; consolidating data; and preparing function profiles.
  • the identifying and resolving step may include determining continuous improvement initiatives; identifying alternatives; estimating costs of improvements; assessing timing implications; and selecting continuous improvement initiatives to start.
  • the developing step may include prioritizing continuous improvement initiatives; developing capability delivery approach; preparing and presenting final results documentation and presentation; and modifying delivery plans as needed. Another aspect of the present invention is a method for providing an estimate for building an OMM assessment function in an information technology organization.
  • This aspect of the present invention allows an IT consultant to give on-site estimations to a client within minutes.
  • the estimator produces a detailed break down of cost and time to complete a project by displaying the costs and time corresponding to each stage of a project along with each task.
  • Another aspect of the present invention is a computer system for allocating time and computing cost for building an OMM assessment function in an information technology system.
  • FIG 1 shows a representation of the steps in a method for providing an operations maturity model (OMM) assessment system according to the presently preferred embodiment of the invention.
  • OMM operations maturity model
  • Figure 2 shows a representation of the tasks for defining OMM capability requirements for the method represented in Figure 1.
  • Figure 3 shows a representation of the tasks for assessing current capabilities for the method represented in Figure 1.
  • Figure 4 shows a representation of the tasks for identifying and resolving gaps for the method represented in Figure 1.
  • Figure 5 shows a representation of the tasks for developing OM capability blueprints for the method represented in Figure 1.
  • Figure 6 shows a flow chart for obtaining an estimate of cost and time allocation for a project.
  • Figures 7a through 7b show one embodiment of an estimating worksheet for an OM business recovery planning estimating guide.
  • an information technology (“IT”) enterprise may be considered to be a business organization, charitable organization, government organization, etc. that uses an information technology system with or to support its activities.
  • An IT organization is the group and associated systems and processes within the enterprise that are responsible for the management and delivery of information technology services to users in the enterprise.
  • multiple functions may be organized and categorized to provide comprehensive service to the user.
  • the various operations management functionalities within the IT framework include a customer service management function; a service integration function; a service delivery function; a capability development function; a change administration function; a strategy, architecture, and planning function; a management and administration function; a human performance management function; and a governance and strategic relationships function.
  • the complexity of the business environment demands that a company have a formal way of assessing its IT capabilities, as well as a specific and measurable path for improving those capabilities.
  • Management and administration provides the framework for effectively managing information technology enterprises using sound business principles and practices. Management and administration manages functions that are not always unique to the information technology portion of the enterprise; therefore, these functions are often performed outside of the information technology group.
  • management and administration include financial administration, quality administration, asset management, vendor management, facilities, regulatory compliance, and communications.
  • Quality management is a function within management and administration. Quality management monitors, across the enterprise, how well the IT environment is being managed and works towards continual improvement of IT capabilities and services. Quality management ensures that quality is put into every aspect of IT throughout the enterprise. Functions within quality management include tasks for the quality plan, quality metrics, external benchmarking, quality assurance review, and continuous improvement planning.
  • the present invention includes a method for providing an OMM assessment function and an estimator useful for determining the times and cost to provide such a function.
  • the OMM provides the basis for IT organizations to gauge performance, and will assist in planning and tracking improvements to the IT operations environment.
  • Operations Maturity is the extent to which the organization's processes are explicitly defined, managed, measured, controlled, and effective, and the consistency with which they are applied throughout the operations environment.
  • the operations environment dimension is characterized by a set of processes. Each process has a measurable purpose statement, which describes what has to be achieved in order to attain the defined purpose of the process.
  • the operations environment is partitioned into three elements: Process Categories, Functions and Base Practices.
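The three-level partition of the operations environment can be pictured as a containment hierarchy: Categories contain Functions, which contain Base Practices. The sketch below models that structure in Python; the class names and example instances are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class BasePractice:
    # An essential activity an organization performs to achieve a Function's purpose
    name: str

@dataclass
class Function:
    # A Function groups the base practices that achieve one defined purpose
    name: str
    base_practices: list = field(default_factory=list)

@dataclass
class ProcessCategory:
    # A Category is a logically related set of Functions in one area of activity
    name: str
    functions: list = field(default_factory=list)

# Hypothetical fragment of an operations environment
problem_mgmt = Function("Problem Management",
                        [BasePractice("Log incidents"),
                         BasePractice("Resolve trouble tickets")])
service_mgmt = ProcessCategory("Service Management", [problem_mgmt])
```

Walking such a hierarchy gives the "complete checklist of all relevant activities" the framework is meant to represent.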
  • the framework provides a basis for defining an objective improvement strategy in line with an IT organization's needs, priorities, and resource availability.
  • the OMM further provides a method for determining the overall operations maturity of an IT organization based on the quality and institutionalization of its processes.
  • the OMM can thus be used by IT organizations in a variety of contexts.
  • An IT organization can use the model to assess and improve its own processes.
  • An IT organization can also use the model to assess the capability of suppliers in meeting their commitments, and hence better manage the risk associated with outsourcing and sub-contract management.
  • the model can be used to focus on an entire IT organization, on a single functional area such as service management, or on a more focused area such as problem management.
  • the assessment process is used to appraise an organization's IT operations environment process capability.
  • the objective of the assessment is to identify the differences and the gaps between the actual implementations of the processes in the assessed IT operations organization with respect to the OMM. Defining a reference model ensures that results of assessments can be reported in a common context and provides the basis on which comparisons can be based.
  • An IT organization can perform an assessment for a variety of reasons.
  • An assessment can be performed in order to assess the processes in the operations environment with the purpose of improving work and service processes.
  • An IT organization can also perform an assessment to determine and better manage the risks associated with outsourcing.
  • an assessment can be performed to determine if the IT organization is capable of supporting a new application or technology.
  • Three phases are defined in the assessment model: planning the assessment, performing the assessment, and reporting the assessment results. All phases of the assessment are performed using a team-based approach. Team members include the OMM sponsor, the assessment team lead, assessment team members, and IT operations personnel.
  • a rating is a characterization of an IT organization's operations processes relative to a component of the OMM.
  • a base practice is an essential activity that an organization performs to achieve the purpose of a Function.
  • a base practice is described in specific terms; it is what an organization does.
  • a generic practice is an activity that contributes to the capability of managing and improving the effectiveness of the operations environment Functions in achieving their purposes through the base practices.
  • a Generic Practice is applicable to all Functions and contributes to the overall process management, measurement, and institutionalization capability of the operations environment.
  • Work products describe evidence of base practice implementation: for example, a completed change control request and/or a resolved trouble ticket.
  • Process attributes are features of a process that can be evaluated on a scale of achievement (performed, partially performed, not performed, etc.) which provide a measure of the capability of the process.
  • a category has a defined purpose and measurable goals and consists of a logically related set of Functions that collectively address the purpose and goals, in the same general area of activity.
  • Network Centric Environment For the purpose of the present invention, the term "network centric" (or, netcentric) should be construed to cover various means of reaching out to customers and partners with computing systems and knowledge over a communications backbone, such as an intranet, extranet, or internet connection. It is valuable to have an understanding of a netcentric environment since carrying out the method of providing OMM assessment within this environment may take special considerations.
  • Application logic is preferably packaged into components and distributed from a server to a client over a network connection between the client and server.
  • the client has standardized interfaces so that an application can execute with a client that can run on multiple operating systems and hardware platforms.
  • the application components of the preferred netcentric computing system enable the netcentric computing systems to be adaptable to a variety of distribution styles, from a "thin client" to a "fat client.”
  • Netcentric frameworks preferably support a style of computing where processes on different machines communicate using messages.
  • client processes delegate business functions or other tasks
  • Server processes respond to messages from clients.
  • Business logic can reside on both the client and server.
  • Clients are typically personal computers (PC's) or workstations with a graphical user interface running a web browser.
  • Servers are preferably implemented on UNIX, NT, or mainframe machines.
  • In netcentric computing systems, there is a tendency to move more business logic to the servers, although "fatter" clients result from new technologies such as Java and ActiveX.
  • technology, people, and processes may be distributed across global boundaries and business functions/systems may involve multiple organizations. This will generally add complexity to the required systems.
  • the "planning stage” includes the step of Defining OMM Capability Requirements 1520.
  • the "performing stage” includes the steps of Assessing Current Capabilities 1540 and Identifying and Resolving Gaps 1570.
  • the “reporting stage” includes the step of Developing OM Capability Blueprints 1580. In the following, the details of the tasks within each step are discussed.
  • Step 1520 - Defining OMM Capability Requirements. In this step, the assessment project parameters are defined, and the data-gathering phase of the work begins.
  • the stakeholders in the sponsoring organization are oriented on the benefits, limitations, and approach to be used in the assessment.
  • Figure 2 shows a representation of the tasks for carrying out these functions according to the presently preferred embodiment of the invention. These tasks include Defining Scope and Objectives 1521, Executing and Evaluating OMM Questionnaires 1523, Obtaining Work Products and Documentation 1525, and Creating and Conducting OMM Kickoff Presentations 1527.
  • the products of this step include a category & function assessment list, OMM questionnaire evaluation, operations documentation, and OMM kickoff presentation.
  • Task 1521 Defining Scope and Objectives
  • Task 1521 includes provision of scope and goals for the assessment that are agreeable to both the assessment team lead and the sponsoring organization's assessment coordinator. Key issues to be resolved include: what functions are to be analyzed by the project team; what OMM capability levels are to be considered in the analysis; what depth of analysis of Continuous Improvement (Cl) Initiatives is desired; what limitations are there on Cl Initiatives (for example, 10 functions may be included in the scope, but the organization may only want Cl recommendations for the 5 considered the most deficient); and what level of ratings is desired (function only vs. overall maturity).
  • Cl Continuous Improvement
  • An assessment plan is developed based on the goals identified by the sponsoring organization.
  • the plan consists of detailed schedules for the assessment and potential risks identified with performing the assessment.
  • Assessment team members, assessment participants, and areas to be assessed are selected.
  • Work products are identified for initial review, and the logistics for the on-site visit are identified and planned.
  • the assessment team members will preferably receive adequate training on the OMM framework and the assessment process to ensure that they will have the ability to interpret the data obtained.
  • the team will preferably have a comprehensive understanding of the assessment process, its underlying principles, the tasks necessary to execute it, and their role in performing the tasks.
  • the team will preferably fully understand the Rating Framework in order to provide an objective rating of the areas assessed.
  • the "planning stage” then progresses to the tasks of executing and evaluating OMM questionnaires 1523 and obtaining work products and documentation 1525.
  • Task 1523 Executing and Evaluating OMM Questionnaires Task 1523 includes distribution and interpretation of maturity questionnaires.
  • the maturity questionnaire is a set of questions about the operations environment that sample the base practices in each Function of the OMM. Maturity questionnaires exist for each Function of the OMM, and tie back to base practices, process attributes, and generic practices.
  • the questionnaires are used to obtain information on the capability of the IT organization or a specific IT area or project, and are distributed to OMM participants prior to the on-site visit.
  • Completed questionnaires provide the assessment team with an overview of the IT operations process capability of the IT organization. The responses assist the team in focusing their investigations, and provide direction for later activities such as interviews 1541 and document reviews 1542 and 1543.
  • Assessment team members prepare exploratory questions based on OMM Interview Guides and responses to the maturity questionnaires.
  • Task 1525 Obtaining Work Products and Documentation Task 1525 includes gathering evidence and documents relating to IT operations.
  • Assessment team members prepare exploratory questions based on responses to the maturity questionnaires and on OMM Interview Guides.
  • Interview Guides are a set of exploratory questions about the operations environment which are used during the interview process to obtain more detailed information on the capability of the IT organization.
  • the interview aids are used by the assessment team to guide them through interview sessions 1541 with assessment participants.
  • OMM participants will also receive a Work Product List.
  • assessment participants prepare documentation for the assessment team members to review. Documentation about the IT operations functions allows the assessment team to tie IT organization data to the OMM.
  • Task 1527 Creating and Conducting OMM Kickoff Presentations
  • Task 1527 includes the presentation of an inaugural meeting to begin the "performing stage". Once the information from the questionnaires 1523 and from work products and documentation 1525 has been analyzed, a Kickoff meeting is scheduled at the start of the on-site activities. The purpose of the meeting is to provide the participants with an overview of OMM and the assessment process, to set expectations, and to answer any questions about the process. The OMM sponsor of the assessment should participate in the presentation to show visible support and stress the importance of the assessment process to everyone involved.
  • Step 1540 Assessing Current Capabilities
  • In step 1540, current capabilities, including their strengths and weaknesses, are analyzed.
  • the practices ratings are used to develop profiles for each selected operations function. These tasks include Scheduling and Conducting Function Interviews 1541, Analyzing Work Products and Documentation 1542, Following Up to Solidify Data 1543, Categorizing Data by Function 1545, Rating Base Practices 1546, Rating Generic Practices 1547, Rating Process Attributes 1548, and Preparing Function Profiles 1549.
  • Task 1541 Scheduling and Conducting Function Interviews Task 1541 includes interviews of those involved in pertinent functions. Interviewing provides an opportunity to gain a deeper understanding of the activities performed, how the work is performed, and the processes currently in use. Interviewing provides the assessment team members with identifiable assessment indicators for each Function appraised. Interviewing also provides the opportunity to address all areas of OMM within the scope of the assessment.
  • IT operations managers and supervisors are interviewed as a group in order to understand their view of how the work is performed in the IT organization, any problem areas of which they are aware, and improvements that they feel need to be made. IT operations personnel are interviewed to collect data within the scope of the assessment and to identify areas that they can and should improve in the IT organization.
  • Task 1542 includes confirming and supplementing the information from the interviews with other sources.
  • Data for the OMM assessment are obtained from several sources: responses to the maturity questionnaires, interview sessions, and work products and document reviews 1525.
  • Task 1543 includes confirming the accuracy and relevance of the information obtained in the "planning stage” 1520, in the interviews 1541, and in the work products and documentation 1542.
  • the purpose of this activity is to summarize and consolidate information into a manageable set of findings.
  • Task 1545 Categorizing Data by Function
  • Task 1545 includes the organization of all relevant data as it pertains to specific IT functions.
  • the data is categorized into the appropriate areas of the OMM.
  • the assessment team will preferably reach consensus on the validity of the data and whether sufficient information in the areas evaluated has been collected. It is the team's responsibility to obtain sufficient information on the OMM components within the scope of the assessment for the required areas of the IT organization before any rating can be done.
  • follow-up interviews may occur for clarification.
  • Initial findings are generated from the information collected thus far and presented to the assessment participants.
  • the purpose of presenting initial findings is to obtain feedback from the individuals who provided information during the various interviews. Ratings are not considered until after the initial findings presentations, as the assessment team is still collecting data. Feedback is recorded for the team to consider at the conclusion of all of the initial findings presentations.
  • Tasks 1546 and 1547 Rating Base Practices and Generic Practices. Tasks 1546 and 1547 include the assignment of a rating to the base practices and generic practices. This is done by reviewing questionnaire responses and results, interview notes, documentation, and the Assessment Indicator Rating template.
  • the Assessment Indicators are objective attributes or characteristics of a practice or work product that support an assessor's judgment of the performance of an implemented process. The assessment team will use the scoring matrix guideline provided by the OMM framework.
  • Task 1548 includes the assignment of a rating to each Process Attribute.
  • process attributes are rated based on the existence of and compliance to base practices.
  • process attributes are also rated on compliance to generic practices.
  • Each process attribute will receive a rating of Not Achieved, Partially Achieved, Largely Achieved or Fully Achieved.
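A minimal sketch of this four-point rating follows. The patent gives no numeric cut-offs, so the percentage bands below are an assumption borrowed from the common ISO/IEC 15504 (SPICE) convention, not the patent's own scale.

```python
def rate_process_attribute(achievement_pct: float) -> str:
    """Map an achievement percentage to the four-point rating scale.

    The bands (>85, >50, >15) are assumed, following the ISO/IEC 15504
    convention; the patent itself does not specify numeric thresholds.
    """
    if achievement_pct > 85:
        return "Fully Achieved"      # F
    if achievement_pct > 50:
        return "Largely Achieved"    # L
    if achievement_pct > 15:
        return "Partially Achieved"  # P
    return "Not Achieved"            # N
```

These N/P/L/F values are the same ratings later rolled up into function profiles.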
  • the method used to rate base practices and generic practices is the Analytical Hierarchy Process (AHP) method.
  • AHP Analytical Hierarchy Process
  • AHP Analytical Hierarchy Process
  • AHP is a comprehensive, logical, and structural framework, which is used to improve the understanding of complex decisions by decomposing the problem in a hierarchical structure. The incorporation of all relevant decision criteria, and their pairwise comparison, allows the decision maker to determine the trade-offs among objectives.
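As a rough illustration of the pairwise-comparison idea, the sketch below derives priority weights from a comparison matrix using the geometric-mean approximation. The matrix values are made up, and the patent does not prescribe this particular computation; it is one standard way AHP weights are obtained.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise-comparison matrix
    using the geometric-mean (logarithmic least squares) method."""
    n = len(pairwise)
    geo = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(geo)
    return [g / total for g in geo]  # normalize so weights sum to 1

# Hypothetical example with three criteria; entry [i][j] states how much
# more important criterion i is than criterion j (reciprocal matrix).
m = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(m)  # largest weight goes to the first (most preferred) criterion
```

The resulting weights quantify the trade-offs among objectives that the pairwise comparisons express.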
  • Task 1549 Preparing Function Profiles. Task 1549 includes the assignment of a rating (N, P, L, F) to each Function.
  • AHP method is used to roll up the Category ratings into an organizational maturity level rating.
  • Step 1570 Identifying and Resolving Gaps
  • In step 1570, a gap-closure approach for specified functions is developed, including estimates of cost and timing.
  • Figure 4 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention.
  • the tasks include Determining Continuous Improvement (Cl) Activities 1571, Identifying Alternatives 1573, Estimating Costs of Improvements 1575, Assessing Timing Implications 1577, and Selecting Cl Initiatives to Start 1579.
  • the products of this step include Continuous Improvement (Cl) initiatives.
  • Task 1571 Determining Continuous Improvement (Cl) Activities Task 1571 includes identification of possible Cl Initiatives.
  • a Rating Tool generates a list of areas for improvement based on the activities needed to move to the next capability level. Identification of additional Cl Initiatives may or may not be included in the scope of an OMM Assessment project. If it is not, then this task package would be eliminated from the work plan.
  • Task 1573 includes identification of alternate Cl Initiatives. Cl alternatives should consider organization, performance, and process improvement opportunities in addition to software solutions that may be available. The effort put into analyzing, estimating, and documenting the choices should be controlled by the scope authorized by the sponsoring organization at the outset of the project. Some organizations may only wish to know what is available, others may be ready to begin the improvement process immediately and want cost estimates, work plans, staffing requirements and time schedules for priority initiatives.
  • the provider or manager then estimates costs of the alternatives 1575 and any impact that the alternatives and choices might have on the organization's functioning or the timing of the implementation 1577.
  • the organization selects the continuous improvement activities it wishes to pursue 1579. This concludes the "performing stage" of the project.
  • In the "reporting stage”, the OMM Capability Blueprints are constructed by function 1580.
  • Step 1580 Defining OMM Capability Blueprints
  • In step 1580, the final project documentation is prepared, a management report is delivered, and the stage is set for subsequent delivery efforts.
  • Figure 5 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention.
  • the tasks include Prioritizing Cl Initiatives 1581, Developing OMM Capability Delivery Approach and Cl Plans 1583, Preparing and Presenting Final Results Documents and Presentation 1587, and Modifying Delivery Plans as Needed 1589.
  • the products of this step include OMM assessment report and final results presentation.
  • the selected continuous improvement activities are prioritized 1581 and an approach is developed for these activities 1583.
  • a report or presentation is prepared and presented to the organization's management 1587, and any modifications desired are made 1589 before deployment.
  • the final assessment results are presented to the OMM sponsor 1587.
  • the sponsor owns the assessment results and is free to use them as he or she sees fit.
  • the assessment team preferably ensures that the IT organization understands the issues that were discovered during the assessment and the key issues that it faces. Operational strengths are presented to validate what the IT organization is doing well. Strengths and weaknesses are presented for each area within the assessment scope as well as any non-OMM issues that affect process. A profile is presented showing the ratings for each specific area assessed.
  • An executive overview session is held in order to allow the senior IT Operations manager to clarify any issues with the assessment team, to confirm his or her understanding of the operations process issues, and to gain full understanding of the recommendations report.
  • the assessment team collects feedback from the assessment participants and the assessment team on the process, and packages information that needs to be saved for historical purposes. Any modifications desired are made 1589 before deployment.
  • the present invention also includes a method and apparatus for providing an estimate for building the operations management maturity assessment function for an information technology organization.
  • the method and apparatus generate a preliminary work estimate (time by task) and financial estimate (dollars by classification) based on input of a set of estimating factors that identify the scope and difficulty of key aspects to the function.
  • Figure 6 is a flow chart of one embodiment of a method for providing an estimate of the time and cost to build a business recovery planning function in an information technology organization.
  • a provider of a business recovery planning function such as an IT consultant, for example Andersen Consulting, obtains estimating factors from the client 202. This is a combined effort, with the provider adding expertise and knowledge to help in determining the quantity and difficulty of each factor.
  • Estimating factors represent key business drivers for a given OM function. Table 1 lists and defines the factors to be considered along with examples of a quantity and difficulty rating for each factor.
  • the provider determines an estimating factor 202, such as the number of Cl Initiatives, with the help of the client.
  • Next, the difficulty rating 204 is determined. Each of these determinations depends on the previous experience of the consultant. A provider or consultant with a high level of experience will have a greater likelihood of determining the correct number and difficulty ratings.
  • the number and difficulty ratings are input into a computer program.
  • the computer program is a spreadsheet, such as EXCEL, by Microsoft Corp. of Redmond, Washington, USA.
  • the consultant and the client will continue to determine the number and difficulty ratings for each of the remaining estimating factors 206.
  • this information is transferred to an assumption sheet 208, and the assumptions for each factor are defined.
  • the assumption sheet 208 allows the consultant to enter comments relating to each estimating factor and to document the underlying reasoning for a specific estimating factor.
  • an estimating worksheet is generated and reviewed 210 by the consultant, client, or both.
  • An example of a worksheet is shown in Figures 7a-b.
  • the default estimates of the time required for each task will populate the worksheet, with time estimates based on the number and difficulty ratings previously assigned to the estimating factors that correspond to each task.
  • the amount of time per task is based on a predetermined time per unit required for the estimating factor multiplied by a factor corresponding to the level of difficulty.
  • Each task listed on the worksheet is described above in connection with details of the method for providing the business recovery planning function.
  • the same numbers in the description of the method above correspond to the same steps, tasks, and task packages of activities shown on the worksheet of Figures 7a-b.
  • the worksheet is reviewed 210 by the provider and the client for accuracy.
  • Adjustments can be made to task level estimates by either returning to the factors sheet and adjusting the units 212 or by entering an override estimate in the 'Used' column 214 on the worksheet. This override may be used when the estimating factor produces a task estimate that is not appropriate for the task, for example, when a task is not required on a particular project.
  • the provider and the client review and adjust, if necessary, the personnel staffing factor allocations 216 for the seniority levels of personnel needed for the project. Referring to Figures 7a-b, these columns are designated as Partner - "Ptnr", Manager - "Mgr", Consultant - "Cnslt", and Analyst - "Anlst", respectively. These allocations are adjusted to meet project requirements and are typically based on experience with delivering various stages of a project. It should be noted that the staffing factors should add up to 1.
  • the consultant or provider and the client then review the work plan 218, and may optionally include labor to be provided by the client.
  • the work plan contains the total time required in days per stage and in days per task required to complete the project. Tasks may be aggregated into a "task package" of subtasks or activities for convenience.
  • a worksheet, as shown in Figures 7a-b, may also be used for convenience. This worksheet may be used to adjust tasks or times as desired, from the experience of the provider, the customer, or both.
  • a financial estimate is generated in which the provider and client enter the agreed upon billing rates for Ptnr, Mgr, Cnslt, and Anlst 220.
  • the total estimated payroll cost for the project will then be computed and displayed, generating final estimates.
  • a determination of out-of-pocket expenses 222 may then be applied to the final estimates to determine a final project cost 224.
  • the provider will review the final estimates with an internal functional expert 226.
  • project management costs for managing the provider's work are included in the estimator. These are task dependent and usually run between 10% and 15% of the tasks being managed, depending on the level of difficulty. These management allocations may appear on the worksheet and work plan.
  • the time allocations for planning and managing a project are typically broken down for each of a plurality of task packages where the task packages are planning project execution 920, organizing project resources 940, controlling project work 960, and completing project 990, as shown in Figure 6.
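The estimating arithmetic described in the items above (time per task from units, unit time, and difficulty; staffing factors that sum to 1; billing rates per seniority level; and a 10-15% management overhead) can be sketched as follows. This is an illustrative model only, not the spreadsheet of Figures 7a-b; all numeric values, unit times, multipliers, and rates are hypothetical assumptions.

```python
# Illustrative sketch of the estimating arithmetic described above: time per
# task = predetermined time per unit x number of units x a difficulty factor,
# staffing factors that sum to 1, agreed billing rates per seniority level,
# and a 10-15% project-management overhead. All numeric values below are
# hypothetical examples, not figures from the worksheet of Figures 7a-b.

DIFFICULTY = {"low": 1.0, "medium": 1.5, "high": 2.0}  # hypothetical multipliers

def task_days(units, days_per_unit, difficulty):
    """Time per task: units x predetermined days per unit x difficulty factor."""
    return units * days_per_unit * DIFFICULTY[difficulty]

def task_cost(days, staffing, rates):
    """Allocate a task's days across seniority levels and price at billing rates."""
    assert abs(sum(staffing.values()) - 1.0) < 1e-9, "staffing factors must sum to 1"
    return sum(days * share * rates[level] for level, share in staffing.items())

# Example: an estimating factor with 4 units at 0.5 days per unit, medium difficulty.
days = task_days(units=4, days_per_unit=0.5, difficulty="medium")
days *= 1.125  # project-management overhead, within the 10-15% range noted above

# Staffing allocations (summing to 1) and billing rates in dollars per day.
staffing = {"Ptnr": 0.05, "Mgr": 0.15, "Cnslt": 0.50, "Anlst": 0.30}
rates = {"Ptnr": 4000, "Mgr": 2400, "Cnslt": 1500, "Anlst": 900}
cost = task_cost(days, staffing, rates)
```

Out-of-pocket expenses 222 would then be added on top of the computed payroll cost to reach a final project cost 224.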

Abstract

A method of providing an OMM assessment function of an information technology organization includes defining capability requirements (1520), assessing current capabilities (1540), identifying and resolving gaps (1570) and developing operations maturity blueprints (1580) for the OMM assessment function. A method to estimate the time and cost for providing the OMM assessment function includes obtaining estimating factors (202), obtaining difficulty ratings (204), generating time allocations (216) and generating costs (224).

Description

METHOD AND ESTIMATOR FOR PROVIDING OPERATIONS MATURITY MODEL ASSESSMENT
RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Application 60/158,259, filed October 6, 1999. This application is related to Application Serial No. , entitled "Organization Of Information Technology Functions", by Dove et al. (Attorney docket No. 10022/45), filed herewith. These applications are incorporated herein by reference in their entirety.
BACKGROUND OF THE INVENTION
The biggest challenges in Information Technology ("IT") development today are actually not in the technologies, but in the management of those technologies in a complex business environment. From idea conception to capability delivery and to operation, all IT activities, including strategy development, planning, administration, coordination of project requests, change administration, and managing demand for discretionary and non-discretionary activities and operations, must be collectively managed. A shared understanding and representation of IT management is needed because today's technological and business environment demands it. The new technological management orientation should include ways for planning, assessing, and deploying technology within and across enterprises.
Businesses need to balance technological capability with enterprise capability in order to become, or stay, a modern organization that has a chance of survival.
There is a need, therefore, to construct a complete yet simple IT framework that would quickly convey the entire scope of IT capability in a functional decomposition. Such an IT framework has to be a single framework for describing such IT management. The IT framework should be a framework of all functions: a representation of a complete checklist of all relevant activities performed in an IT enterprise. A single IT Framework should represent all functions operative in an IT enterprise. Within that IT Framework, there is also a need for an Operations Maturity Model (OMM) assessment to appraise an organization's IT operations environment process capability. By marketing current IT service offerings, increasing customer satisfaction, and building stronger customer relationships, the IT organization can better serve its business customers.
An operations maturity model assessment capability becomes critical to the IT organization as competition to provide IT services begins to increase from outsourcers. OMM assessment is a key function of an operations maturity model. Therefore, to meet this competition, there is a need for improved methods for providing operations maturity model assessment and for an estimator for doing so.
BRIEF SUMMARY OF THE INVENTION
The present invention is defined by the following claims, and nothing in this section should be taken as a limitation on those claims. By way of introduction, one embodiment of the invention is a method for providing operations maturity model (OMM) assessment that includes planning, performing, and reporting an OMM assessment function for an IT organization.
In one aspect of the preferred embodiment, the providing includes defining capability requirements, assessing current capabilities, identifying and resolving gaps, and developing OM capability blueprints for the OMM assessment function.
In another aspect of the preferred embodiment, the defining step may include defining scope and objectives; executing and evaluating OMM questionnaires; obtaining work products and documentation; and creating and conducting OMM kickoff presentations.
In another aspect of the preferred embodiment, the assessing step may include scheduling and conducting function interviews; analyzing work products and documentation; following up to solidify data; categorizing data by function; rating base practices; rating generic practices; consolidating data; and preparing function profiles.

In another aspect of the preferred embodiment, the identifying and resolving step may include determining continuous improvement initiatives; identifying alternatives; estimating costs of improvements; assessing timing implications; and selecting continuous improvement initiatives to start.

In another aspect of the preferred embodiment, the developing step may include prioritizing continuous improvement initiatives; developing a capability delivery approach; preparing and presenting final results documentation and presentation; and modifying delivery plans as needed.

Another aspect of the present invention is a method for providing an estimate for building an OMM assessment function in an information technology organization. This aspect of the present invention allows an IT consultant to give on-site estimates to a client within minutes. The estimator produces a detailed breakdown of the cost and time to complete a project by displaying the costs and time corresponding to each stage of a project along with each task.

Another aspect of the present invention is a computer system for allocating time and computing cost for building an OMM assessment function in an information technology system.
These and other features and advantages of the invention will become apparent upon review of the following detailed description of the presently preferred embodiments of the invention, taken in conjunction with the appended drawings.
BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
The present invention is illustrated by way of example and not limitation in the accompanying figures. In the figures, like reference numbers indicate identical or functionally similar elements.
Figure 1 shows a representation of the steps in a method for providing an operations maturity model (OMM) assessment system according to the presently preferred embodiment of the invention.
Figure 2 shows a representation of the tasks for defining OMM capability requirements for the method represented in Figure 1. Figure 3 shows a representation of the tasks for assessing current capabilities for the method represented in Figure 1.
Figure 4 shows a representation of the tasks for identifying and resolving gaps for the method represented in Figure 1. Figure 5 shows a representation of the tasks for developing OM capability blueprints for the method represented in Figure 1.
Figure 6 shows a flow chart for obtaining an estimate of cost and time allocation for a project.
Figures 7a through 7b show one embodiment of an estimating worksheet for an OM business recovery planning estimating guide.
DETAILED DESCRIPTION OF THE INVENTION
For the purposes of this invention, an information technology ("IT") enterprise may be considered to be a business organization, charitable organization, government organization, etc. that uses an information technology system with or to support its activities. An IT organization is the group and associated systems and processes within the enterprise that are responsible for the management and delivery of information technology services to users in the enterprise. In a modern IT enterprise, multiple functions may be organized and categorized to provide comprehensive service to the user. Thereby, an information technology framework for understanding the interrelationships of the various functionalities, and for managing the complex IT organization is provided.
The various operations management functionalities within the IT framework include a customer service management function; a service integration function; a service delivery function; a capability development function; a change administration function; a strategy, architecture, and planning function; a management and administration function; a human performance management function; and a governance and strategic relationships function. The complexity of the business environment demands that a company have a formal way of assessing its IT capabilities, as well as a specific and measurable path for improving those capabilities.

Management and administration provides the framework for effectively managing information technology enterprises using sound business principles and practices. Management and administration manages functions that are not always unique to the information technology portion of the enterprise; therefore, these functions are often performed outside of the information technology group. In some enterprises, some of these non-technology management and administrative functions are performed by separate departments within the larger enterprise (i.e., a separate finance organization), while in other enterprises (especially in outsourcing arrangements and very large information technology enterprises), the information technology enterprise itself may be partially or fully responsible for these functions. The functions within management and administration include financial administration, quality administration, asset management, vendor management, facilities, regulatory compliance, and communications.

Quality management is a function within management and administration. Quality management monitors, across the enterprise, how well the IT environment is being managed and works towards continual improvement of IT capabilities and services. Quality management ensures that quality is put into every aspect of IT throughout the enterprise.
Functions within quality management include tasks for the quality plan, quality metrics, external benchmarking, quality assurance review, and continuous improvement planning.
In a company-wide initiative to address these capabilities, Andersen Consulting developed and used the Management of the Distributed Environment (MODE) framework and its gap analysis to capture the best practices of IT management and to determine areas of improvement. MODE is a framework for identifying the tools and procedures required to manage a distributed environment. More recently, Andersen Consulting has taken a broader view of the IT industry by incorporating MODE into the IT Framework. While the IT Framework and the gap analysis are intended to capture weaknesses in processes that are observable, they do not provide data with sufficient granularity upon which a comprehensive improvement plan can be built. The Operations Maturity Model (OMM) is intended to add further objectivity and consistency to the gap analysis by increasing the requirements of data capturing and data analysis. This added formalism will make the gap analysis conceptually similar to capability assessment approaches, such as the Software Engineering Institute's (SEI) software CMM, or the International Organization for Standardization and the International Electrotechnical Commission's SPICE models. The present invention includes a method for providing an OMM assessment function and an estimator useful for determining the time and cost to provide such a function. Before describing the method for providing OMM assessment, some related terms are first described as follows:
Operations Maturity Model (OMM):
The OMM provides the basis for IT organizations to gauge performance, and will assist in planning and tracking improvements to the IT operations environment. Operations maturity is the extent to which the organization's processes are explicitly defined, managed, measured, controlled, and effective, and the consistency with which they are applied throughout the operations environment. The operations environment dimension is characterized by a set of processes. Each process has a measurable purpose statement, which describes what has to be achieved in order to attain the defined purpose of the process. The operations environment is partitioned into three elements: Process Categories, Functions, and Base Practices.
The framework provides a basis for defining an objective improvement strategy in line with an IT organization's needs, priorities, and resource availability. The OMM further provides a method for determining the overall operations maturity of an IT organization based on the quality and institutionalization of its processes. The OMM can thus be used by IT organizations in a variety of contexts. An IT organization can use the model to assess and improve its own processes. An IT organization can also use the model to assess the capability of suppliers in meeting their commitments, and hence better manage the risk associated with outsourcing and sub-contract management. In addition, the model can be used to focus on an entire IT organization, on a single functional area such as service management, or on a more focused area such as problem management.
Assessment Process:
The assessment process is used to appraise an organization's IT operations environment process capability. The objective of the assessment is to identify the differences and the gaps between the actual implementations of the processes in the assessed IT operations organization with respect to the OMM. Defining a reference model ensures that results of assessments can be reported in a common context and provides the basis on which comparisons can be based.
An IT organization can perform an assessment for a variety of reasons. An assessment can be performed in order to assess the processes in the operations environment with the purpose of improving work and service processes. An IT organization can also perform an assessment to determine and better manage the risks associated with outsourcing. In addition, an assessment can be performed to determine if the IT organization is capable of supporting a new application or technology. Three phases are defined in the assessment model: planning the assessment, performing the assessment, and reporting the assessment results. All phases of the assessment are performed using a team-based approach. Team members include the OMM sponsor, the assessment team lead, assessment team members, and IT operations personnel.
Rating Framework:
A rating is a characterization of an IT organization's operations processes relative to a component of the OMM.
Base Practices:
A base practice is an essential activity that an organization performs to achieve the purpose of a Function. A base practice is described in specific terms; it is what an organization does.
Generic Practices:
A generic practice is an activity that contributes to the capability of managing and improving the effectiveness of the operations environment Functions in achieving their purposes through the base practices. A Generic Practice is applicable to all Functions and contributes to the overall process management, measurement, and institutionalization capability of the Functions.
Work Products:
Work products describe evidence of base practice implementation: for example, a completed change control request and/or a resolved trouble ticket.
Process Attributes:
Process attributes are features of a process that can be evaluated on a scale of achievement (performed, partially performed, not performed, etc.) which provide a measure of the capability of the process.
Category:
A category has a defined purpose and measurable goals, and consists of a logically related set of Functions that collectively address the purpose and goals in the same general area of activity.
Network Centric Environment:
For the purpose of the present invention, the term "network centric" (or, netcentric) should be construed to cover various means of reaching out to customers and partners with computing systems and knowledge over a communications backbone, such as an intranet, extranet, or internet connection. It is valuable to have an understanding of a netcentric environment, since carrying out the method of providing OMM assessment within this environment may take special considerations.
To define netcentric properly, it is helpful to have a general understanding of a framework that describes the types of applications required in a netcentric computing system. Application logic is preferably packaged into components and distributed from a server to a client over a network connection between the client and server. The client has standardized interfaces so that an application can execute with a client that can run on multiple operating systems and hardware platforms. Further, the application components of the preferred netcentric computing system enable the netcentric computing systems to be adaptable to a variety of distribution styles, from a "thin client" to a "fat client."
Netcentric frameworks preferably support a style of computing where processes on different machines communicate using messages. In this style of computing, "client" processes delegate business functions or other tasks
(such as data manipulation logic) to one or more server processes. Server processes respond to messages from clients. Business logic can reside on both the client and the server. Clients are typically personal computers (PCs) or workstations with a graphical user interface running a web browser. Servers are preferably implemented on UNIX, NT, or mainframe machines. In netcentric computing systems, there is a preferred tendency to move more business logic to the servers, although "fatter" clients result from new technologies such as Java and ActiveX. In a netcentric environment, technology, people, and processes may be distributed across global boundaries, and business functions/systems may involve multiple organizations. This will generally add complexity to the required systems.
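The message-based client/server style described above — a client process delegating a business function to a server process via a message, and the server responding — can be illustrated with a minimal sketch. The message schema and handler names here are hypothetical illustrations, not part of any particular netcentric product.

```python
# Minimal sketch of message-based client/server delegation, as described above.
# The message schema and the "uppercase" handler are hypothetical examples.
import json

# Server side: business logic registered as named message handlers.
HANDLERS = {
    "uppercase": lambda payload: payload.upper(),  # stand-in for data-manipulation logic
}

def server_process(message: str) -> str:
    """Respond to a client's message by dispatching to the named handler."""
    request = json.loads(message)
    result = HANDLERS[request["task"]](request["payload"])
    return json.dumps({"status": "ok", "result": result})

def client_process(task: str, payload: str) -> str:
    """Client delegates a business function to the server via a message."""
    reply = server_process(json.dumps({"task": task, "payload": payload}))
    return json.loads(reply)["result"]
```

In a real deployment the two processes would communicate over a network connection rather than a direct function call; the sketch only shows the delegation pattern.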
As shown in Figure 1, four steps combine to provide the OMM assessment. It may be helpful to consider the steps as being grouped into three stages. The "planning stage" includes the step of Defining OMM
Capability Requirements 1520. The "performing stage" includes the steps of Assessing Current Capabilities 1540 and Identifying and Resolving Gaps 1570. The "reporting stage" includes the step of Developing OM Capability Blueprints 1580. In the following, the details of the tasks within each step are discussed.
Step 1520 - Defining OMM Capability Requirements
In step 1520, the assessment project parameters are defined, and the data gathering phase of the work is begun. The stakeholders in the sponsoring organization are oriented on the benefits, limitations, and approach to be used in the assessment. Figure 2 shows a representation of the tasks for carrying out these functions according to the presently preferred embodiment of the invention. These tasks include Defining Scope and Objectives 1521, Executing and Evaluating OMM Questionnaires 1523, Obtaining Work Products and Documentation 1525, and Creating and Conducting OMM Kickoff Presentations 1527. The products of this step include a category & function assessment list, OMM questionnaire evaluation, operations documentation, and OMM kickoff presentation.
Task 1521: Defining Scope and Objectives
Task 1521 includes provision of scope and goals for the assessment that are agreeable to both the assessment team lead and the sponsoring organization's assessment coordinator. Key issues to be resolved include: what functions are to be analyzed by the project team; what OMM capability levels are to be considered in the analysis; what depth of analysis of Continuous Improvement (CI) Initiatives is desired; what limitations there are on CI Initiatives (for example, 10 functions may be included in the scope, but the organization may only want CI recommendations for the 5 considered the most deficient); and what level of ratings is desired (function only vs. overall maturity). Once agreement is reached, the assessment team lead ensures that the IT operations functions selected are sufficient to meet the purpose and will provide output that is representative of the assessment scope.
An assessment plan is developed based on the goals identified by the sponsoring organization. The plan consists of detailed schedules for the assessment and potential risks identified with performing the assessment. Assessment team members, assessment participants, and areas to be assessed are selected. Work products are identified for initial review, and the logistics for the on-site visit are identified and planned. The assessment team members will preferably receive adequate training on the OMM framework and the assessment process to ensure that they will have the ability to interpret the data obtained. The team will preferably have a comprehensive understanding of the assessment process, its underlying principles, the tasks necessary to execute it, and their role in performing the tasks. In addition, the team will preferably fully understand the Rating Framework in order to provide an objective rating of the areas assessed. The "planning stage" then progresses to the tasks of executing and evaluating OMM questionnaires 1523 and obtaining work products and documentation 1525.
Task 1523: Executing and Evaluating OMM Questionnaires
Task 1523 includes distribution and interpretation of maturity questionnaires. The maturity questionnaire is a set of questions about the operations environment that sample the base practices in each Function of the OMM. Maturity questionnaires exist for each Function of the OMM, and tie back to base practices, process attributes, and generic practices. The questionnaires are used to obtain information on the capability of the IT organization or a specific IT area or project, and are distributed to OMM participants prior to the on-site visit. Completed questionnaires provide the assessment team with an overview of the IT operations process capability of the IT organization. The responses assist the team in focusing their investigations, and provide direction for later activities such as interviews 1541 and document reviews 1542 and 1543. Assessment team members prepare exploratory questions based on OMM Interview Guides and responses to the maturity questionnaires.
Task 1525: Obtaining Work Products and Documentation
Task 1525 includes gathering evidence and documents relating to IT operations. Assessment team members prepare exploratory questions based on responses to the maturity questionnaires and on OMM Interview Guides. Interview Guides are a set of exploratory questions about the operations environment which are used during the interview process to obtain more detailed information on the capability of the IT organization. The interview aids are used by the assessment team to guide them through interview sessions 1541 with assessment participants. OMM participants will also receive a Work Product List. In response to the list of work products, assessment participants prepare documentation for the assessment team members to review. Documentation about the IT operations functions allows the assessment team to tie IT organization data to the OMM.
Task 1527: Creating and Conducting OMM Kickoff Presentations
Task 1527 includes the presentation of an inaugural meeting to begin the "performing stage". Once the information from the questionnaires 1523 and from work products and documentation 1525 has been analyzed, a Kickoff meeting is scheduled at the start of the on-site activities. The purpose of the meeting is to provide the participants with an overview of OMM and the assessment process, to set expectations, and to answer any questions about the process. The OMM sponsor of the assessment should participate in the presentation to show visible support and stress the importance of the assessment process to everyone involved.
In the "performing stage" of the assessment process, current capabilities are assessed 1540, and gaps are identified and resolved 1570.
Step 1540 - Assessing Current Capabilities
In step 1540, current capabilities, including their strengths and weaknesses, are analyzed. The practice ratings are used to develop profiles for each selected operations function. These tasks include Scheduling and Conducting Function Interviews 1541, Analyzing Work Products and Documentation 1542, Following Up to Solidify Data 1543, Categorizing Data by Function 1545, Rating Base Practices 1546, Rating Generic Practices 1547, Consolidating Data 1548, and Preparing Function Profiles 1549. The products of this step include function profiles and function documentation.
Task 1541: Scheduling and Conducting Function Interviews
Task 1541 includes interviews of those involved in pertinent functions. Interviewing provides an opportunity to gain a deeper understanding of the activities performed, how the work is performed, and the processes currently in use. Interviewing provides the assessment team members with identifiable assessment indicators for each Function appraised. Interviewing also provides the opportunity to address all areas of OMM within the scope of the assessment.
Interviews are scheduled with IT operations managers, supervisors, and operations personnel. IT operations managers and supervisors are interviewed as a group in order to understand their view of how the work is performed in the IT organization, any problem areas of which they are aware, and improvements that they feel need to be made. IT operations personnel are interviewed to collect data within the scope of the assessment and to identify areas that they can and should improve in the IT organization.
Task 1542: Analyzing Work Products and Documentation
Task 1542 includes confirming and supplementing the information from the interviews with other sources. Data for the OMM assessment are obtained from several sources: responses to the maturity questionnaires, interview sessions, and work products and document reviews 1525.
Documents and work products are reviewed in order to verify compliance with process performance.
Task 1543: Following Up to Solidify Data
Task 1543 includes confirming the accuracy and relevance of the information obtained in the "planning stage" 1520, in the interviews 1541, and in the work products and documentation 1542. The purpose of this activity is to summarize and consolidate information into a manageable set of findings.
Task 1545: Categorizing Data by Function
Task 1545 includes the organization of all relevant data as it pertains to specific IT functions. The data is categorized into the appropriate areas of the OMM. The assessment team will preferably reach consensus on the validity of the data and whether sufficient information in the areas evaluated has been collected. It is the team's responsibility to obtain sufficient information on the OMM components within the scope of the assessment for the required areas of the IT organization before any rating can be done. Follow-up interviews may occur for clarification.
Initial findings are generated from the information collected thus far and presented to the assessment participants. The purpose of presenting initial findings is to obtain feedback from the individuals who provided information during the various interviews. Ratings are not considered until after the initial findings presentations, as the assessment team is still collecting data. Feedback is recorded for the team to consider at the conclusion of all of the initial findings presentations.
Tasks 1546 and 1547: Rating Base Practices and Generic Practices
Tasks 1546 and 1547 include the assignment of a rating to the generic practices and base practices. This is done by reviewing questionnaire responses, results, interview notes, documentation, and the Assessment Indicator Rating template. The Assessment Indicators are objective attributes or characteristics of a practice or work product that support an assessor's judgment of performance of an implemented process. The assessment team will use the scoring matrix guideline provided by the OMM framework.
Task 1548: Consolidating Data
Task 1548 includes the assignment of a rating to each Process Attribute. For Level 1, process attributes are rated based on the existence of and compliance to base practices. For Level 2 and higher, process attributes are rated based on compliance to generic practices. Each process attribute will receive a rating of Not Achieved, Partially Achieved, Largely Achieved, or Fully Achieved. The method used to rate base practices and generic practices is the Analytic Hierarchy Process (AHP) method. The analytic hierarchy process is a comprehensive, logical, and structured framework, which is used to improve the understanding of complex decisions by decomposing the problem into a hierarchical structure. The incorporation of all relevant decision criteria, and their pairwise comparison, allows the decision maker to determine the trade-offs among objectives.
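The AHP roll-up described above can be sketched with a minimal priority calculation. This is a generic illustration of AHP's pairwise-comparison step (priorities taken as the normalized geometric means of the comparison-matrix rows), not the scoring matrix guideline of the OMM framework; the matrix values and the four-point rating thresholds are hypothetical assumptions.

```python
# Illustrative AHP priority calculation: derive relative weights from a
# pairwise comparison matrix using the geometric-mean approximation to the
# principal eigenvector. The comparison values and the rating thresholds
# below are hypothetical, not the OMM framework's scoring matrix.
import math

def ahp_priorities(matrix):
    """Normalized geometric means of the rows of a pairwise comparison matrix."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Example: three process attributes compared pairwise for relative importance;
# matrix[i][j] > 1 means attribute i is judged more important than attribute j.
comparisons = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_priorities(comparisons)

def achievement_rating(score):
    """Map a weighted achievement score in [0, 1] onto the four-point scale
    used above (thresholds hypothetical, in the style of SPICE-like models)."""
    if score > 0.85:
        return "Fully Achieved"
    if score > 0.50:
        return "Largely Achieved"
    if score > 0.15:
        return "Partially Achieved"
    return "Not Achieved"
```

The same weighting step can be reused at each level of the hierarchy, which is how Function ratings roll up into Category ratings and, optionally, an organizational maturity rating.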
Task 1549: Preparing Function Profiles
Task 1549 includes the assignment of a rating (N, P, L, F) to each Category based on the capability ratings of the Functions within the Category, using the AHP method. This is done once process attributes have been rated. Assignment of a maturity level rating is optional at the discretion of the sponsoring organization. If the sponsoring organization would like an organizational rating, the AHP method is used to roll up the Category ratings into an organizational maturity level rating.
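The AHP roll-up described in this step can be sketched in code. This is a minimal illustration, assuming a common row-geometric-mean approximation of the AHP priority vector; the pairwise judgments, numeric rating values, and thresholds below are made-up assumptions, and the OMM framework's actual scoring matrix is not reproduced here.

```python
from math import prod

# Numeric values for the N/P/L/F achievement ratings and the roll-up
# thresholds below are assumptions for illustration only.
RATING_VALUES = {"N": 0.0, "P": 0.33, "L": 0.66, "F": 1.0}

def ahp_weights(pairwise):
    """Approximate AHP priority weights as normalized row geometric means
    of the pairwise comparison matrix."""
    n = len(pairwise)
    geo = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

def roll_up(ratings, weights):
    """Combine Function ratings into a Category rating (N, P, L, or F)."""
    score = sum(RATING_VALUES[r] * w for r, w in zip(ratings, weights))
    for letter, threshold in (("F", 0.85), ("L", 0.5), ("P", 0.15)):
        if score >= threshold:
            return letter
    return "N"

# Three Functions in a Category; the second is judged roughly three times
# as important as the first and four times as important as the third.
pairwise = [
    [1.0, 1 / 3, 2.0],
    [3.0, 1.0, 4.0],
    [1 / 2, 1 / 4, 1.0],
]
w = ahp_weights(pairwise)
category_rating = roll_up(["L", "F", "P"], w)  # the heavily weighted "F" pulls the Category up
```

The same roll-up can then be applied one level higher, combining Category ratings into an organizational maturity level rating as the text describes.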
Step 1570 - Identifying and Resolving Gaps
In step 1570, a gap closure approach for specified functions is developed, including estimates of cost and timing. Figure 4 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. The tasks include Determining Continuous Improvement (CI) Activities 1571, Identifying Alternatives 1573, Estimating Costs of Improvements 1575, Assessing Timing Implications 1577, and Selecting CI Initiatives to Start 1579. The products of this step include Continuous Improvement (CI) initiatives.
Task 1571: Determining Continuous Improvement (CI) Activities
Task 1571 includes identification of possible CI Initiatives. A Rating Tool generates a list of areas for improvement based on the activities needed to move to the next capability level. Identification of additional CI Initiatives may or may not be included in the scope of an OMM Assessment project. If it is not, then this task package would be eliminated from the work plan.
Task 1573: Identifying Alternatives
Task 1573 includes identification of alternate CI Initiatives. CI alternatives should consider organization, performance, and process improvement opportunities in addition to software solutions that may be available. The effort put into analyzing, estimating, and documenting the choices should be controlled by the scope authorized by the sponsoring organization at the outset of the project. Some organizations may only wish to know what is available; others may be ready to begin the improvement process immediately and want cost estimates, work plans, staffing requirements, and time schedules for priority initiatives.
The provider or manager then estimates costs of the alternatives 1575 and any impact that the alternatives and choices might have on the organization's functioning or the timing of the implementation 1577. The organization then selects the continuous improvement activities it wishes to pursue 1579. This concludes the "performing stage" of the project. In the fourth stage, the "reporting stage", the OMM Capability Blueprints are constructed by function 1580.
Step 1580 - Defining OMM Capability Blueprints
In step 1580, the final project documentation is prepared, a management report is delivered, and the stage is set for subsequent delivery efforts. Figure 5 shows a representation of the tasks for carrying out these functions, according to the presently preferred embodiment of the invention. The tasks include Prioritizing CI Initiatives 1581, Developing OMM Capability Delivery Approach and CI Plans 1583, Preparing and Presenting Final Results Documents and Presentation 1587, and Modifying Delivery Plans as Needed 1589. The products of this step include the OMM assessment report and final results presentation. The selected continuous improvement activities are prioritized 1581 and an approach is developed for these activities 1583. A report or presentation is prepared and presented to the organization's management 1587, and any modifications desired are made 1589 before deployment.
The final assessment results are presented to the OMM sponsor 1587. The sponsor owns the assessment results and is free to use them as he or she sees fit. During the final presentation, the assessment team preferably ensures that the IT organization understands the issues that were discovered during the assessment and the key issues that it faces. Operational strengths are presented to validate what the IT organization is doing well. Strengths and weaknesses are presented for each area within the assessment scope as well as any non-OMM issues that affect process. A profile is presented showing the ratings for each specific area assessed.
An executive overview session is held in order to allow the senior IT Operations manager to clarify any issues with the assessment team, to confirm his or her understanding of the operations process issues, and to gain a full understanding of the recommendations report. When the assessment has been completed and findings have been presented, the assessment team collects feedback on the process from the assessment participants and the assessment team, and packages information that needs to be saved for historical purposes. Any modifications desired are made 1589 before deployment.
In addition to the method for providing the operations management maturity assessment function, the present invention also includes a method and apparatus for providing an estimate for building the operations management maturity assessment function for an information technology organization. The method and apparatus generate a preliminary work estimate (time by task) and financial estimate (dollars by classification) based on input of a set of estimating factors that identify the scope and difficulty of key aspects to the function.
Previous estimators only gave bottom-line cost figures and were directed to business rather than OM functions. It could take days or weeks before an IT consultant produced these figures for the client. If the project resulted in a total cost either above or below the projected estimate, there was no way of telling who or what was responsible for the discrepancy. Therefore, a need exists for an improved estimator. Figure 6 is a flow chart of one embodiment of a method for providing an estimate of the time and cost to build an OMM assessment function in an information technology organization. In Figure 6, a provider of an OMM assessment function, such as an IT consultant, for example Andersen Consulting, obtains estimating factors from the client 202. This is a combined effort, with the provider adding expertise and knowledge to help in determining the quantity and difficulty of each factor. Estimating factors represent key business drivers for a given OM function. Table 1 lists and defines the factors to be considered along with examples of a quantity and difficulty rating for each factor.
As an illustration of a preferred embodiment of the invention, the provider determines an estimating factor 202, such as the number of CI Initiatives, with the help of the client. Next, the difficulty rating 204 is determined. Each of these determinations depends on the previous experience of the consultant. The provider or consultant with a high level of experience will have a greater likelihood of determining the correct number and difficulty ratings. The number and difficulty ratings are input into a computer program. In the preferred embodiment, the computer program is a spreadsheet, such as EXCEL, by Microsoft Corp. of Redmond, Washington, USA. The consultant and the client will continue to determine the number and difficulty ratings for each of the remaining estimating factors 206.
After the difficulty rating has been determined for all of the estimating factors, this information is transferred to an assumption sheet 208, and the assumptions for each factor are defined. The assumption sheet 208 allows the consultant to enter comments relating to each estimating factor and to document the underlying reasoning for a specific estimating factor.
TABLE 1
Next, an estimating worksheet is generated and reviewed 210 by the consultant, client, or both. An example of a worksheet is shown in Figures 7a-b. The default estimates of the time required for each task will populate the worksheet, with time estimates based on the quantity and difficulty ratings previously assigned to the estimating factors that correspond to each task. The amount of time per task is based on a predetermined time per unit required for the estimating factor, multiplied by a factor corresponding to the level of difficulty. Each task listed on the worksheet is described above in connection with the details of the method for providing the OMM assessment function. The same numbers in the description of the method above correspond to the same steps, tasks, and task packages of activities shown on the worksheet of Figures 7a-b. The worksheet is reviewed 210 by the provider and the client for accuracy. Adjustments can be made to task level estimates either by returning to the factors sheet and adjusting the units 212 or by entering an override estimate in the 'Used' column 214 on the worksheet. This override may be used when the estimating factor produces a task estimate that is not appropriate for the task, for example, when a task is not required on a particular project. Next, the provider and the client review and adjust, if necessary, the staffing factor allocations 216 for the seniority levels of personnel needed for the project. Referring to Figures 7a-b, these columns are designated as Partner - "Ptnr", Manager - "Mgr", Consultant - "Cnslt", and Analyst - "Anlst", respectively. These allocations are adjusted to meet project requirements and are typically based on experience with delivering various stages of a project. It should be noted that the staffing factors should add up to 1.
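The worksheet arithmetic described above, a default per-task estimate of units times a predetermined time per unit times a difficulty multiplier, an optional 'Used' override, and a staffing split that must add up to 1, can be sketched as follows. The multipliers, base times, and staffing percentages are illustrative assumptions, not the estimator's actual figures.

```python
# Hypothetical difficulty multipliers; the estimator's real factors differ.
DIFFICULTY = {"simple": 0.75, "moderate": 1.0, "complex": 1.5}

def task_days(units, days_per_unit, difficulty, override=None):
    """Default estimate = units * predetermined days per unit * difficulty
    factor, unless an override (the worksheet's 'Used' column) is given."""
    if override is not None:
        return override
    return units * days_per_unit * DIFFICULTY[difficulty]

def split_by_role(days, staffing):
    """Allocate a task's days across Ptnr/Mgr/Cnslt/Anlst.
    The staffing factors must add up to 1, as the text notes."""
    assert abs(sum(staffing.values()) - 1.0) < 1e-9
    return {role: days * f for role, f in staffing.items()}

days = task_days(units=5, days_per_unit=2.0, difficulty="complex")
alloc = split_by_role(days, {"Ptnr": 0.05, "Mgr": 0.15, "Cnslt": 0.4, "Anlst": 0.4})
```

Entering an override reproduces the worksheet behavior of bypassing the factor-driven default when a task is not required or the computed estimate is inappropriate.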
The consultant or provider and the client then review the work plan 218, and may optionally include labor to be provided by the client. In one embodiment, the work plan contains the total time required in days per stage and in days per task required to complete the project. Tasks may be aggregated into a "task package" of subtasks or activities for convenience. A worksheet, as shown in Figures 7a-b, may also be used for convenience. This worksheet may be used to adjust tasks or times as desired, from the experience of the provider, the customer, or both.
Finally, a financial estimate is generated in which the provider and client enter the agreed upon billing rates for Ptnr, Mgr, Cnslt, and Anlst 220. The total estimated payroll cost for the project will then be computed and displayed, generating final estimates. A determination of out-of-pocket expenses 222 may then be applied to the final estimates to determine a final project cost 224. Preferably, the provider will review the final estimates with an internal functional expert 226.
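The financial roll-up of steps 220-224, days by role times the agreed billing rate plus out-of-pocket expenses, reduces to a weighted sum; the rates and day counts below are made-up examples.

```python
def project_cost(role_days, rates, out_of_pocket=0.0):
    """Payroll estimate = sum over roles of (estimated days * billing rate),
    with out-of-pocket expenses applied to reach the final project cost."""
    payroll = sum(role_days[role] * rates[role] for role in role_days)
    return payroll + out_of_pocket

# Illustrative figures only; actual billing rates are agreed per engagement.
role_days = {"Ptnr": 3, "Mgr": 12, "Cnslt": 40, "Anlst": 45}
rates = {"Ptnr": 3000, "Mgr": 2000, "Cnslt": 1200, "Anlst": 800}
total = project_cost(role_days, rates, out_of_pocket=15000)
```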
Other costs may also be added to the project, such as hardware and software purchase costs, project management costs, and the like. Typically, project management costs for managing the provider's work are included in the estimator. These are task dependent and usually run between 10 and 15% of the tasks being managed, depending on the level of difficulty. These management allocations may appear on the worksheet and work plan. The time allocations for planning and managing a project are typically broken down for each of a plurality of task packages, where the task packages are planning project execution 920, organizing project resources 940, controlling project work 960, and completing project 990, as shown in Figure 6.
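The 10-15% management allocation described above can be expressed directly; mapping each difficulty level to a specific percentage within that range is an assumption for illustration.

```python
# Assumed mapping of difficulty to a management percentage in the 10-15% range.
MGMT_PCT = {"simple": 0.10, "moderate": 0.125, "complex": 0.15}

def management_days(managed_task_days, difficulty):
    """Project management time as a fraction of the task days being managed."""
    return managed_task_days * MGMT_PCT[difficulty]
```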
It will be appreciated that a wide range of changes and modifications to the method as described are contemplated. Accordingly, while preferred embodiments have been shown and described in detail by way of examples, further modifications and embodiments are possible without departing from the scope of the invention as defined by the examples set forth. It is therefore intended that the invention be defined by the appended claims and all legal equivalents.

Claims

1. A method for providing an operations maturity model (OMM) assessment function for an information technology organization, the method comprising: (a) defining capability requirements for said OMM assessment function;
(b) assessing current capabilities for said OMM assessment function;
(c) identifying and resolving gaps for said OMM assessment function; and
(d) developing OMM capability blueprints for said OMM assessment function.
2. The method of claim 1 wherein said defining includes at least one of the following: (e) defining scope and objectives;
(f) executing and evaluating OMM questionnaires;
(g) obtaining work products and documentation; and (h) creating and conducting OMM kickoff presentations.
3. The method of claim 1 wherein said assessing includes at least one of the following:
(e) scheduling and conducting function interviews;
(f) analyzing work products and documentation;
(g) following up to solidify data; (h) categorizing data by function; (i) rating base practices;
(j) rating generic practices;
(k) consolidating data; and
(I) preparing function profiles.
4. The method of claim 1 wherein said identifying and resolving includes at least one of the following:
(e) determining continuous improvement initiatives;
(f) identifying alternatives; (g) estimating costs of improvements;
(h) assessing timing implications; and (i) selecting continuous improvement initiatives to start.
5. The method of claim 1 wherein said developing includes at least one of the following: (e) prioritizing continuous improvement initiatives;
(f) developing capability delivery approach;
(g) preparing and presenting final results documentation and presentation; and
(h) modifying delivery plans as needed.
6. A method for providing an estimate for building an OMM assessment function in an information technology organization, the method comprising:
(a) obtaining a plurality of estimating factors;
(b) determining a difficulty rating for each of said estimating factors;
(c) generating a time allocation for building said OMM assessment function based on said estimating factors and said difficulty ratings; and
(d) generating a cost for building said OMM assessment based on said time allocation.
7. The method as recited in claim 6, wherein obtaining said estimating factor further includes receiving said estimating factors from a client.
8. The method as recited in claim 7, wherein said estimating factors include the number of at least one of continuous improvement initiatives, functions, and levels.
9. The method as recited in claim 6, wherein said difficulty rating is selected from the group consisting of simple, moderate, and complex.
10. The method as recited in claim 6, wherein said time allocation includes time allocated for a plurality of individual team members where said individual team members include at least one of partner, manager, consultant, and analyst.
11. The method as recited in claim 6, wherein said cost depends on said time allocation and a pay rate for said individual team member.
12. The method as recited in claim 6, wherein said cost is broken down for each of a plurality of stages for building said OMM assessment function where said stages include at least one of plan and manage and business architecture stages.
13. The method as recited in claim 6, wherein said time allocation is used to generate a project work plan.
14. The method as recited in claim 6, wherein said pay rate is used to generate a financial summary of said cost.
15. The method as recited in claim 6, wherein said work plan is broken down for each of a plurality of stages for building said OMM assessment where said stages are plan and manage and business architecture stages.
16. The method as recited in claim 6, wherein said plan and manage stage is broken down for each of a plurality of task packages where said task packages are plan project execution, organize project resources, control project work, and complete project.
17. A computer system for allocating time and computing cost for building an OMM assessment function in an information technology system, comprising:
(a) a processor;
(b) a software program for receiving a plurality of estimating factors and difficulty rating for each of said estimating factors and generating a time allocation and cost for building said OMM assessment; and
(c) a memory that stores said time allocation and cost under control of said processor.
EP00973433A 1999-10-06 2000-10-06 Method and estimator for providing operations maturity model assessment Withdrawn EP1226523A4 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US15825999P 1999-10-06 1999-10-06
US158259P 1999-10-06
PCT/US2000/027856 WO2001025970A1 (en) 1999-10-06 2000-10-06 Method and estimator for providing operations maturity model assessment

Publications (2)

Publication Number Publication Date
EP1226523A1 EP1226523A1 (en) 2002-07-31
EP1226523A4 true EP1226523A4 (en) 2003-02-19

Family

ID=22567316

Family Applications (2)

Application Number Title Priority Date Filing Date
EP00979124A Withdrawn EP1222510A4 (en) 1999-10-06 2000-10-06 Organization of information technology functions
EP00973433A Withdrawn EP1226523A4 (en) 1999-10-06 2000-10-06 Method and estimator for providing operations maturity model assessment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP00979124A Withdrawn EP1222510A4 (en) 1999-10-06 2000-10-06 Organization of information technology functions

Country Status (4)

Country Link
EP (2) EP1222510A4 (en)
AU (12) AU1653901A (en)
CA (1) CA2386788A1 (en)
WO (12) WO2001026014A1 (en)

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2002256550A1 (en) * 2000-12-11 2002-06-24 Skill Development Associates Ltd Integrated business management system
US7937281B2 (en) 2001-12-07 2011-05-03 Accenture Global Services Limited Accelerated process improvement framework
US7035809B2 (en) * 2001-12-07 2006-04-25 Accenture Global Services Gmbh Accelerated process improvement framework
AU2003282996A1 (en) 2002-10-25 2004-05-25 Science Applications International Corporation Determining performance level capabilities using predetermined model criteria
DE10331207A1 (en) 2003-07-10 2005-01-27 Daimlerchrysler Ag Method and apparatus for predicting failure frequency
US8572003B2 (en) * 2003-07-18 2013-10-29 Sap Ag Standardized computer system total cost of ownership assessments and benchmarking
US8566147B2 (en) * 2005-10-25 2013-10-22 International Business Machines Corporation Determining the progress of adoption and alignment of information technology capabilities and on-demand capabilities by an organization
EP1808803A1 (en) * 2005-12-15 2007-07-18 International Business Machines Corporation System and method for automatically selecting one or more metrics for performing a CMMI evaluation
US8457297B2 (en) 2005-12-30 2013-06-04 Aspect Software, Inc. Distributing transactions among transaction processing systems
US8355938B2 (en) 2006-01-05 2013-01-15 Wells Fargo Bank, N.A. Capacity management index system and method
US7523082B2 (en) * 2006-05-08 2009-04-21 Aspect Software Inc Escalating online expert help
US20080208667A1 (en) * 2007-02-26 2008-08-28 Gregg Lymbery Method for multi-sourcing technology based services
EP2210227A2 (en) * 2007-10-25 2010-07-28 Markport Limited Modification of service delivery infrastructure in communication networks
US8326660B2 (en) 2008-01-07 2012-12-04 International Business Machines Corporation Automated derivation of response time service level objectives
US8320246B2 (en) * 2009-02-19 2012-11-27 Bridgewater Systems Corp. Adaptive window size for network fair usage controls
US8200188B2 (en) 2009-02-20 2012-06-12 Bridgewater Systems Corp. System and method for adaptive fair usage controls in wireless networks
US9203629B2 (en) 2009-05-04 2015-12-01 Bridgewater Systems Corp. System and methods for user-centric mobile device-based data communications cost monitoring and control
US8577329B2 (en) 2009-05-04 2013-11-05 Bridgewater Systems Corp. System and methods for carrier-centric mobile device data communications cost monitoring and control
US20110066476A1 (en) * 2009-09-15 2011-03-17 Joseph Fernard Lewis Business management assessment and consulting assistance system and associated method
US20110231229A1 (en) * 2010-03-22 2011-09-22 Computer Associates Think, Inc. Hybrid Software Component and Service Catalog
WO2012057747A1 (en) 2010-10-27 2012-05-03 Hewlett-Packard Development Company, L.P. Systems and methods for scheduling changes
WO2015126409A1 (en) 2014-02-21 2015-08-27 Hewlett-Packard Development Company, L.P. Migrating cloud resources
US10148757B2 (en) 2014-02-21 2018-12-04 Hewlett Packard Enterprise Development Lp Migrating cloud resources
US20170032297A1 (en) * 2014-04-03 2017-02-02 Dale Chalfant Systems and Methods for Increasing Capability of Systems of Business Through Maturity Evolution
US10044786B2 (en) 2014-11-16 2018-08-07 International Business Machines Corporation Predicting performance by analytically solving a queueing network model
US9984044B2 (en) 2014-11-16 2018-05-29 International Business Machines Corporation Predicting performance regression of a computer system with a complex queuing network model
US10460272B2 (en) * 2016-02-25 2019-10-29 Accenture Global Solutions Limited Client services reporting
CN106682385B (en) * 2016-09-30 2020-02-11 广州英康唯尔互联网服务有限公司 Health information interaction system
JP7246407B2 (en) * 2018-04-16 2023-03-27 クラウドブルー エルエルシー Systems and methods for aligning revenue streams in a cloud service broker platform
US11481711B2 (en) 2018-06-01 2022-10-25 Walmart Apollo, Llc System and method for modifying capacity for new facilities
CA3101836A1 (en) 2018-06-01 2019-12-05 Walmart Apollo, Llc Automated slot adjustment tool
US11483350B2 (en) 2019-03-29 2022-10-25 Amazon Technologies, Inc. Intent-based governance service
CN110096423A (en) * 2019-05-14 2019-08-06 深圳供电局有限公司 A kind of server memory capacity analyzing and predicting method based on big data analysis
US11119877B2 (en) 2019-09-16 2021-09-14 Dell Products L.P. Component life cycle test categorization and optimization
WO2021096893A1 (en) * 2019-11-11 2021-05-20 Snapit Solutions Llc System for producing and delivering information technology products and services
US11288150B2 (en) 2019-11-18 2022-03-29 Sungard Availability Services, Lp Recovery maturity index (RMI)-based control of disaster recovery
US20210160143A1 (en) 2019-11-27 2021-05-27 Vmware, Inc. Information technology (it) toplogy solutions according to operational goals
US11501237B2 (en) 2020-08-04 2022-11-15 International Business Machines Corporation Optimized estimates for support characteristics for operational systems
US11329896B1 (en) 2021-02-11 2022-05-10 Kyndryl, Inc. Cognitive data protection and disaster recovery policy management

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4827423A (en) * 1987-01-20 1989-05-02 R. J. Reynolds Tobacco Company Computer integrated manufacturing system
JPH03111969A (en) * 1989-09-27 1991-05-13 Hitachi Ltd Method for supporting plan formation
US5233513A (en) * 1989-12-28 1993-08-03 Doyle William P Business modeling, software engineering and prototyping method and apparatus
WO1993012488A1 (en) * 1991-12-13 1993-06-24 White Leonard R Measurement analysis software system and method
US5701419A (en) * 1992-03-06 1997-12-23 Bell Atlantic Network Services, Inc. Telecommunications service creation apparatus and method
US5586021A (en) * 1992-03-24 1996-12-17 Texas Instruments Incorporated Method and system for production planning
US5646049A (en) * 1992-03-27 1997-07-08 Abbott Laboratories Scheduling operation of an automated analytical system
US5978811A (en) * 1992-07-29 1999-11-02 Texas Instruments Incorporated Information repository system and method for modeling data
US5630069A (en) * 1993-01-15 1997-05-13 Action Technologies, Inc. Method and apparatus for creating workflow maps of business processes
US5819270A (en) * 1993-02-25 1998-10-06 Massachusetts Institute Of Technology Computer system for displaying representations of processes
CA2118885C (en) * 1993-04-29 2005-05-24 Conrad K. Teran Process control system
AU7207194A (en) * 1993-06-16 1995-01-03 Electronic Data Systems Corporation Process management system
US5485574A (en) * 1993-11-04 1996-01-16 Microsoft Corporation Operating system based performance monitoring of programs
US5724262A (en) * 1994-05-31 1998-03-03 Paradyne Corporation Method for measuring the usability of a system and for task analysis and re-engineering
US5563951A (en) * 1994-07-25 1996-10-08 Interval Research Corporation Audio interface garment and communication system for use therewith
US5745880A (en) * 1994-10-03 1998-04-28 The Sabre Group, Inc. System to predict optimum computer platform
JP3315844B2 (en) * 1994-12-09 2002-08-19 株式会社東芝 Scheduling device and scheduling method
JPH08320855A (en) * 1995-05-24 1996-12-03 Hitachi Ltd Method and system for evaluating system introduction effect
EP0770967A3 (en) * 1995-10-26 1998-12-30 Koninklijke Philips Electronics N.V. Decision support system for the management of an agile supply chain
US5875431A (en) * 1996-03-15 1999-02-23 Heckman; Frank Legal strategic analysis planning and evaluation control system and method
US5960417A (en) * 1996-03-19 1999-09-28 Vanguard International Semiconductor Corporation IC manufacturing costing control system and process
US5793632A (en) * 1996-03-26 1998-08-11 Lockheed Martin Corporation Cost estimating system using parametric estimating and providing a split of labor and material costs
US5960200A (en) * 1996-05-03 1999-09-28 I-Cube System to transition an enterprise to a distributed infrastructure
US5673382A (en) * 1996-05-30 1997-09-30 International Business Machines Corporation Automated management of off-site storage volumes for disaster recovery
US5864483A (en) * 1996-08-01 1999-01-26 Electronic Data Systems Corporation Monitoring of service delivery or product manufacturing
US5974395A (en) * 1996-08-21 1999-10-26 I2 Technologies, Inc. System and method for extended enterprise planning across a supply chain
US5930762A (en) * 1996-09-24 1999-07-27 Rco Software Limited Computer aided risk management in multiple-parameter physical systems
US6044354A (en) * 1996-12-19 2000-03-28 Sprint Communications Company, L.P. Computer-based product planning system
US5903478A (en) * 1997-03-10 1999-05-11 Ncr Corporation Method for displaying an IT (Information Technology) architecture visual model in a symbol-based decision rational table
US6028602A (en) * 1997-05-30 2000-02-22 Telefonaktiebolaget Lm Ericsson Method for managing contents of a hierarchical data model
US6106569A (en) * 1997-08-14 2000-08-22 International Business Machines Corporation Method of developing a software system using object oriented technology
US6092047A (en) * 1997-10-07 2000-07-18 Benefits Technologies, Inc. Apparatus and method of composing a plan of flexible benefits
US6131099A (en) * 1997-11-03 2000-10-10 Moore U.S.A. Inc. Print and mail business recovery configuration method and system
US6119097A (en) * 1997-11-26 2000-09-12 Executing The Numbers, Inc. System and method for quantification of human performance factors
US6157916A (en) * 1998-06-17 2000-12-05 The Hoffman Group Method and apparatus to control the operating speed of a papermaking facility

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
No Search *
See also references of WO0125970A1 *

Also Published As

Publication number Publication date
WO2001025876A3 (en) 2001-08-30
AU8001700A (en) 2001-05-10
WO2001026011A1 (en) 2001-04-12
AU7756600A (en) 2001-05-10
EP1222510A2 (en) 2002-07-17
WO2001026008A1 (en) 2001-04-12
AU1193801A (en) 2001-05-10
WO2001026012A1 (en) 2001-04-12
AU7996100A (en) 2001-05-10
WO2001026010A1 (en) 2001-04-12
WO2001026013A1 (en) 2001-04-12
WO2001025970A1 (en) 2001-04-12
WO2001026005A1 (en) 2001-04-12
WO2001025876A2 (en) 2001-04-12
WO2001026028A1 (en) 2001-04-12
AU1653901A (en) 2001-05-10
EP1226523A1 (en) 2002-07-31
AU7866600A (en) 2001-05-10
WO2001025877A3 (en) 2001-09-07
WO2001026028A8 (en) 2001-07-26
WO2001025970A8 (en) 2001-09-27
AU1431801A (en) 2001-05-10
EP1222510A4 (en) 2007-10-31
WO2001026014A1 (en) 2001-04-12
WO2001026007A1 (en) 2001-04-12
AU1431701A (en) 2001-05-10
AU1193601A (en) 2001-05-10
WO2001025877A2 (en) 2001-04-12
AU7996000A (en) 2001-05-10
CA2386788A1 (en) 2001-04-12
AU7861800A (en) 2001-05-10
AU8001800A (en) 2001-05-10

Similar Documents

Publication Publication Date Title
WO2001025970A1 (en) Method and estimator for providing operations maturity model assessment
US6738736B1 (en) Method and estimator for providing capacacity modeling and planning
US8712826B2 (en) Method for measuring and improving organization effectiveness
US8244565B2 (en) Individual productivity and utilization tracking tool
US8548840B2 (en) Method and system for managing a strategic plan via defining and aligning strategic plan elements
US8799210B2 (en) Framework for supporting transition of one or more applications of an organization
US6968312B1 (en) System and method for measuring and managing performance in an information technology organization
CA2510091A1 (en) Information technology transformation assessment tools
WO1997031320A1 (en) Strategic management system
US20040181446A1 (en) Method, system and apparatus for managing workflow in a workplace
Moura et al. Research challenges of Business-Driven IT management
Heier et al. Examining the relationship between IT governance software and business value of IT: Evidence from four case studies
Terrell et al. Earned Value Management (EVM) Implementation Handbook
Kwak A systematic approach to evaluate quantitative impacts of project management (PM)
Praeg et al. Perspectives of IT-service quality management: a concept for life cycle based quality management of IT-services
McKeown et al. Evaluation of a metrics framework for product and process integrity
Johanning The Process Organization of IT: Which IT Processes and Structures Does a Modern and Lean IT Organization of the Future Need?
Sadler Reference Guide for Project-Control Account Managers
Ngala A Framework for User Involvement in Enterprise Resource Planning System Implementation
AU2010201888B2 (en) Individual productivity and utilization tracking tool
Gamble Measuring effectiveness of constant work in progress system in increasing human resources technology throughput
Mielke IQ PRINCIPLES IN SOFTWARE DEVELOPMENT: IQ-2005
Rosink The delivery mode: the effect of differentiation in needs and strength of coupling
Clapp et al. A guide to conducting independent technical assessments
Eidegård Project factors-A possible way to create a data bank of project experience.

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20020506

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

A4 Supplementary search report drawn up and despatched

Effective date: 20021230

RIC1 Information provided on ipc code assigned before grant

Ipc: 7G 06F 17/00 B

Ipc: 7G 06F 17/30 A

Ipc: 7G 06F 17/60 B

D17P Request for examination filed (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20030507