Umbrella Activities


Transcript of Umbrella Activities

Page 1: Umbrella Activities

Umbrella Activities


In addition to the framework activities, the APM defines a set of umbrella activities that persist across the entire software process. These umbrella activities include:

• software project management

• formal technical reviews

• software quality assurance

• software configuration management

• reusability management

• measurement

• document preparation and production

• risk management

Each of these umbrella activities is defined by a set of tasks that are adapted to the project type and degree of rigor with which software engineering is to be applied.

Software Project Management

U.1.1 Perform an adaptation criteria analysis.

Intent: The intent of this task is to establish the basis from which the appropriate APM task set will be selected.

Mechanics: See Selecting a Task Set for a Project.

Application of Formal Methods: selection scheme described in Selecting a Task Set for a Project

Application of CASE Tools: none

SQA Checklist: none

Do’s & Don’ts:

Do: Be as honest as possible in addressing the adaptation criteria questionnaire.

Do: Use common sense and experience, as well as the results of the adaptation criteria analysis, to select the appropriate task set.

Don’t: Eliminate the possibility of defining a boundary project (i.e., one that spans two project categories and uses tasks adapted from each).


Helpful Hints:

1. When addressing each of the adaptation characteristics, be certain to err on the side of conservatism. That is, if you can’t decide between a grade of 2 or 3 for a particular criterion, it’s usually best to opt for the grade of 3.

2. Don’t abandon common sense! Even if the task set selector points to one level of software engineering rigor, other circumstances may militate toward the selection of a more or less formal approach. Consider the entire picture.

Deliverables: The task set selector (TSS) grade

U.1.2 Select the software engineering task set.

U.1.2.1 Use the task set selector to guide the choice of task set.

U.1.2.2 Define specific subtasks for the project.

U.1.2.3 Define specific milestones and deliverables based on the task set chosen.

U.1.2.4 Define SQA points throughout the project. See also Task U.3.

Intent: The intent of this task is to select the software engineering task set that is to be used on your project. The task set becomes your work breakdown structure: the set of tasks that will be scheduled and tracked as the project proceeds.

Mechanics: Using the results of Task U.1.1, the appropriate task set is chosen.

Application of Formal Methods: none

Application of CASE Tools: none

SQA Checklist: none

Do’s & Don’ts

Do: Use one of the predefined task sets.

Do: Amend the task set if special conditions exist.

Don’t: Forget umbrella activities. The tasks associated with them will also be part of your project.

Helpful Hints

1. Always begin with one of the predefined task sets and, if at all possible, use it as the work breakdown structure for your project. However, if you decide that one aspect of your project demands more rigor than the task set implies, it is both permissible and recommended that you mix and match portions of two different task sets.

2. Be certain to identify specific deliverables (their outline, format and content) before leaving this task.

Deliverables: Task set for the project and list of project deliverables

U.1.3 Bound the scope of the software effort.

Intent: The intent of this task is to determine project scope after basic requirements have been identified.

Mechanics: Meetings with the customer (e.g., JAD meetings) may be conducted, or a functional specification (if one already exists) can be examined.


Application of Formal Methods: joint application design can be used to elicit the project scope

Application of CASE Tools: none

SQA Checklist:

1. Does the statement of scope avoid ambiguity?

2. Are all quantitative references (e.g., "many, some, a few") bounded by an actual value?

3. Is the statement of scope consistent?

4. Does the statement of scope discuss the input, processing and output for the application?

5. Does it discuss the producers of input and the consumers of output?

6. Does it mention systems that must be interoperable with the application?

Do’s & Don’ts

Do: Try to meet face to face with the customer.

Do: Structure your meeting in a manner that will make it most effective.

Don’t: Allow ambiguous requirements to guide your planning effort.

Helpful Hints

1. Work closely with the customer during this task. If the customer "doesn’t have the time to meet with you," delay work until time can be scheduled.

2. The statement of scope should be written in a way that enables any reader to get a picture of what the application is supposed to accomplish. It should be capable of standing on its own.

Deliverables: Statement of scope

U.1.4 Decompose product functionality.

Intent: The intent of this task is to define major software functions. In essence, a "function tree" is created so that estimates can be generated in Tasks U.1.5, U.1.6, and U.1.7.

Mechanics: One way to accomplish decomposition is to perform a ‘grammatical parse’ on the statement of scope. All verbs in the statement of scope are candidate functions, at a first level of refinement.

Application of Formal Methods: grammatical parse on the statement of scope

Application of CASE Tools: none

SQA Checklist:

1. Does the decomposition from one level of the tree to the next expand by no more than five functions per node?

2. Are all functions noted at one level of the tree represented at the same level of abstraction?

3. Can you (or someone else on the project team) write a one paragraph description of each function?

4. Can you (or someone else on the project team) describe the data that flow into and out of each function?


5. Have functions been defined to manage interactions with the user? with the operating system? with the external environment?

Do’s & Don’ts:

Do: Maintain the same level of abstraction at every level of the function tree.

Do: Write a sentence or two that defines each function.

Don’t: Try to develop a comprehensive design of the system. You’re simply trying to establish a basis for estimation.

Helpful Hints:

Look at the functions and ask yourself: Can we transform all inputs into outputs using these functions?

Deliverables: Function tree and description
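As an illustration of the grammatical parse described above, the sketch below pulls candidate functions out of a statement of scope. The scope text and the verb list are hypothetical; a real parse would rely on a part-of-speech tagger or a careful human reading.

```python
# Toy 'grammatical parse': verbs in the statement of scope become
# candidate functions at the first level of refinement.
# The verb list stands in for a real part-of-speech tagger.
VERBS = {"accept", "validate", "compute", "display", "store"}

scope = ("The system will accept operator commands, validate each command, "
         "compute the resulting totals, store them, and display a summary.")

# Normalize words and keep the ones that are verbs.
words = [w.strip(",.").lower() for w in scope.split()]
candidate_functions = [w for w in words if w in VERBS]
# candidate_functions: accept, validate, compute, store, display
```

Each candidate function then becomes a node in the function tree, to be refined or discarded as the decomposition proceeds.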

U.1.5 Estimate the size (in LOC or FP) of each product function.

Intent: The intent of this task is to estimate the size in LOC (lines of code) or function points for each function or subfunction defined in Task U.1.4.

Mechanics: Each function is selected from the hierarchy derived in Task U.1.4. Experienced technical staff estimate LOC or, alternatively, use information domain characteristics of the function to estimate function points. If size cannot be determined, the function should be further decomposed until estimates are possible.

Application of Formal Methods: function point analysis

Application of CASE Tools: t.b.d.

SQA Checklist:

1. Are staff members doing the estimates experienced with software development in this application area?

2. Does the estimate of size conform to the actual size of similar applications already completed?

3. Has a convention for LOC been established? Alternatively, have conventions for function point analysis been defined for the team?

Do’s & Don’ts

Do: Get a number of different people to generate the estimates independently.

Do: Use more than one data point in generating your final estimate.

Don’t: Expect +/- 1 percent accuracy. It is not achievable.

Helpful Hints

Create an historical table of functions developed for past projects. Note the size and/or function points for each function in the table. This will help you to estimate similar functions for the new project.


Deliverables: Estimate of size (in LOC or function points)
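Where function points are used, the unadjusted count is a weighted sum of the information domain characteristics, scaled by the fourteen value adjustment factors. A minimal sketch using the standard average complexity weights; the counts for the example function are hypothetical.

```python
# Standard average-complexity weights from function point analysis.
AVG_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

def function_points(counts, adjustment_sum):
    """FP = UFP * (0.65 + 0.01 * sum of the 14 value adjustment factors)."""
    ufp = sum(AVG_WEIGHTS[k] * counts.get(k, 0) for k in AVG_WEIGHTS)
    return ufp * (0.65 + 0.01 * adjustment_sum)

# Hypothetical counts for one product function:
counts = {"external_inputs": 3, "external_outputs": 2,
          "external_inquiries": 2, "internal_logical_files": 1}
fp = function_points(counts, adjustment_sum=42)  # UFP = 40, FP = 42.8
```

The same routine is applied function by function down the tree from Task U.1.4, and the per-function values are summed to give the product estimate.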

U.1.6 Acquire historical data from development efforts for similar projects/products.

Intent: The intent of this task is to use data acquired from past projects to develop values for production rate (LOC/person-month or FP/person-month). These data are used in the estimation approaches defined in Tasks U.1.8 and U.1.9.

Mechanics: Past projects are revisited and applied effort for each is determined. In addition, the total LOC and/or function points are determined. Production rates are calculated. Average values may be categorized by application category and project type.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist:

1. Is the standard deviation of average production rate for each value reasonably low?

2. Is the effort data collected from the projects derived from reliable sources?

Do’s & Don’ts

Do: Try to collect data over the past two or three years.

Do: Make note of significant environmental changes (e.g., new tools, new people) that may have had a pronounced effect on production rate.

Don’t: Expect little scatter in the data. There will be scatter.

Don’t: Rely on qualitative estimates. Get quantitative data.

Helpful Hints

1. Be sure you account for part-time labor on projects. Not everyone working on the project worked full time. Also, be sure to account for contractors who may have worked on a project.

2. Account for any reused code that added to the total size produced.

Deliverables: Productivity rate table
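The production rate calculation reduces to dividing delivered size by applied effort for each past project and averaging the results. A sketch with hypothetical project data; the standard deviation at the end corresponds to SQA checklist item 1.

```python
from statistics import mean, stdev

# Hypothetical historical data: (total LOC delivered, person-months applied).
past_projects = [
    (12_100, 24), (27_200, 62), (20_200, 43), (33_900, 77),
]

# Production rate for each project, in LOC per person-month.
rates = [loc / pm for loc, pm in past_projects]
avg_rate = mean(rates)
spread = stdev(rates)  # a low value suggests the average is safe to use
```

In practice the averages would be categorized by application category and project type, as the Mechanics paragraph notes, rather than pooled into a single figure.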

U.1.7 Develop a project estimate using a task-function-effort table.

Intent: The intent of this task is to develop a matrix of software engineering tasks (determined after the appropriate task set has been selected) and product functions. Effort estimates are made for each cell in the matrix.

Mechanics: See Pressman, R.S., A Manager’s Guide to Software Engineering, McGraw-Hill, 1993, pp. 229-235.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.


SQA Checklist:

1. Is approximately 40 to 50 percent of all effort allocated to tasks that precede code generation?

2. Does testing absorb at least 30 percent of overall effort?

Do’s & Don’ts

Do: Estimate APM major tasks only.

Don’t: Assume that all functions will require the same level of effort. Account for differing levels of complexity.

Helpful Hints

Compare the estimate developed using this technique with an ‘experiential estimate’ (i.e., a gross estimate of effort using past experience on similar projects) made independently.

Deliverables: Estimate of effort in work-units (e.g., person-months)
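The task-function-effort matrix described above can be held in a simple nested table: one effort cell per (task, function) pair, with column sums giving per-function effort and a grand total for the project. All task names, function names, and numbers below are hypothetical.

```python
functions = ["ui", "reporting", "database"]
tasks = ["analysis", "design", "code", "test"]

# effort[task][function], in person-months (hypothetical estimates).
effort = {
    "analysis": {"ui": 0.5, "reporting": 0.7, "database": 1.0},
    "design":   {"ui": 1.0, "reporting": 1.5, "database": 2.0},
    "code":     {"ui": 0.8, "reporting": 1.2, "database": 1.5},
    "test":     {"ui": 1.0, "reporting": 1.8, "database": 2.0},
}

# Column sums and the project total.
per_function = {f: sum(effort[t][f] for t in tasks) for f in functions}
total_effort = sum(per_function.values())  # 15.0 person-months
```

With these numbers, analysis plus design account for 6.7 of 15.0 person-months, about 45 percent, which falls inside the 40-to-50-percent band the SQA checklist asks about.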

U.1.8 Develop a project estimate using a size-oriented approach.

Intent: The intent of this task is to develop a table of product functions and then generate efforts estimates from the table.

Mechanics: Develop size estimates for each function using data derived from Task U.1.5. Data derived in Task U.1.6 are used to compute estimated effort for each function. See Pressman, R.S., A Manager's Guide to Software Engineering, McGraw-Hill, 1993, pp. 225-229.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist: none

Do’s & Don’ts

Do: Adjust the average productivity rate to account for the complexity of a function or for the entire system.

Don’t: Force this estimate to conform to the earlier estimate derived in Task U.1.7.

Helpful Hints

The results of Tasks U.1.5 and U.1.6 provide necessary input for this task. If productivity rates are unavailable for your organization, a gross estimate may be derived by using the following values: 1000 LOC/person-month or approximately 8 to 10 function points/person-month.

Deliverables: Estimate of effort in work-units (e.g., person-months)
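A sketch of the size-oriented calculation, using the gross rate of 1000 LOC/person-month suggested in the helpful hint. The function names and sizes are hypothetical; a locally derived rate from Task U.1.6 should replace the gross figure whenever one exists.

```python
# Fallback productivity figure for organizations with no historical data.
GROSS_RATE = 1000  # LOC per person-month

# Hypothetical per-function size estimates from Task U.1.5, in LOC.
function_sizes = {"ui": 2_400, "reporting": 5_300, "database": 6_800}

# Effort per function and for the project, in person-months.
effort_by_function = {f: loc / GROSS_RATE for f, loc in function_sizes.items()}
total_effort = sum(effort_by_function.values())  # 14.5 person-months
```

Per the Do above, the rate would normally be adjusted per function (or for the whole system) to account for complexity before the division is done.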

U.1.9 Develop a project estimate using an empirical model.


Intent: The intent of this task is to use an empirical estimation model (e.g., COCOMO II) to estimate project effort and duration.

Mechanics: See Pressman, R.S., A Manager’s Guide to Software Engineering, McGraw-Hill, 1993, pp. 235-240.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist:

1. Has the empirical model been calibrated to the local environment?

2. Have this and other estimates taken umbrella activities into account?

Do’s & Don’ts

Do: Use the size value derived in Task U.1.5.

Do: Be sure that you understand the assumptions that form the basis of any empirical model.

Don’t: Assume that you can divide the estimated effort by the projected project duration and get an accurate estimate of the number of people required to staff the project team.

Helpful Hints: none

Deliverables:

1. Estimate of effort in work-units (e.g., person-months)

2. Project duration in months
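As a worked illustration of an empirical model: the task points at COCOMO II, whose calibration tables are too long to reproduce here, so this sketch uses the older basic COCOMO equations for an organic-mode project instead. The 33.2 KLOC input is hypothetical.

```python
def basic_cocomo_organic(kloc):
    """Basic COCOMO, organic mode: E = 2.4 * KLOC**1.05, D = 2.5 * E**0.38."""
    effort = 2.4 * kloc ** 1.05       # person-months
    duration = 2.5 * effort ** 0.38   # calendar months
    return effort, duration

# Hypothetical size estimate carried over from Task U.1.5.
effort, duration = basic_cocomo_organic(33.2)
# Note: effort / duration gives only an *average* staffing level, not a
# hiring target (see the Don't above).
```

Whatever model is used, its coefficients must be calibrated to the local environment before the output is trusted, as SQA checklist item 1 requires.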

U.1.10 Estimate all project resources (i.e., people, skills, hardware, facilities, etc.).

U.1.10.1 Define skills required for the project and windows when particular skills will be required. This may be delayed until U.1.15 is complete.

U.1.10.2 List special hardware and/or software tools required for this project. Check availability of these tools.

U.1.10.3 Determine whether special facilities or environments are required for the project.

U.1.10.4 Specify training needs for project staff and establish a training schedule that will deliver knowledge just-in-time.

Intent: The intent of this task is to define specific project resources. How many people? What special tools? What facilities? These and other questions must be answered.

Mechanics: The results of Tasks U.1.1 through U.1.9 are reviewed and estimates of project resources are developed.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist:


1. Has availability of specialized human skills or specialized hardware/software or other technology been taken into account?

2. Is training budgeted for all project staff?

Do’s & Don’ts

Don’t: Assume that resources will be available when you need them. Be sure to create contingency plans for resources that are unavailable when needed.

Helpful Hints: none

Deliverables: List of special skills, hardware/software resources, special facilities, and training needs

U.1.11 Reconcile estimates and develop a combined estimate.

Intent: The intent of this task is to reconcile the estimates developed in tasks U.1.7 through U.1.9 and derive a combined estimate that will be used for budgeting and scheduling.

Mechanics: The different estimation data points are evaluated, differences are reconciled, and a combined estimate is derived.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist: none

Do’s & Don’ts

Don’t: Assume that estimation data points will be within +/- 5 percent. In general, accuracy is within +/- 20 percent.

Helpful Hints

Emotion and past experience play a role in this activity. Assign each estimate a ‘degree of comfort’ and use this designation as an indicator as you work to come up with a combined estimate. You should give more weight to estimates with high degrees of comfort.

Deliverables: combined estimate of effort and duration

U.1.12 Perform risk analysis. [For more detail, see Task U.8]

U.1.12.1 Identify all project and technology risks.

U.1.12.2 Estimate risk probability and impact and establish "cut-offs."

U.1.12.3 Develop risk mitigation, monitoring and management plan for all high probability/impact risks.

Intent: The intent of this task is to develop a plan for mitigating risks so that they are avoided. Also identify ways to monitor the project so that potential problems become visible as early as possible. Finally, develop a plan for managing risks that do become real problems.

Mechanics: See Pressman, R.S., A Manager’s Guide to Software Engineering, McGraw-Hill, 1993, pp. 245-271.


Application of Formal Methods: formal risk analysis

Application of CASE Tools: t.b.d.

SQA Checklist:

1. Has an explicit risk mitigation, monitoring and management plan been defined for the project?

2. Have you provided contingency plans should a risk become real?

3. Have you provided realistic work-arounds for serious risks?

Do’s & Don’ts

Don’t: Assume that risks will remain static. A low probability risk at the beginning of a project may migrate to a high probability risk as the project progresses. Adjust your planning accordingly.

Helpful Hints

Create a risk table and revisit it on a regular basis.

Deliverables:

1. Risk table

2. Risk Mitigation, Monitoring and Management Plan
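The probability/impact estimates and "cut-offs" of Task U.1.12.2 lend themselves to a simple table sorted by risk exposure (probability times cost). Everything below (risk names, probabilities, cost figures, and the cut-off value) is hypothetical.

```python
# Hypothetical risk table: (description, probability, cost impact in dollars).
risks = [
    ("key staff turnover",     0.40, 60_000),
    ("size estimate too low",  0.50, 45_000),
    ("customer changes scope", 0.30, 80_000),
    ("tooling arrives late",   0.10, 20_000),
]

CUTOFF = 20_000  # manage only risks whose exposure exceeds this line

# Risk exposure = probability * cost; sort highest exposure first.
table = sorted(((p * cost, desc) for desc, p, cost in risks), reverse=True)
managed = [desc for exposure, desc in table if exposure > CUTOFF]
```

Each risk in `managed` would then get an entry in the mitigation, monitoring, and management plan; since probabilities drift as the project progresses (see the Don't above), the table is recomputed whenever it is revisited.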

U.1.13 Develop a detailed project schedule.

Intent: The intent of this task is to create a task network, a timeline chart, resource tables and other scheduling information.

Mechanics: See Pressman, R.S., A Manager’s Guide to Software Engineering, McGraw-Hill, 1993, Chapter 12.

Application of Formal Methods: critical path scheduling

Application of CASE Tools: t.b.d.

SQA Checklist:

1. Have task dependencies been defined?

2. Have umbrella tasks been scheduled, where appropriate?

3. Have formal technical reviews been scheduled?

4. Has time been scheduled for iteration?

5. Do you show milestones that are closely spaced?

6. Does each major task have a deliverable associated with it?

Do’s & Don’ts


Do: Be certain to consider task interdependencies.

Do: Keep your schedule progress up to date.

Do: Examine the critical path on a daily or weekly basis.

Don’t: Scratch out a schedule on a note pad. Use the tools provided to you.

Helpful Hints

If schedule risk is high, err on the side of micro-management. Define closely spaced milestones and small tasks. Track progress in small increments.

Deliverables:

1. Project schedule

2. Supplementary information generated by tools
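The critical path mentioned under Application of Formal Methods can be computed from the task network with a simple forward pass. The task names, durations (in days), and dependencies below are hypothetical; the critical path is the dependency chain with the longest total duration.

```python
# Hypothetical task network.
durations = {"scope": 3, "design": 5, "code": 8, "review": 2, "test": 4}
depends_on = {
    "scope": [],
    "design": ["scope"],
    "code": ["design"],
    "review": ["design"],
    "test": ["code", "review"],
}

earliest_finish = {}

def finish(task):
    """Earliest finish time for a task (forward pass, memoized)."""
    if task not in earliest_finish:
        start = max((finish(d) for d in depends_on[task]), default=0)
        earliest_finish[task] = start + durations[task]
    return earliest_finish[task]

project_days = max(finish(t) for t in durations)

# Walk backward through the latest-finishing predecessors to recover
# the critical path itself.
path = []
task = max(durations, key=finish)
while True:
    path.append(task)
    if not depends_on[task]:
        break
    task = max(depends_on[task], key=finish)
path.reverse()  # ["scope", "design", "code", "test"]
```

A day of slip anywhere on `path` slips the whole project, which is why the Do above says to examine the critical path on a daily or weekly basis.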

U.1.14 Establish an SQA plan for the project.

For disciplined and rigorous projects, develop an SQA Plan that defines the quality assurance tasks required for the project.

U.1.15 Establish an SCM plan for the project.

For disciplined and rigorous projects, develop an SCM Plan that defines the change management tasks required for the project.

U.1.16 Establish a project monitoring and tracking approach for use throughout the project.

Develop mechanisms for tracking progress and problems on a regular basis. Options include regularly scheduled status meetings, progress reports submitted by each team member, e-mail reports, and informal meetings.

U.1.17 Integrate all planning information into a Project Plan.

Develop a Software Project Plan that integrates all information developed in Tasks U.1.1 through U.1.16.

U.1.18 Review the plan and modify as required.

Review the plan with management, the project requester and staff to determine its viability.

U.1.20 Conduct project monitoring and tracking on an on-going basis.

U.1.20.1 Collect progress report information from all project team members.


U.1.20.2 Flag problems and develop a mitigation strategy.

U.1.20.3 Monitor all formal technical reviews and use their status as a final indicator of progress.

U.1.20.4 Update project schedule as actual task completion dates are reported.

U.1.20.5 Track all change requests and assess their impact on schedule.

Intent: The intent of this task is to track progress and problems that are encountered as other software engineering work occurs.

Mechanics: Monitoring and tracking are accomplished by two mechanisms: (1) regular reporting and (2) human-to-human contact.

Application of Formal Methods: critical path scheduling

Application of CASE Tools: t.b.d.

SQA Checklist: none

Do’s & Don’ts

Do: Spend time with each project team member discussing accomplishments and problems. Do this frequently.

Do: Encourage team members to voice their concerns.

Do: Use the results of formal technical reviews as a good indicator of progress.

Don’t: Be macho or establish an unyielding "can-do" atmosphere that refuses to recognize problems.

Don’t: Discourage team members from reporting progress honestly by punishing people when schedules slip.

Helpful Hints

If schedule risk is high, err on the side of micro-management. Define closely spaced milestones and small tasks and track progress in small increments.

Deliverables:

1. Project progress reports

2. Updates to the project schedule and related information

U.1.21 Collect project and process metrics.

See Task U.7.

U.1.22 Modify project estimates/schedule based on real-time information obtained as part of monitoring and tracking.

Based on information obtained from Task U.1.20, update and analyze the project plan regularly.


Formal Technical Reviews

U.2.1 Establish a review strategy before completing the project plan.

U.2.1.1 Select an appropriate set of review guidelines as part of the SQA plan developed in Task U.1.14.

U.2.1.2 Identify review points for the project.

U.2.1.3 Schedule review points as project tasks.

The project manager and staff should select review points for the project. Questions answered include: Where in the process flow should reviews be conducted? How many reviews should be conducted for each CPF activity? The project plan/schedule should identify review points explicitly and should allocate effort and duration to each. In addition, the plan should allocate effort to the work required to respond to issues uncovered during review.

U.2.2 Initiate formal technical reviews as project deliverables are produced.

Formal technical reviews are conducted as deliverables are produced. In essence, the reviews become SQA "exit conditions" for movement to the next CPF activities and tasks.

U.2.3 Conduct a formal technical review to uncover errors in a software engineering deliverable.

U.2.3.1 Assign a review leader to coordinate the review.

U.2.3.2 Distribute review materials [work product(s)] to reviewers.

U.2.3.3 Set the review agenda and select appropriate checklists.

U.2.3.4 Review work product(s) to uncover errors.

U.2.3.5 Conduct the review meeting.

U.2.3.6 Record any errors and/or issues that are raised during the review meeting.

U.2.3.7 Decide the outcome of the review.

U.2.3.8 Ensure that proper review records are kept.

U.2.3.9 Collect any defined metrics for the review.

Intent: The intent of this task is to uncover errors in any work product produced as a consequence of a software engineering task.

Mechanics: See Pressman, R.S., A Manager’s Guide to Software Engineering, McGraw-Hill, 1993, pp. 328-334.

Application of Formal Methods: walkthrough or inspection approach

Application of CASE Tools: t.b.d.

SQA Checklist:


1. Are reviews scheduled as part of the project timeline?

2. Have review team members prepared in advance?

3. Are written records maintained for each review?

Do’s & Don’ts

Do: Review small work products, not massive deliverables. The review should generally take no more than 60 to 90 minutes.

Do: Train team members in review technique and psychology before conducting reviews.

Do: Keep the tone light.

Don’t: Tolerate people who will not prepare in advance. They must.

Don’t: Review the person. Review the product.

Don’t: Let reviews turn into a personnel appraisal.

Helpful Hints

Reviews are actually technical meetings. Therefore, if you avoid the problems that cause meetings to fail, you’ll avoid the problems that cause reviews to fail. Be sure that you have an agenda before starting the review. Be certain that you designate a strong review leader. Take notes. Don’t let the review turn into a problem-solving session. The purpose of a formal technical review is to uncover errors, not necessarily to resolve them.

Deliverables:

1. Technical review summary report

2. Issues list

U.2.4 Evaluate review results on a regular basis.

The project team should "debrief" on reviews regularly in an effort to improve the review approach.

Software Quality Assurance

U.3.1 Establish a Software Quality Plan for the project.

U.3.1.1 Assign SQA responsibilities to project staff and/or to a separate SQA organization.

U.3.1.2 Identify software engineering deliverables that will be evaluated for quality.

U.3.1.3 Identify applicable standards, practices, conventions and measurements/metrics that will be adopted by the project team.


U.3.1.4 Define the 'deliverable set' for the project and define criteria for assessing quality for each deliverable.

U.3.1.5 Specify all SQA points throughout the software engineering process.

U.3.1.6 Define quality issues reporting scheme.

U.3.1.7 Define the approach for formal technical reviews.

See Task U.2.

Intent: The intent of this task is to develop a Software Quality Plan that identifies those activities to be conducted to ensure quality in deliverables produced in all CPF activities and software engineering tasks. The plan identifies both technical and management tasks. This task encompasses work performed as part of tasks U.3.2 through U.3.7 and work defined in SPM task U.1.14.

Mechanics: See Pressman, R.S., A Manager's Guide to Software Engineering, McGraw-Hill, 1993, Chapter 14 and IEEE Std. 730.1-1989.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist: none

Do's & Don'ts

Do: Recognize that many of Tasks U.3.1.1 through U.3.1.7 can be conducted once for an organization and then adapted to the specific needs of an individual project.

Do: Address SQA explicitly as part of your planning.

Do: Be sure to assign explicit responsibility for SQA tasks.

Don't: Make SQA overly bureaucratic or people will subvert the process.

Don't: Assume that SQA is the job of the independent SQA group. It's everyone's job.

Helpful Hints

The goal of SQA is to help a project team or an organization as a whole improve its software process. It is for this reason that an independent SQA group (ISQA) and the software engineering team should work together as a team, not as adversaries. It's important to instill this atmosphere.

Deliverables:

1. Software Quality Assurance Plan (may be one section in the Project Plan for small projects)

U.3.2 Establish a quality function deployment (QFD) scheme.

An effective QFD scheme incorporates mechanisms for customer definition of requirements; forward and backward traceability of value throughout the software engineering process and into production use; and quality management that may incorporate the use of measurement and statistical techniques.

U.3.3 Establish validation criteria for the software to be produced or modified.


Once requirements for the work have been determined, the team should identify a set of validation criteria that answers the question, "How would we recognize a successful implementation of the software if it were delivered tomorrow?" These criteria form the basis for test planning and also guide the review process.

U.3.4 Conduct reviews and other SQA activities as defined in Task U.3.1.

U.3.4.1 Collect quality metrics for all deliverables, based on the results of formal technical reviews.

U.3.4.2 Conduct standards compliance audits (if applicable) to ensure that all applicable standards have been met.

U.3.4.3 Initiate management/customer quality briefings to report on quality issues as they are encountered and reconciled.

U.3.4.4 Perform subcontractor monitoring to ensure that the subcontractor complies with the Software Quality Plan.

U.3.4.5 Update the project plan/risk mitigation, monitoring and management plan based on quality issues uncovered in other tasks.

Intent: The intent of this task is to conduct a series of SQA activities as deliverables are produced as part of the APM activities and tasks.

Mechanics: See Pressman, R.S., A Manager's Guide to Software Engineering, McGraw-Hill, 1993, Chapter 14.

Application of Formal Methods: walkthrough or inspection approach

Application of CASE Tools: t.b.d.

SQA Checklist:

1. Is the team recording error data that include type and cause as well as cost to correct?

2. Are audits conducted to ensure compliance with applicable standards and/or with the APM?

3. Have SQA requirements been defined for subcontractors?

4. Are SQA activities conducted according to plan?

5. Does management respond to SQA recommendations in a timely manner?

Do's & Don'ts

Do: Schedule SQA activities as part of the project plan.

Do: Measure quality.

Do: Conduct regular quality briefings to discuss what quality problems are being encountered and ways that quality can be improved.

Helpful Hints

Be sure that you initiate these quality activities as early in the project as possible. Do not wait until implementation activities to begin worrying about quality.

Deliverables:


1. Quality metrics

2. Quality compliance reports

3. Quality improvement recommendations

U.3.5 Audit the software process to ensure that all quality issues have been addressed and (where necessary) corrected.

Audits should be conducted to ensure that all quality issues have been reconciled before transition to the next software engineering (CPF) task.

Software Configuration Management

U.4.1 Select an appropriate set of SCM tasks as part of the SCM plan developed in Task U.1.15.

U.4.1.1 Identify project SCIs and baselines.

U.4.1.2 Define the project database (repository) and the tools required to manage it.

U.4.1.3 Establish how the librarian function is to be accomplished.

U.4.1.4 Define the change control process for the project.

U.4.1.5 Identify change control authority (CCA) responsibility.

U.4.1.6 Establish SCM documentation requirements.

U.4.1.7 Define auditing and reporting requirements.

Intent: The intent of this task is to identify the change management approach to be used.

Mechanics: See Pressman, R.S., A Manager's Guide to Software Engineering, McGraw-Hill, 1993, Chapter 15 and IEEE Std. 828-1990.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist:

1. Have all deliverables been identified as SCIs?

2. Have criteria for baselining been established and are these criteria followed?

3. Can you acquire an SCI from the project database easily?

4. Are changes requested in a uniform manner?

5. Is it possible to get a list of all changes made on a software project within one hour of requesting the list?


Do's & Don'ts

Do: Define the number of SCIs in a way that will enable you to balance the need for specific control against the complexity of large numbers of SCIs.

Don't: Impose baselines too quickly. It makes the process overly bureaucratic.

Don't Assume that a code control tool is the equivalent of a complete SCM approach. It isn't.

Helpful Hints

Be sure that SCM activities begin early in the project. When changes are requested in one framework activity, and these changes will impact deliverables created in an earlier framework activity, it's likely that SCM tasks will be required to control change.

Deliverables: SCM Plan
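The SCM plan elements enumerated in Tasks U.4.1.1 through U.4.1.7 can be captured in a simple structure. The following is a minimal sketch; every field name and value is a hypothetical example, not an APM mandate:

```python
# Illustrative sketch of an SCM plan record covering Tasks U.4.1.1-U.4.1.7.
# All names and values are hypothetical examples, not APM requirements.
scm_plan = {
    "scis": ["Requirements Spec", "Design Model", "Source Code", "Test Plan"],   # U.4.1.1
    "baselines": ["requirements approved", "design reviewed", "code released"],  # U.4.1.1
    "repository": {"database": "project_repo",
                   "tools": ["version control", "build manager"]},               # U.4.1.2
    "librarian": "designated team member controls check-in/check-out",           # U.4.1.3
    "change_control": ["request", "evaluate", "report",
                       "CCA decision", "ECO", "implement"],                      # U.4.1.4
    "cca": "project manager plus customer representative",                       # U.4.1.5
    "documentation": ["change request form", "change report", "ECO"],            # U.4.1.6
    "audits_and_reports": {"audit": "per baseline",
                           "report": "monthly change summary"},                  # U.4.1.7
}
```

Writing the plan down in one place, even informally, makes it easy to verify that each U.4.1 subtask has actually been addressed before the plan is baselined.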

U.4.2 Initiate change control process whenever a change is requested that may affect a baselined deliverable.

U.4.2.1 Accept a change request submitted to the team.

U.4.2.2 Evaluate change request.

U.4.2.3 Generate change report.

U.4.2.4 Evaluate change report (by CCA) to determine whether change should be made.

U.4.2.5 Generate engineering change order (ECO).

U.4.2.6 Queue the change for processing.

U.4.2.7 Make the change.

U.4.2.8 FTR/SQA: Review and audit the change.

U.4.2.9 Release the change.
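The steps above form a simple workflow with two decision points (the CCA evaluation and the FTR/SQA review). A minimal sketch, with hypothetical names (the APM does not prescribe an implementation):

```python
# Sketch of the change-control flow in the U.4.2 subtasks. The function and
# argument names are illustrative assumptions, not part of the APM.
def process_change_request(request, cca_approves, review_passes):
    """Walk one change request through the U.4.2 steps; return its history."""
    history = ["accepted"]                      # accept the request
    history.append("evaluated")                 # evaluate change request
    history.append("change report generated")   # generate change report
    if not cca_approves(request):               # CCA decision point
        history.append("rejected by CCA")
        return history
    history.append("ECO generated")             # engineering change order
    history.append("queued")                    # queue for processing
    history.append("change made")               # make the change
    if review_passes(request):                  # FTR/SQA decision point
        history.append("released")              # release the change
    else:
        history.append("returned for rework")
    return history

# Usage: a change approved by the CCA and passing review ends "released".
trail = process_change_request({"id": 1}, lambda r: True, lambda r: True)
```

Note that a rejected request exits the flow early; no ECO is generated and nothing reaches the release step, which is exactly the traceability the SQA checklist below probes.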

Intent: The intent of this task is to make a change to a software deliverable in a controlled manner.

Mechanics: See Pressman, R.S., A Manager's Guide to Software Engineering,McGraw-Hill, 1993, Chapter 15 and IEEE Std. 828-1990.

Application of Formal Methods: none

Application of CASE Tools: t.b.d.

SQA Checklist:

1. Can each change be traced to a change request?

2. Can each change be traced to a change report?

3. Is each change evaluated by a CCA?

4. Can each change be traced to an ECO?


5. Are tasks similar to Tasks II, III or IV conducted when making the change?

Do's & Don'ts

Do: Streamline the change control process to keep the mean time to make a change as low as possible. To do this, keep CCA decision-making as close as possible to the people making the change.

Don't: Allow customers to circumvent the process.

Don't: Forget that SQA activities are part of the change control process.

Helpful Hints

Timing is everything in change control. Impose it too early and your project schedule suffers. Impose it too late and chaos reigns. Err on the side of imposing it too early, and at the same time work hard to keep the flow through the process moving at a rapid rate.

Deliverables:

1. Change request

2. Change report

3. Engineering change order (ECO)

4. Deliverables (SCIs) that have been changed

U.4.3 Record and report all changes to those individuals with a need to know.

Changes should be recorded by type and disposition. In addition, project-related metrics (e.g., effort, number of people, duration, errors uncovered) should be collected for each change. These data should be published and reported to those with a need to know.
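A change log of this kind is easy to summarize mechanically. The following sketch uses a hypothetical record layout (the field names and sample entries are illustrative, not APM-defined):

```python
from collections import Counter

# Hypothetical change log illustrating Task U.4.3: each change is recorded by
# type and disposition, with project metrics attached for later reporting.
change_log = [
    {"type": "defect fix",  "disposition": "approved", "effort_hours": 6,  "people": 1},
    {"type": "enhancement", "disposition": "approved", "effort_hours": 20, "people": 2},
    {"type": "defect fix",  "disposition": "rejected", "effort_hours": 0,  "people": 0},
]

# Summaries published to those with a need to know.
by_type = Counter(c["type"] for c in change_log)
by_disposition = Counter(c["disposition"] for c in change_log)
total_effort = sum(c["effort_hours"] for c in change_log)
```

Tallies by type and disposition, plus the effort totals, are usually all that a periodic change report needs.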

Document Preparation and Production

The following list of document preparation and production tasks provides a template for a typical APM project. Project managers must select the task set that is appropriate for a specific project.

IMPORTANT NOTE: Document preparation and production is conducted concurrently with other APM activities and tasks. In most cases, information developed as part of a software engineering task is used as 'input' for a documentation task.

U.5.1 Select the appropriate document outline for the software engineering information to be documented.

APM document outlines are presented in another section of this site.


U.5.2 Identify information created during software engineering (CPF) tasks and required for the appropriate document.

APM document outlines specify a variety of information that has likely been created as part of corresponding CPF tasks. The software engineer must identify this information and use the appropriate tools to transfer it to available desktop publishing tools.

U.5.3 Develop additional document content.

Add additional connecting text and supplementary information as specified by the document outline.

U.5.4 Construct the document.

If reuse has been adopted as an organizational objective, the amount of 'new writing' may be kept to a minimum (and the time required for documentation may be minimized) by using reusable document fragments.

U.5.5 Review the completed document for correctness.

Conduct a formal technical review (or less formal review where appropriate) to assess the correctness of the document.

Reusability Management

U.6.1 Adapt a set of reusability criteria and a classification scheme for major project deliverables.

Reusability criteria and classification schemes should not be defined independently for each project. Organizations should establish these criteria and a classification approach for global use. Each team should then adapt the reuse conventions for the project.

U.6.2 Define guidelines for creating reusable software components (data, documents and programs) and train managers and practitioners in their application.

A distinct set of guidelines (developed from the criteria developed in Task U.6.1) should be developed. Managers and practitioners should receive training so that they will be capable of implementing the guidelines.

U.6.3 Define reuse evaluation points (REP) on the project timeline.

Reusability must be evaluated as models are created and deliverables are produced. The team must establish a mechanism for accomplishing this.

U.6.4 Define candidate software components for entry into the reuse library or candidate components for extraction from the library.


As analysis and design models are created, the project team should evaluate data and functional components (or OO classes) to determine whether an existing component is available. For new components, the team must evaluate the component against reuse criteria for potential entry into the library.

U.6.5 Define specific SQA procedures for reusable components, prior to their entry into the library.

When software components have been selected for entry into the library, the project team and an independent SQA group must assess quality rigorously. This task must be performed for all candidate reusable components.

U.6.6 Classify reusable components and enter them into the library.

Using the classification scheme, the project team must define a set of classification parameters so that database searches of the library will uncover the component.

U.6.6.1 Define component by type (i.e., data, document, program) and subtype.

U.6.6.2 Define the component application domain, functional domain and technical domain.

U.6.6.3 Define implementation characteristics.

U.6.6.4 Attach reuse guidelines.

U.6.6.5 Submit for review prior to insertion in reuse library.
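The classification parameters in Tasks U.6.6.1 through U.6.6.4 amount to a searchable record per component. A minimal sketch, assuming an in-memory library (field names and the sample component are hypothetical; an organization would define its own scheme):

```python
from dataclasses import dataclass

# Sketch of the classification parameters in Tasks U.6.6.1-U.6.6.4.
# Field names are illustrative, not an APM-mandated schema.
@dataclass
class ReusableComponent:
    name: str
    component_type: str        # U.6.6.1: data, document, or program
    subtype: str               # U.6.6.1
    application_domain: str    # U.6.6.2
    functional_domain: str     # U.6.6.2
    technical_domain: str      # U.6.6.2
    implementation: str        # U.6.6.3: language, platform, etc.
    reuse_guidelines: str      # U.6.6.4

library = [
    ReusableComponent("date_parser", "program", "utility",
                      "business", "input validation", "batch",
                      "Python 3", "validate locale before reuse"),
]

# A database search keyed on the classification parameters, so that
# searches of the library will uncover the component (U.6.6 intent).
def find(library, **criteria):
    return [c for c in library
            if all(getattr(c, k) == v for k, v in criteria.items())]

hits = find(library, component_type="program", functional_domain="input validation")
```

The point of the scheme is that a query over classification parameters, not a free-text hunt, is what makes the library usable as it grows.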

U.6.7 Define specific validation procedures for reusable components, subsequent to their extraction from the library.

Although reusable components are assumed to be validated prior to entry into the library, local validation within the context of the project scope should be planned and executed for each component.

U.6.8 Produce the Reuse Status Report (RSR).

The Reuse Status Report indicates components that have been reused from the library and reusable components produced by the project team. The RSR is produced at the conclusion of the engineering modeling and implementation activities.

Measurement

U.7.1 Define the catalog of project, product, and process metrics that are to be collected.

The catalog of metrics defines each aspect of the project, the product and the process that is to be measured. Measurement criteria, measurement points, and the analysis to be applied for each metric are also defined.

U.7.1.1 List all project, product and process metrics.

- schedule compliance metrics

- effort metrics


- quality metrics

- software size metrics

U.7.1.2 Define measurement points on project timeline.

U.7.1.3 Define metrics collection mechanisms.

U.7.1.4 Define metrics reporting mechanisms.
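Tasks U.7.1.1 through U.7.1.4 together define, for each metric, what is measured, when, how it is collected, and how it is reported. A sketch of such a catalog entry (all metric names, points, and mechanisms here are hypothetical examples):

```python
# Hypothetical metrics catalog for Task U.7.1: each entry records the metric,
# its measurement point (U.7.1.2), collection mechanism (U.7.1.3), and
# reporting mechanism (U.7.1.4).
metrics_catalog = [
    {"name": "schedule compliance", "kind": "project",
     "measurement_point": "end of each framework activity",
     "collection": "project tracking tool",
     "reporting": "monthly status report"},
    {"name": "defect density", "kind": "quality",
     "measurement_point": "after each FTR",
     "collection": "review summary forms",
     "reporting": "quality compliance report"},
    {"name": "software size", "kind": "product",
     "measurement_point": "end of implementation",
     "collection": "line-of-code / function point count",
     "reporting": "project closeout report"},
]

kinds = {m["kind"] for m in metrics_catalog}
```

Keeping the four attributes together per metric prevents the common failure mode of defining metrics that are never actually collected or reported.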

U.7.2 Define approach for analyzing metrics that have been collected.

Metrics are analyzed at two levels. At the project level, metrics help the team improve its local process and the product; at the organizational level, they are used to define incremental improvements to the APM itself.

U.7.3 Collect metrics for the project, product, and process.

Metrics collection is an ongoing activity. A set of collection forms can be developed for this purpose.

U.7.4 Analyze metrics for the project, product and process.

Different metrics are analyzed by different people for different reasons. Project metrics are analyzed by the project manager to assess the efficacy of resource loading and schedule compliance. Product metrics are evaluated by both managers and practitioners. The manager assesses the overall quality of the deliverables that are produced. Practitioners use product metrics as a real-time quantitative check on the quality of their engineering work. Process metrics are used by managers to define incremental improvements in the APM.

U.7.5 Report analysis findings.

The results of metrics analysis are reported to a targeted audience that includes business management, project management, and technologists. It is important to provide some training for all those who receive software metrics reports.

Risk Management

U.8.1 Define technology and project risks.

Intent: The intent of this task is to review technology and project risks defined during the scoping phase of the project.

Mechanics: The project team (and customer) should meet to develop a list of risks for the project. Initially, no risks are rejected, no matter how remote.

Application of Formal Methods: Risk analysis is a formal procedure and is described in many of the risk analysis resources presented at this site.

Application of CASE Tools: t.b.d.


SQA Checklist: none

Do's & Don'ts

Do: Encourage participation by all team members.

Don't: Discount any risk, no matter how remote.

Don't: Specify a risk with a 100% likelihood of occurrence. It isn't a risk; it's a project constraint.

Helpful Hints

The best way to approach risk definition is to answer the question: "What can go wrong during this project?"

Deliverables: List of risks

U.8.2 Identify project risks associated with scope.

Intent: The intent of this task is to perform a formal risk assessment for the project. Both generic and technology specific risks are identified. In addition, the "cultural risks" associated with technology change may also be specified during this task.

Mechanics: Use risk checklists as well as risks suggested by the customer, the developer, and potential users.

Application of Formal Methods: risk identification checklists

Application of CASE Tools: t.b.d.

Do's & Don'ts

Do: List as many risks as possible. During this task, it's useful to have a pessimistic outlook.

Don't: Discard any risk, regardless of how far-fetched it might seem. [Later, low priority risks will be discarded].

Helpful Hints

1. Use risk checklists.

2. An important category of risk is the set of risks associated with reengineering legacy systems. Key areas of concern are interoperability and integration risks.

Deliverables: Categorized list of risks

U.8.3 Estimate the probability of occurrence for each risk.

Intent: The intent of this task is to estimate the probability of occurrence of each of the generic and technology-specific risks.

Mechanics: Risk probability is determined from past experience. All interested constituencies are polled to estimate the probability for each risk.


Application of Formal Methods: see Risk Assessment references

Application of CASE Tools: t.b.d.

Do's & Don'ts

Do: Express probability using either a quantitative estimate (e.g., 70% probable) or, if there is less certainty, a qualitative scale (e.g., high, medium, low).

Don't: Discard any risk, even if its probability is quite low. [Later, low priority risks will be discarded].

Helpful Hints

In determining probability, use a percentage scale defined in 10% increments. A high probability risk is greater than 80% probable. A low probability risk is less than 30% probable. Information developed in Tasks U.8.1 through U.8.4 can be placed in a risk table.
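The banding suggested in this hint can be written down as a small helper. The sketch below encodes only what the hint states (10% increments; greater than 80% is high, less than 30% is low); labeling everything in between "medium" is an assumption:

```python
# Banding from the Helpful Hint: a 0-100% scale in 10% increments,
# high above 80%, low below 30%. "medium" for the remainder is an
# assumption, not stated by the APM.
def probability_band(percent):
    if percent % 10 != 0 or not 0 <= percent <= 100:
        raise ValueError("use a 0-100 scale in 10% increments")
    if percent > 80:
        return "high"
    if percent < 30:
        return "low"
    return "medium"
```

Forcing estimates onto a coarse 10% grid is deliberate: it discourages false precision when probabilities are really polled judgments rather than measurements.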

Deliverables: Categorized list of risks with probabilities attached

U.8.4 Estimate the project impact of each risk, should it occur.

Intent: The intent of this task is to estimate the impact on project effort, schedule and budget that will result if one or more generic or technology-specific risks should occur.

Mechanics: Risk impact is determined from past experience. All interested constituencies are polled to estimate the impact for each risk.

Application of Formal Methods: see Risk Assessment references

Application of CASE Tools: t.b.d.

Do's & Don'ts

Do: Express impact in terms of project planning parameters, i.e., impact on schedule, impact on effort required to complete the project, impact on project budget.

Don't: Discard any risk, even if its impact is quite low. [Later, low priority risks will be discarded].

Helpful Hints

1. See Risk Assessment references

2. Ideally, impact should be quantitative and expressed in terms of time, effort, or dollars. Alternatively, an impact scale (e.g., 1 to 5) can be used to indicate relative severity of impact.

Deliverables: Categorized list of risks with probabilities and impacts attached

U.8.5 Develop a list of prioritized technology risks.

Intent: The intent of this task is to sort the list created in Tasks U.8.2 through U.8.4 by probability, then by impact.

Mechanics: Sort the list, define a "priority cut-off," and discard those risks that fall below it.
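The sort-and-cut mechanics can be sketched in a few lines over the risk table built in the preceding tasks. The risks, scale, and cut-off value below are hypothetical; only the procedure (sort by probability then impact, discard below a priority cut-off) comes from this task:

```python
# Sketch of the U.8.5 mechanics over a hypothetical risk table
# (probabilities on a 0-1 scale, impact on a 1-5 scale per the U.8.4 hint).
risks = [
    {"name": "key staff turnover",      "probability": 0.6, "impact": 4},
    {"name": "vendor tool slips",       "probability": 0.3, "impact": 2},
    {"name": "requirements volatility", "probability": 0.8, "impact": 5},
    {"name": "hardware unavailable",    "probability": 0.1, "impact": 1},
]

# Sort descending by probability, then by impact.
risks.sort(key=lambda r: (r["probability"], r["impact"]), reverse=True)

# Discard risks that fall below the priority cut-off. Using the product of
# probability and impact as the cut-off measure is an assumption.
CUTOFF = 0.8
managed = [r for r in risks if r["probability"] * r["impact"] >= CUTOFF]
```

Whatever cut-off measure is chosen, the "gut feel" caveat below still applies: a risk just under the line can be kept on the list by judgment.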


Application of Formal Methods: see Risk Assessment references

Application of CASE Tools: t.b.d.

Do's & Don'ts

Do: Establish a risk cut-off level for project impact and probability. The cut-off level is the point at which the probability and/or impact is too low to warrant serious concern.

Don't: Disregard "gut feel." Even if a particular risk falls below the cut-off, it may be judicious to leave it on the list.

Helpful Hints

Worrying about every conceivable risk will result in a diluted approach to risk management. Develop the risk cut-off and focus on the risks that fall above the line.

Deliverables: Adjusted list of risks with probabilities and impacts attached

U.8.6 Indicate a plan for technology risk mitigation, monitoring and management (i.e., a contingency plan) and update project planning information to reflect risks.

U.8.7 Review risks with customer. Risk analysis is a formal procedure and is described in many of the risk analysis resources presented at this site.

U.8.8 Revise project plan, if required.

Intent: The intent of these tasks is to create a risk mitigation, monitoring and management plan (RM3P) that is appropriate in size and detail to the project category. An RM3P should identify the way in which the developer will avoid risks, monitor each of the factors that will cause a risk to become real, and define how the risk will be handled if it should occur. In addition, risks are reviewed with the customer, and the Project Plan developed in Task U.1 may be updated to reflect risks.

Mechanics: Concrete steps for mitigating and monitoring each risk are indicated and contingency planning is specified. The risks and contingency plan are presented to the customer for review and comment. If required, the project plan is modified to reflect all risks.

Application of Formal Methods: see risk analysis resources

Application of CASE Tools: t.b.d.

Do's & Don'ts

Do: Be as specific as possible in indicating how risks will be monitored and what (specifically) will be done should one or more risks occur. Be prepared!

Do: Present risks to both management and the customer. Insist that a "go-no-go" decision be made, given the potential impact of the risks.

Don't: Think that it's necessary to write a voluminous document to satisfy the intent of this task. For casual or semi-formal projects (and for many quick reaction projects), the RM3P may be the minutes of a brief meeting that considers risk.

Deliverables:

1. Risk Mitigation, Monitoring and Management Plan


2. Customer comments on risk

3. Revised Software Project Plan, if required.