Evolving a Measurement Program for Systems and Software Engineering Process Improvement
CMMI 2003
Denver, Colorado

Page 2: Welcome

Willkommen
Bienvenido
Welkom
Bienvenue
Benvenuto
Välkommen
Tervetuloa
Witamy
Huan Ying
ΚΑΛΩΣ ΟΡΙΣΑΤΕ
ようこそ

Page 3: Measurement and Analysis

Page 4: Measurement and Analysis Overview

A measurement initiative involves the following:
- Specifying the objectives of measurement and analysis such that they are aligned with established information needs and business objectives
- Defining the measures to be used, the data collection process, the storage mechanisms, the analysis processes, the reporting processes, and the feedback processes
- Implementing the collection, storage, analysis, and presentation of the data
- Providing objective results that can be used in making business judgments and taking appropriate corrective actions

Page 5: Establish Measurement Objectives

Measurement objectives:
- Document the purposes for which measurement and analysis is done (GQM; see the sketch below)
- Specify the kinds of actions that may be taken based on the results of the data analyses
- Continually ask what value this measurement will have for the people who will be asked to supply the raw measurement data and who will receive the analyzed results

Involve the end users of the measurement and analysis results in setting measurement objectives whenever possible

Maintain traceability of the proposed measurement objectives to the information needs and business objectives
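As a sketch of the Goal-Question-Metric (GQM) idea referenced above, the mapping from a measurement objective down to concrete measures can be captured as a simple data structure. The goal, questions, and metrics below are illustrative assumptions, not content from the presentation.

```python
# Minimal, hypothetical GQM record: one goal, the questions that probe it,
# and the measures that answer each question. All names are illustrative only.
gqm_example = {
    "goal": "Improve delivered quality of Release 2.0",
    "questions": {
        "How many defects escape to the customer?": [
            "post-release defect count (by severity)",
            "defect density (defects per KLOC)",
        ],
        "Is rework shrinking over time?": [
            "rework hours as a percentage of total project hours",
        ],
    },
}

def metrics_for(goal_record):
    """Flatten the GQM tree into the list of measures to collect."""
    return [m for metrics in goal_record["questions"].values() for m in metrics]

if __name__ == "__main__":
    print(metrics_for(gqm_example))
```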

Page 6: Specify Measures

Identify candidate measures based on documented objectives and refine them into precisely defined, quantifiable measures

Define how data can and will be derived from other measures

Examples of commonly used derived measures (see the sketch below):
- Defect density
- Peer review coverage
- Reliability measures
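To make the notion of a derived measure concrete, here is a minimal sketch (not from the slides) showing how base measures such as defect counts, size, and items reviewed combine into two of the derived measures named above; the parameter names and the peer review coverage formula are assumptions.

```python
def defect_density(defects_found: int, size_kloc: float) -> float:
    """Derived measure: defects per thousand lines of code."""
    return defects_found / size_kloc

def peer_review_coverage(items_reviewed: int, items_total: int) -> float:
    """Derived measure: fraction of work products that went through peer review."""
    return items_reviewed / items_total

# Example base measures (illustrative values only)
print(defect_density(defects_found=42, size_kloc=12.5))        # about 3.4 defects/KLOC
print(peer_review_coverage(items_reviewed=18, items_total=24)) # 0.75
```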

Page 7: Specify Data Collection and Storage Procedures

- Specify how to collect and store the data for each required measure
- Make explicit specifications of how, where, and when the data will be collected (see the sketch below)
- Develop procedures for ensuring that the collected data is valid
- Ensure that the data is stored such that it can easily be accessed, retrieved, and restored as needed
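One lightweight way to honor the "how, where, and when" of an operational definition is to store each observation with that context attached. The record layout and CSV-based store below are assumptions for illustration, not the presenter's tooling.

```python
import csv
from dataclasses import dataclass, asdict, fields
from datetime import datetime, timezone
from pathlib import Path

@dataclass
class MeasurementRecord:
    measure: str            # which defined measure this value belongs to
    value: float
    unit: str
    collected_at: str       # when it was collected (ISO 8601 timestamp)
    collection_point: str   # where in the process it was collected
    collected_by: str       # how / by whom it was collected

def store(records, path="measurements.csv"):
    """Append records to a simple CSV repository, writing a header on first use."""
    new_file = not Path(path).exists()
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(MeasurementRecord)])
        if new_file:
            writer.writeheader()
        writer.writerows(asdict(r) for r in records)

store([MeasurementRecord("defect_density", 3.4, "defects/KLOC",
                         datetime.now(timezone.utc).isoformat(),
                         "post system test", "QA lead")])
```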

Page 8: Specify Analysis Procedures

Define the analysis procedures in advance

Ensure that the results that will be fed back are understandable and easily interpretable
- Collecting data merely to show an assessor is worthless
- Showing how the data can be used to manage and control the project is what counts

Page 9: Collect Measurement Data

- Collect the measurement data as defined, at the agreed points in the process, according to the established time scale
- Generate data for derived measures
- Perform integrity checks as close to the source of the data as possible (see the sketch below)
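The integrity checks mentioned above can often be automated at the point of entry. The rules below (known measure names, non-negative values, timestamps not in the future) are assumed examples of such checks, not a prescribed set.

```python
from datetime import datetime, timezone

KNOWN_MEASURES = {"defect_density", "effort_hours", "review_coverage"}  # assumed catalog

def integrity_errors(record: dict) -> list:
    """Return a list of problems found in one raw data record (empty list = clean)."""
    errors = []
    if record.get("measure") not in KNOWN_MEASURES:
        errors.append(f"unknown measure: {record.get('measure')!r}")
    if not isinstance(record.get("value"), (int, float)) or record["value"] < 0:
        errors.append("value must be a non-negative number")
    collected = datetime.fromisoformat(record["collected_at"])
    if collected > datetime.now(timezone.utc):
        errors.append("collection timestamp lies in the future")
    return errors

print(integrity_errors({"measure": "effort_hours", "value": -3,
                        "collected_at": "2003-06-01T10:00:00+00:00"}))
```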

Page 10: Analyze the Measurement Data

- Conduct the initial analyses
- Interpret the results and draw preliminary conclusions from explicitly stated criteria
- Conduct additional measurement and analysis passes as necessary to gain confidence in the results
- Review the initial results with all stakeholders; this prevents misunderstandings and rework
- Improve measurement definitions, data collection procedures, and analysis techniques as needed to ensure meaningful results that support business objectives

Page 11: Store the Measurement Data and Analysis Results

The stored information should contain or reference the information needed to:
- Understand the measures
- Assess them for reasonableness and applicability

The stored information should also:
- Enable the timely and cost-effective future use of the historical data and results
- Provide sufficient context for interpretation of the data, measurement criteria, and analysis results

Page 12: Communicate the Measurement Results

- Keep the relevant stakeholders up to date on measurement results in a timely manner
- Follow up with those who need to know the results; this increases the likelihood that the reports will be used
- Assist the relevant stakeholders in understanding and interpreting the measurement results

Page 13: Measurement and Analysis Group

Consider creating a measurement group that is responsible for supporting the Measurement and Analysis activities of multiple projects

Page 14: Measurement and Analysis Tools

Incorporate tools used in performing Measurement and Analysis activities, such as:
- Statistical packages
- Database packages
- Spreadsheet programs
- Graphical packages
- Packages that support data collection over networks and the Internet

Page 15: Measurement and Analysis Training

Provide training to all people who will perform or support the Measurement and Analysis process:
- Data collection, analysis, and reporting processes
- Measurement tools
- The Goal-Question-Metric paradigm
- How to establish measures
- How to determine efficiency and effectiveness
- Quality factor measures (e.g., maintainability, expandability)
- Basic and advanced statistical techniques

Page 16: Basic Measures

Page 17: Basic Measures

Estimate size and/or complexity; a relative level of difficulty or complexity should be assigned for each size attribute

Examples of attributes to estimate for Systems Engineering include:
- Number of logic gates
- Number of interfaces

Examples of size measurements for Software Engineering include:
- Function points
- Lines of code
- Number of requirements

Page 18: Basic Measures - 2

- Determine effort and cost using historical data or models
- Establish the project's schedule based on the size and complexity estimates
- Include, or at least consider, infrastructure needs such as critical computer resources
- Identify risks associated with the cost, resources, schedule, and technical aspects of the project
- Control the data (various forms of documentation) required to support the project in all of its areas

Page 19: Basic Measures - 3

Identify the knowledge and skills needed to perform the project according to the estimates

Select and implement methods for providing the necessary knowledge and skills:
- Training (internal and external)
- Mentoring
- Coaching
- On-the-job application of learned skills

Monitor staffing needs based on the effort required and the knowledge and skills necessary to achieve the defined tasks

Page 20: Basic Measures - 4

Involve relevant stakeholders throughout the product lifecycle

Track technical performance: completion of activities and milestones against the schedule
- Example: product components designed, constructed, unit tested, and integrated
- Compare actual milestones completed against established commitments

Monitor commitments and critical dependencies against those documented in the project plan

Page 21: Basic Measures - 5

Track quality: problems/defects (open/closed, by product/activity)
- Problems and defects are direct contributors to the amount of rework that must be performed, a significant cost factor in development and maintenance

Page 22: Effectiveness of Processes

Page 23: Effectiveness of Processes

In addition to defining the processes that we wish to follow on our project, we need to ensure that we are following them, and we should be able to determine whether the processes are working for us the way we expected them to. How well are the processes working?

Page 24: Requirements Management – Effectiveness Example

Number of change requests per month compared with the original number of requirements for the project (see the sketch below):
- Critical change requests
- Intermediate change requests
- Nice-to-have change requests

Time spent on change requests up until a yes/no decision is given by the Senior Contract group

Number and size of critical change requests that arise after the requirements phase has been completed
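A sketch of how the first effectiveness indicator above might be computed from raw change-request logs; the category names and data layout are assumptions for illustration.

```python
from collections import Counter

def change_request_rate(change_requests, baseline_requirement_count, months):
    """
    change_requests: iterable of (month, category) tuples, e.g. ("2003-04", "critical").
    Returns change requests per month as a fraction of the original requirement count,
    plus a per-category breakdown.
    """
    by_category = Counter(cat for _, cat in change_requests)
    per_month = len(change_requests) / months
    return per_month / baseline_requirement_count, by_category

rate, breakdown = change_request_rate(
    [("2003-04", "critical"), ("2003-04", "nice-to-have"), ("2003-05", "intermediate")],
    baseline_requirement_count=120, months=2)
print(rate, breakdown)
```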

Page 25: Set of Standard Organizational Processes

Page 26: Importance of an Organizational View of Processes

- Builds a common vocabulary
- Allows others to anticipate behavior and be more proactive in their interactions
- Allows the organization to measure a controlled set of processes to gain economies of scale
- Trends can be seen and predictability can be achieved
- Process performance baselines can be developed to support quantitative management later

Page 27: Organizational Measurement Repository

Develop an organizational measurement repository that includes:
- Product and process measures that are related to the organization's set of standard processes
- The related information needed to understand and interpret the measurement data and assess it for reasonableness and applicability

Develop operational definitions for the measures that specify the point in the process where the data will be collected and the procedures for collecting valid data

Page 28: Slightly More Advanced Measures

Page 29: Peer Reviews

Page 30: Peer Review Measures

- Optimum checking rate (e.g., number of pages to be checked per hour)
- Logging rate (e.g., number of major and minor defects logged per hour)
- Number of major and minor defects
- Effectiveness: number of major defects found in this stage compared to the total number of defects found so far (see the sketch below)
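As an illustration of the peer review measures listed above, here is a minimal calculation sketch; the parameter names and sample values are assumptions, not the presenter's tooling.

```python
def checking_rate(pages_checked: float, prep_hours: float) -> float:
    """Pages checked per hour of individual preparation."""
    return pages_checked / prep_hours

def logging_rate(defects_logged: int, meeting_hours: float) -> float:
    """Major and minor defects logged per hour of logging meeting."""
    return defects_logged / meeting_hours

def review_effectiveness(majors_found_here: int, total_defects_so_far: int) -> float:
    """Share of all defects found to date that this review stage caught."""
    return majors_found_here / total_defects_so_far

print(checking_rate(20, 2.0), logging_rate(15, 1.5), review_effectiveness(6, 10))
```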

Page 31: Testing

Page 32: Defects Discovered During Testing

- Effectiveness: number of major defects found in a particular testing phase (or instantiation of a testing phase) compared to the total number of defects found during testing
- Number of defects projected to escape from the current testing phase (see the sketch below)
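One simple way to approximate the second bullet is to apply a historical detection effectiveness for the phase to the defects found so far. This ratio model is an assumption used for illustration only; the slides do not prescribe a projection method.

```python
def projected_escapes(defects_found_in_phase: int,
                      historical_phase_effectiveness: float) -> float:
    """
    If the phase historically finds, say, 70% of the defects present, the defects
    found so far imply an estimated total, and the remainder is projected to escape.
    """
    estimated_total = defects_found_in_phase / historical_phase_effectiveness
    return estimated_total - defects_found_in_phase

# e.g. 35 defects found in system test, which historically catches 70% of what is present
print(projected_escapes(35, 0.70))  # -> 15 defects projected to escape
```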

Page 33: Test Coverage

- Statement coverage measures whether each executable statement is encountered
- Block coverage is the same as statement coverage, except that the unit of code measured is each sequence of non-branching statements
- Decision coverage measures whether Boolean expressions tested in control structures (such as if-statements and while-statements) evaluate to both true and false
- Condition coverage measures the true or false outcome of each Boolean sub-expression (see the sketch below)
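To make the difference between these criteria concrete, here is a small invented function with test inputs chosen per criterion; the function and the inputs are assumptions for illustration and are not from the presentation.

```python
def grant_discount(is_member: bool, total: float) -> bool:
    """Toy example: one decision made of two conditions."""
    if is_member and total > 100:
        return True
    return False

# Statement coverage: (True, 150) executes the True branch, (False, 50) the final return.
# Decision coverage: the same two calls make the whole if-expression both True and False.
# Condition coverage: each sub-expression must be seen True and False:
#   is_member   -> True in (True, 150), False in (False, 50)
#   total > 100 -> True in (True, 150), False in (True, 50)
# (Python short-circuits 'and', so (True, 50) is needed to exercise total > 100 as False.)
for args in [(True, 150), (False, 50), (True, 50)]:
    print(args, grant_discount(*args))
```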

Page 34: Test Coverage - 2

Function coverage
- Measures whether you invoked each function or procedure

Call coverage
- Measures whether you executed each function call
- Based on the idea that faults commonly occur in interfaces between modules

Loop coverage
- Measures whether you executed each loop body zero times, exactly once, and more than once

Relational operator coverage
- Measures whether boundary situations occur with relational operators (<, <=, >, >=)

Page 35: Risk Management

Page 36: Prioritizing Risks

[Figure: three-dimensional risk prioritization grid with axes Impact, Probability, and Time]

Priorities are established by merging the available information into a three-dimensional table

Page 37: Prioritizing - 2

Risk level by Impact (rows) and Probability (columns):

Impact         Probable   Possible   Unlikely
Catastrophic   Top        High       Serious
Critical       High       Serious    Medium
Marginal       Serious    Medium     Low
Negligible     Medium     Low        Bottom
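The impact/probability matrix above can be encoded directly as a lookup table. The dictionary below simply mirrors the table as reconstructed from the slide and is only a sketch.

```python
# Risk level keyed by (impact, probability), mirroring the matrix above.
RISK_LEVEL = {
    ("Catastrophic", "Probable"): "Top",    ("Catastrophic", "Possible"): "High",
    ("Catastrophic", "Unlikely"): "Serious",
    ("Critical", "Probable"): "High",       ("Critical", "Possible"): "Serious",
    ("Critical", "Unlikely"): "Medium",
    ("Marginal", "Probable"): "Serious",    ("Marginal", "Possible"): "Medium",
    ("Marginal", "Unlikely"): "Low",
    ("Negligible", "Probable"): "Medium",   ("Negligible", "Possible"): "Low",
    ("Negligible", "Unlikely"): "Bottom",
}

print(RISK_LEVEL[("Critical", "Possible")])  # -> "Serious"
```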

Page 38: Prioritizing - 3

Risk priority by risk level (rows) and time frame (columns):

Level     Immediate  Short  Medium  Long
Top       1          2      3       4
High      2          3      4       5
Serious   3          4      5       6
Medium    4          5      6       7
Low       5          6      7       8
Bottom    6          7      8       9
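Because the table above follows a regular pattern, the priority can also be computed directly. This sketch assumes the table as reconstructed from the slide and that a smaller number means the risk is handled sooner.

```python
LEVELS = ["Top", "High", "Serious", "Medium", "Low", "Bottom"]
TIMEFRAMES = ["Immediate", "Short", "Medium", "Long"]

def risk_priority(level: str, timeframe: str) -> int:
    """Priority as in the table above: row index + column index + 1."""
    return LEVELS.index(level) + TIMEFRAMES.index(timeframe) + 1

print(risk_priority("Top", "Immediate"), risk_priority("Bottom", "Long"))  # 1 9
```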

Page 39: Quantitative Project Management

Page 40: Quantitative Management Concepts

Quantitative management is tied to the organization's strategic goals for product quality, service quality, and process performance

When higher degrees of quality and performance are demanded, the organization and its projects must determine whether they have the ability to improve the necessary processes to satisfy the increased demands

Page 41: Quantitative Management Concepts - 2

Achieving the necessary quality and process performance objectives requires stabilizing the processes that contribute most to the achievement of the objectives:
- Cost
- Schedule
- Performance
- Quality

Assuming the technical requirements can be met, the next decision is to determine whether doing so is cost effective

Reducing process variation is an important aspect of quantitative management

Page 42: Establish the Organization's Process Performance Baseline

Process performance is a measure of the actual results achieved by following a process

Measure the organization's process performance for the organization's set of standard processes at various levels, including:
- Individual processes (e.g., a test case inspection element)
- Sequences of connected processes
- Processes for the complete lifecycle
- Processes for developing individual work products

Page 43: Establish the Organization's Process Performance Baseline - 2

There may be several process performance baselines to characterize performance for subgroups of the organization. Examples include:
- Product line
- Application domain
- Complexity
- Team size
- Work product size

Page 44: Steps to Establish the Project's Quality and Process Performance Objectives

Identify the quality and process performance needs and priorities of the customer, end users, and other relevant stakeholders:
- Reliability
- Maintainability and expandability
- Usability
- Development cycle time

Determine how quality and process performance will be measured

Page 45: Steps to Establish the Project's Quality and Process Performance Objectives - 2

Define and document measurable quality and process performance objectives for the project

Examples of quality objectives:
- Mean time between failures
- Critical resource utilization

Examples of process performance objectives:
- Percentage of defects removed by type of verification activity
- Defect escape rates
- Number and density of defects (by severity) found during the first year following product delivery
- Rework time as a percentage of total project lifecycle time

Page 46: Steps to Establish the Project's Quality and Process Performance Objectives - 3

Establish objectives for each lifecycle stage to track progress and predict achievement of the project's objectives

Establish criteria to identify which subprocesses are the main contributors to achieving the identified quality and process performance objectives and for which predictable performance is important

Page 47: Measures and Analytic Techniques

Page 48: Select Measures and Analytic Techniques

Identify the measures that are appropriate for statistical management

Examples of subprocess control measures include:
- Ratios of estimated to measured values of the planning parameters
- Coverage and efficiency of work product inspections
- Test coverage and efficiency
- Reliability, maintainability, and expandability
- Percentage of the total defects inserted or found in the different stages of the lifecycle

Page 49: Select Measures and Analytic Techniques - 3

- Specify the operational definitions of the measures, their collection points in the subprocesses, and how the measures will be validated
- State specific target measures or ranges to be met for each measured attribute of each selected process
- Set up the organizational support environment to support the collection and analysis of statistical measures
- Identify the appropriate statistical and quantitative management analysis techniques that are expected to be useful in statistically managing the selected subprocesses

Page 50: Quantitative Tools

Quantitative tools considered applicable to quantitative management or statistical process control:
- Scatter diagrams
- Run charts
- Cause-and-effect (fishbone) diagrams
- Histograms
- Bar charts
- Pareto charts
- Control charts

Page 51: Control Charts

Control charts are techniques for quantifying process behavior. A control chart:
- Allows a project team to monitor, control, and improve process performance over time by studying variation and its sources
- Distinguishes special (assignable) causes of variation from common causes of variation as a guide for management decision making
- Serves as a tool for ongoing control of a process
- Helps improve a process so that it performs consistently and predictably, for higher quality, lower cost, and higher effective capacity
- Defines the "Voice of the Process"

Brassard, Michael & Ritter, Diane, The Memory Jogger II, 1994

Page 52: Control Charts - 2

- Control charts are most effective when they are used within the broader context of established goals and the activities performed to achieve those goals
- Classical control charts have a centerline and control limits on both sides of the centerline
- Both the centerline and the limits represent estimates that are calculated from a set of observations collected while the process is running (see the sketch below)
- The centerline and control limits cannot be assigned arbitrarily, as they are intended to show what the process can actually do

Florac, W.A. & Carleton, A.D., Measuring the Software Process, Addison-Wesley, 1999
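To show how the centerline and limits are calculated from observations rather than assigned, here is a minimal individuals/moving-range (XmR) sketch. The choice of an XmR chart and the sample data are assumptions, since the slides do not specify a chart type.

```python
def xmr_limits(observations):
    """Centerline and natural process limits for an individuals (XmR) chart."""
    center = sum(observations) / len(observations)
    moving_ranges = [abs(b - a) for a, b in zip(observations, observations[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR constant (3 / d2, with d2 = 1.128 for ranges of size 2)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar  # LCL, CL, UCL

# e.g. defects found per weekly build (illustrative data)
lcl, cl, ucl = xmr_limits([12, 9, 14, 11, 10, 13, 15, 9, 12, 11])
print(f"LCL={lcl:.1f}  CL={cl:.1f}  UCL={ucl:.1f}")
```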

Page 53: Process Control Chart (annotated example)

[Figure: example process control chart plotting numerical data taken in time sequence]
- The Upper Control Limit (UCL) and Lower Control Limit (LCL) represent the natural variation in the process
- The Center Line (CL) is the mean of the data used to set up the chart
- Plotted points are either individual measurements or the means of small groups of measurements
- A point above or below the control limits suggests that the measurement has a special, preventable or removable, cause
- The chart is used for continuous and timely control of the process and prevention of causes
- The chart is analyzed using standard rules to determine the control status of the process

Burr, Adrian & Owen, Mal, Statistical Methods for Software Quality, 1996

Page 54: Histograms

Histograms:
- Allow a project team to take measurement data and display the distribution of the observed values
- Show the frequencies of events that have occurred in ways that make it easy to compare distributions and see central tendencies
- Quickly illustrate the underlying distribution of the data
- Help indicate whether there has been a change in the process

Florac, W.A. & Carleton, A.D., Measuring the Software Process, Addison-Wesley, 1999

Page 55: Histograms - 2

Histograms (continued):
- Provide useful information for predicting future performance of the process
- Help answer the question "Is the process capable of meeting my customers' requirements?"
- Are created by grouping the results into "cells" and then counting the number of observations in each cell (see the sketch below)

Florac, W.A. & Carleton, A.D., Measuring the Software Process, Addison-Wesley, 1999
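Grouping results into cells and counting, as described above, can be sketched in a few lines. The cell width and the staff-hour data below are illustrative assumptions, loosely echoing the example chart on the next slide.

```python
from collections import Counter

def histogram(values, cell_width):
    """Group values into fixed-width cells and count how many fall in each."""
    cells = Counter((v // cell_width) * cell_width for v in values)
    return dict(sorted(cells.items()))

staff_hours = [34, 36, 37, 40, 41, 41, 43, 44, 46, 47, 50, 52]  # illustrative data
print(histogram(staff_hours, cell_width=4))
# -> {32: 1, 36: 2, 40: 4, 44: 3, 48: 1, 52: 1}
```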

Page 56: Histograms - 3

[Figure: example histogram of Product/Service Staff Hours (x-axis, roughly 32 to 56) versus Number of Days (y-axis, 0 to 20)]

Page 57: Applying Statistical Methods to Understand Variation

Page 58: Understanding Variation

Understanding variation is achieved by collecting and analyzing process and product measures so that special causes of variation can be identified and addressed to achieve predictable performance

All characteristics of processes and products display variation when measured over time

Variation may be due to:
- Special or "assignable" causes of variation
- Natural or common causes of variation

Page 59: Summary

Evolving a measurement program for systems and software engineering process improvement includes:
- Clearly defining the need for a measurement program
- Establishing a measurement initiative with objectives that are aligned with established information needs and business objectives
- Ensuring basic measures are included for planning, tracking, and taking corrective action as necessary
- Incorporating process effectiveness measures
- Establishing organizational standard processes

Page 60: Summary - 2

- Establish and utilize measures such as peer review measures, testing measures, and risk management measures
- Evolve into project management based on a quantitative understanding of the organization's and the project's defined processes

Page 61: Thank You

Willkommen
Bienvenido
Welkom
Bienvenue
Benvenuto
Välkommen
Tervetuloa
Witamy
Huan Ying
ΚΑΛΩΣ ΟΡΙΣΑΤΕ
ようこそ

Page 62: Kasse Initiatives Contact Information

Kasse Initiatives LLC
PMB 293
1900 Preston Road, Suite 267
Plano, Texas 75093
972-987-7601 Office
972-987-7607 FAX
www.kasseinitiatives.com