Thomas L. Gilchrist [email protected] Testing Basics Set 4: Strategies & Metrics By Thomas L. Gilchrist, 2009


Page 1

Testing Basics

Set 4: Strategies & Metrics

By Thomas L. Gilchrist, 2009

Page 2

What Are the Tradeoffs?

…changes in one require compensation in others...

Page 3

Tradeoffs

• Schedule
• Cost/Resources
• Scope
• Technology
• Quality / Critical Success Factors

Page 4

How much testing?

…risk!...

Page 5

Testing and Risk

[Chart: the degree of testing increases as the degree of risk increases.]
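The risk-vs-testing relationship above can be sketched as a simple prioritization: score each feature by likelihood and impact, and spend more testing effort where risk exposure is highest. The feature names and scores below are hypothetical, and the proportional-allocation scheme is one common approach rather than anything the slides prescribe.

```python
# Risk-based test prioritization sketch. Feature names and scores are
# hypothetical. Risk exposure = likelihood x impact; features with
# higher exposure get a larger share of the testing budget.

features = {
    "payment processing": {"likelihood": 4, "impact": 5},
    "report formatting":  {"likelihood": 2, "impact": 1},
    "user login":         {"likelihood": 3, "impact": 4},
}

def risk_exposure(scores):
    return scores["likelihood"] * scores["impact"]

def allocate_test_hours(features, total_hours):
    """Split a testing budget in proportion to each feature's risk exposure."""
    total_risk = sum(risk_exposure(s) for s in features.values())
    return {name: total_hours * risk_exposure(s) / total_risk
            for name, s in features.items()}

hours = allocate_test_hours(features, total_hours=100)
for name, h in sorted(hours.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {h:.1f} hours")
```

With these scores, payment processing (exposure 20) draws the largest share and report formatting (exposure 2) the smallest, matching the slide's point: the degree of testing should track the degree of risk.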

Page 6

Testing and Risk

[Chart: level of testing (Low to High) on one axis and confidence in process (Low to High) on the other, with a trend line. Where are you now?]

Page 7

Testing Goals

• Testing is about finding errors.
• If you don’t find any errors, testing is a waste of time.

Page 8

Testing and Risk

[Chart: the cost to find an error/bug plotted against the maturity of the development process.]

Page 9

Definitions

• Requirement
  – A condition or capability needed by a user to solve a problem or achieve an objective
  – A condition or capability that must be met or possessed by a system …to satisfy a contract, standard, specification, or other formally imposed document.

Page 10

Test Terms

• Test Categories
  – Unit Testing
  – Integration Testing
  – System Testing
  – Acceptance Testing
  – Regression Testing
• Static Testing
• Dynamic Testing

Page 11

The Test Process

Page 12

The Testing Process

• Step 1 - Set Test Objectives
• Step 2 - Develop Test Plan
• Step 3 - Execute Tests
• Step 4 - Summarize and Report Results

Page 13

General Workbench

[Diagram: a general workbench. Input source materials pass an ENTRY gate into the process (Do Work), guided by standards, templates, rules, and checklists. Testing checks the product, routing failures back to Rework; accepted work passes the EXIT gate.]

Page 14

Testing Workbench

[Diagram: the testing workbench. Products for test pass an ENTRY gate into the test process, which draws on the test toolbox, test QC, and test measures. Problems are reported out, and tested products pass the EXIT gate.]

Page 15

The Software Test Process

Step 1 - Develop Test Objectives
  Task 1 - Organize Test Team
  Task 2 - Perform Risk Assessment
  Task 3 - Set Test Objectives
  Task 4 - Quality Control

Step 2 - Develop Test Plan
  Task 1 - Define Business Functions
  Task 2 - Define Structural Functions
  Task 3 - Define Tests
  Task 4 - Create Function/Test Matrix
  Task 5 - Admin Test Requirements
  Task 6 - Formalize System Test Plan
  Task 7 - System Test Plan QC
  Task 8 - Develop Unit Test Plan
  Task 9 - Unit Test Plan QC

Step 3 - Execute Tests
  Task 1 - Select Test Tools
  Task 2 - Develop Test Cases
  Task 3 - Execute Tests
  Task 4 - Quality Control

Step 4 - Summarize and Report Results
  Task 1 - Record Defects
  Task 2 - Perform Data Reduction
  Task 3 - Develop Findings
  Task 4 - Formalize Report
  Task 5 - Quality Control
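Task 4 of Step 2 calls for a function/test matrix. A minimal sketch of that idea (the function and test names below are hypothetical) cross-references which planned tests cover which business functions, so uncovered functions stand out before execution begins:

```python
# Function/test matrix sketch: rows are business functions, columns are
# tests. All names here are hypothetical illustrations.

functions = ["create order", "cancel order", "refund payment"]
tests = {
    "T1 order happy path": ["create order"],
    "T2 cancel flow":      ["create order", "cancel order"],
}

def coverage_matrix(functions, tests):
    """Return {function: [names of tests covering it]}."""
    matrix = {f: [] for f in functions}
    for test_name, covered in tests.items():
        for f in covered:
            matrix[f].append(test_name)
    return matrix

matrix = coverage_matrix(functions, tests)
for f, covering in matrix.items():
    status = ", ".join(covering) if covering else "NOT COVERED"
    print(f"{f:15s} -> {status}")
```

Here "refund payment" shows up as NOT COVERED, which is exactly the gap the matrix exists to expose.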

Page 16

Test Process “V” Diagram

[Diagram: the test process "V". Left side, top to bottom: Business Need, Define Requirements, Design System, Build System, with each step verified against the one above it. Right side, bottom to top: Unit Test, Integration Test, System Test, Acceptance Test, each validating the corresponding left-side stage. Test objectives and test planning are developed going down the left side; static test execution occurs there as well, with dynamic test execution and test reporting going up the right side.]

Page 17

The SEI CMM

Page 18

Immature vs. Mature Organization

People: Heroic Efforts of Dedicated Team (immature) vs. Repeating Proven Methods of an Organization (mature)
Management: Not Managed vs. Managed
Technology: Not Fully Utilized vs. Organizational Focus
Process: No "Memory", Improvised vs. Repeatable, Followed, Predictable

From: SEI CMM/Bill Curtis, et al.

Page 19

CMM Architecture

Level 5 - Optimizing (process focus: System CQI)
  Defect Prevention; Technology Innovation; Process Change Management

Level 4 - Managed (process focus: System Measurement)
  Process Measurement and Analysis; Quality Management

Level 3 - Defined (process focus: Engineering Technical Practices)
  Organizational Process Focus; Organizational Process Definition; Peer Reviews; Training Program; Intergroup Coordination; Software Product Engineering; Integrated Software Management

Level 2 - Repeatable (process focus: Project Management)
  Software Project Planning; Software Project Tracking; Software Subcontract Management; Software Quality Assurance; Software Configuration Management; Requirements Management

Level 1 - Initial (Heroes*)

From: SEI CMM/Bill Curtis, et al.

Page 20

Taking Action Based on Data

Page 21

Measurement

• What’s your least favorite measurement?

• What’s your favorite measurement?

Page 22

Designing a Metric Program

Goal

Question

Metric

* From "A Methodology for Collecting Valid Software Engineering Data" by Victor Basili and David Weiss. IEEE Transactions on Software Engineering, Vol. SE-10, No. 4, Nov. 1984
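The GQM hierarchy maps directly onto a nested structure: each goal spawns questions, and each question is answered by one or more metrics, so every metric collected traces back to a stated goal. The goal, question, and metric text below is a hypothetical illustration, not content from the slides:

```python
# Goal-Question-Metric as a nested structure. The example goal,
# questions, and metrics are hypothetical illustrations.
gqm = {
    "goal": "Improve delivered quality",
    "questions": [
        {"question": "How many defects escape to production?",
         "metrics": ["defects found post-release per KLOC"]},
        {"question": "Where are defects introduced?",
         "metrics": ["defects by phase of origin", "defects by module"]},
    ],
}

def list_metrics(gqm):
    """Flatten the hierarchy: every metric traces back to the goal."""
    return [m for q in gqm["questions"] for m in q["metrics"]]

print(list_metrics(gqm))
```

The point of the structure is the discipline it imposes: a metric with no parent question, or a question with no parent goal, has no place in the program.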

Page 27

Understanding Metric Data

• Data is not information!

Data → Information

From: “Understanding Variation”, D.J. Wheeler

Page 28

Understanding Metric Data

• Data is not information!

Data → Analysis → Information

From: “Understanding Variation”, D.J. Wheeler

Page 29

Key Variation Concepts

• Variation exists everywhere
• Two types:
  – Common causes (noise)
  – Special causes (signal)
• Variation, waste, & predictability

94% of all problems in an organization are due to common causes, and only management can fix them.

(Deming, “Out of the Crisis”)

Page 30

Run Charts

[Run chart: a metric plotted over time.]

Page 31

Specification Targets

[Chart: a metric plotted over time between an Upper Specification Limit and a Lower Specification Limit. Specification limits are the voice of the customer.]

Page 32

Shewhart’s Control Charts

• Every data set has noise.
• “Noise” confuses and clouds single-value comparisons.
• Some data sets contain signal (special causes).
• Control charts separate potential signals from noise.

Page 33

Control Chart

[Control chart: a metric plotted over time, reflecting the voice of the process.]

Page 34

Control Chart

[Control chart: a metric plotted over time with an Upper Control Limit and a Lower Control Limit. Control limits are the voice of the process.]

Page 35

Signal and Noise

[Chart: a control chart with Upper and Lower Control Limits (the voice of the process). Points between the limits fall in the "common cause" region (noise); data excursions beyond the limits are "special cause" signals.]
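The control limits that separate signal from noise can be computed from the data itself. One standard method (the XmR chart for individual values, as described by Wheeler; the slides do not give formulas, so the specific constant here is an assumption of that method) places the limits at the mean ± 2.66 times the average moving range. Points outside them are potential special-cause signals. A minimal sketch with hypothetical weekly defect counts:

```python
# XmR (individuals) control chart limits: a sketch of one standard
# method. Limits = mean +/- 2.66 * average moving range, where the
# moving range is the absolute difference between successive points.

def xmr_limits(data):
    mean = sum(data) / len(data)
    moving_ranges = [abs(a - b) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

def special_causes(data):
    """Indices of points outside the control limits (potential signals)."""
    lcl, _, ucl = xmr_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

defects_per_week = [12, 10, 11, 13, 9, 12, 25, 11, 10, 12]  # hypothetical data
lcl, center, ucl = xmr_limits(defects_per_week)
print(f"LCL={lcl:.1f}, center={center:.1f}, UCL={ucl:.1f}")
print("excursions at indices:", special_causes(defects_per_week))
```

With this data, the spike of 25 lands just outside the upper limit and is flagged as a potential special cause; the rest is common-cause noise not worth reacting to point by point.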

Page 36

Serial Correlation

[Chart: a control chart (metric over time, with Upper and Lower Control Limits, the voice of the process) illustrating serial correlation.]
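Serial correlation can also be checked numerically before trusting control-chart limits: the lag-1 autocorrelation measures how strongly each point predicts the next. Values near zero suggest independent noise; values near 1 suggest a drifting, correlated process. This is a standard statistic, not something the slides prescribe, and the two series below are hypothetical:

```python
# Lag-1 autocorrelation: a sketch for detecting serial correlation in a
# metric series. Both example series are hypothetical.

def lag1_autocorrelation(data):
    n = len(data)
    mean = sum(data) / n
    # Covariance of each point with its successor, over the variance.
    num = sum((data[i] - mean) * (data[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in data)
    return num / den

alternating = [5, 9, 4, 8, 5, 9, 4, 8]   # oscillating series: negative lag-1 correlation
drifting    = [1, 2, 3, 4, 5, 6, 7, 8]   # steady trend: strong positive lag-1 correlation
print(f"alternating: {lag1_autocorrelation(alternating):+.2f}")
print(f"drifting:    {lag1_autocorrelation(drifting):+.2f}")
```

A strongly positive result on real process data is a warning that successive points are not independent, and that standard control limits may mislead.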

Page 37

Recommendations

• Improve measurement systems
  – Commit to measurement, not guessing
  – Devise and use common measurement tools and definitions
  – Train data collectors on metric definitions
• Reduce process variation
• Improve process results

Page 38

Variation In Software

• In software, when one process has wide variation, more than likely they all have wide variation.

• High precision in software measurement is unnecessary at this time.

Page 39

Suboptimization

Suboptimization occurs when one doesn’t think of the total system... it is about dependencies.

Page 40

Suboptimization

X + Y = 1

Suboptimization occurs when one doesn’t think of the total system... it is about dependencies.

Page 41

Suboptimization

X is big and Y is big = goodness.
How do we allocate resources between X and Y?
What are some examples?

Suboptimization occurs when one doesn’t think of the total system... it is about dependencies.

X + Y = 1
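The X + Y = 1 constraint can be explored numerically. If the system's value depends on both X and Y (the value function X × Y below is a hypothetical illustration of "X is big and Y is big = goodness"), then pushing either one to its individual maximum suboptimizes the whole:

```python
# Suboptimization sketch: X + Y = 1, and total system value depends on
# both. The value function X * Y is a hypothetical stand-in for
# "X is big and Y is big = goodness".

def system_value(x):
    y = 1 - x            # the constraint: X + Y = 1
    return x * y         # needs both X and Y to be big

candidates = [i / 10 for i in range(11)]          # X from 0.0 to 1.0
best = max(candidates, key=system_value)
print(f"best split: X={best}, Y={1 - best}, value={system_value(best):.2f}")
print(f"all-in on X: value={system_value(1.0):.2f}")
```

Optimizing X alone (X = 1) drives Y, and the total value, to zero; the balanced split wins. That is the dependency the slide warns about: a local maximum for one part is not a maximum for the system.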