Performance Evaluation of Information Systems
MSc. Luiz Barboza
[email protected]
http://barbozaluiz.blogspot.com/


Page 1

Performance Evaluation of Information Systems

MSc. Luiz Barboza

[email protected]

http://barbozaluiz.blogspot.com/

Page 2

About me... Master in Computer Science with 10 years of industry experience, working as a Software Architect: 4 years focused on the financial sector and the last 3 on Quality Assurance for the telecommunications sector.

Qualifications
  Master's in Software Engineering, IPT/USP
  MBA in Business Management, FGV
  Graduate specialization in IT Management, FIAP
  BSc in Computer Science, UFPE

Certifications
  SCEA - Sun Certified Enterprise Architect
  TIBCO Enterprise Message Service Certified
  ITIL - ITIL Foundation Certified Professional
  IBM/Rational Specialist for Rational Requirements Management with Use Cases (+ReqPro)
  IBM/Rational Certified Solution Designer - IBM Rational Unified Process V7.0
  IBM/Rational Solution Designer - Object Oriented Analysis and Design (+Rose)
  SCWCD - Sun Certified Web Component Developer for the J2EE
  SCJP - Sun Certified Programmer for the Java 2 Platform

Page 3

Course Program

Syllabus
  Covers the notion of workload and its characterization; system modeling techniques; tools and methodologies for obtaining data from systems; performance measures, both user-oriented and system-oriented; performance measurement tools (hardware and software monitors); benchmarks; system simulation; and case studies.
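To make the user-oriented versus system-oriented distinction concrete, here is a minimal sketch in Python (not from the original deck; the sample durations and measurement window are invented) that derives one measure of each kind from a request log:

```python
# Minimal sketch: user-oriented vs. system-oriented measures
# computed from a synthetic request log (durations in seconds).
durations = [0.12, 0.20, 0.18, 0.95, 0.22, 0.15, 0.31, 0.27]
window_seconds = 2.0  # interval over which the requests completed

# User-oriented measures: how long did each user wait?
mean_response = sum(durations) / len(durations)
# Crude nearest-rank 95th percentile, good enough for a sketch.
p95_response = sorted(durations)[int(0.95 * (len(durations) - 1))]

# System-oriented measure: how much work was pushed through?
throughput = len(durations) / window_seconds  # requests per second

print(f"mean response time: {mean_response:.3f} s")
print(f"95th percentile:    {p95_response:.3f} s")
print(f"throughput:         {throughput:.1f} req/s")
```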

Objectives
  Study concepts and techniques for sizing and evaluating information systems. Apply tools to evaluate such systems, and apply the results of performance analysis to optimize them.

Bibliography
  CHWIF, L.; MEDINA, A. C. Modelagem e Simulação de Eventos Discretos: Teoria e Aplicações. 1st ed. São Paulo: Bravarte, 2006.
  TARDUGNO, A. F.; DiPASQUALE, T. R.; MATTHEWS, R. E. IT Services: Costs, Metrics, Benchmarking and Marketing. Safari Books Online, Prentice Hall, 2000.
  LOUKIDES, M.; MUSUMECI, G. D. System Performance Tuning. 2nd ed. Safari Books Online, O'Reilly, 2002.
  JAIN, R. The Art of Computer Systems Performance Analysis. New York: John Wiley & Sons, 1991.

Assessment
  Two individual essay-style exams.

Page 4

Agenda

Principles of performance analysis
Planning and preparation for performance testing
Stress testing
Application performance monitoring
Performance analysis of network traffic
Performance analysis and tuning of the web tier
Code-level performance analysis
Analysis of the data tier
Capacity estimation
Performance modeling (see the worked example below)
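Capacity estimation and performance modeling rest on a few operational laws. As a small worked example (the figures are invented for illustration), Little's Law relates the average number of requests in the system $N$, the throughput $X$, and the mean response time $R$:

$$N = X \cdot R = 200\ \tfrac{\text{req}}{\text{s}} \times 0.25\ \text{s} = 50\ \text{requests}$$

Read the other way around: a system expected to hold about 50 concurrent requests at a mean response time of 0.25 s cannot sustain more than $X = N / R = 200$ requests per second.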

Page 5

Test: Simple Workflow

Role           Activity   Artifact
Test Manager   Plan       Test Plan
Test Designer  Model      Test Case
Tester         Code       Test Scripts
Tester         Run        Test Log
Test Analyst   Report     Defect Log

Supporting tools: Test Management Tool, Test Script Tool

Page 6

Workload Model

Page 7

Performance Requirements

Page 8

ACT

Page 9

JMeter
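The original slide showed JMeter itself (an image in the source deck). As a supplement, here is a minimal sketch of driving JMeter in non-GUI mode from Python; the jmeter launcher on the PATH and the plan.jmx test plan are assumptions for the example, not part of the slides:

```python
import subprocess

# Run an existing JMeter test plan headlessly:
#   -n  non-GUI mode   -t  test plan file   -l  results file
result = subprocess.run(
    ["jmeter", "-n", "-t", "plan.jmx", "-l", "results.jtl"],
    capture_output=True,
    text=True,
)
print(result.stdout)  # JMeter's run summary
print("exit code:", result.returncode)
```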

Page 10

RPT

Page 11

RUP Overview

Focused on the Test Discipline

Page 12

RUP Structure

Organization along time
  Lifecycle structure: phases, iterations
  Process enactment: planning, executing
  Activity management, project control

Organization based on content
  Disciplines, roles, artifacts, activities
  Process configuration, process enhancement

Page 13

Organization Along Time

Page 14

Phases and Iterations

The lifecycle runs through four phases: Inception (understand the problem), Elaboration (understand the solution), Construction (have a solution), and Transition.

Planned (business) decision points mark the phase boundaries: commit resources for the elaboration phase; commit resources for construction; product sufficiently mature for customers to use; acceptance or end of life.

Each phase is divided into iterations: a preliminary iteration, architectural iterations, development iterations, and transition iterations. Iteration ends are planned (technical) visibility points.

Page 15

Major Milestones: Business Decision Points

Inception -> Lifecycle Objective Milestone: commit resources for the elaboration phase.
Elaboration -> Lifecycle Architecture Milestone: commit resources for construction.
Construction -> Initial Operational Capability Milestone: product sufficiently mature for customers.
Transition -> Product Release: customer acceptance or end of life.

Page 16

Key RUP Elements: Roles, Activities, Artifacts

A Role performs Activities and is responsible for Artifacts.

Example: the Designer role performs the Use-Case Analysis activity, which produces the Use-Case Realizations artifact.

Page 17

Roles Perform Activities and Produce Artifacts

Example: Requirements -> Workflow Detail -> Define the System

The System Analyst performs the activities Capture a Common Vocabulary, Find Actors and Use Cases, Manage Dependencies, and Develop Vision.

Artifacts involved: Glossary (refined), Use-Case Model (refined), Use-Case Modeling Guidelines, Supplementary Specifications, Stakeholder Requests, Requirements Management Plan, Requirements Attributes (refined), Vision (refined), Business Use-Case Model, Business Object Model, Business Rules, Use Case (outlined).

Page 18

Key RUP Element: Role

A Role defines the behavior and responsibilities of an individual, or of a set of individuals working together as a team.

Team members can "wear different hats": each member can play more than one Role.

Page 19

Key RUP Element: Activity

A piece of work that a Role is responsible for and may be asked to perform.

Granularity: a few hours to a few days.
Repeated, as necessary, in each iteration.

Page 20

Key RUP Element: Artifact

A document or model produced, evolved, or used by a process.

  The responsibility of a Role.
  Likely to be subject to configuration control.
  May contain other artifacts.

Page 21

Key RUP Element: Workflow

The conditional flow of high-level activities (Workflow Details) that produces a result of observable value.

Page 22

Workflow Details

Example: Workflow Detail: Prepare Environment for Project

Example: Environment: Workflow

Page 23

Summary of Major Artifacts

Page 24

Additional Process Element: Concepts

Attached to the relevant Discipline
Explanation of key ideas

Examples of Concepts:
  Requirements: Requirements Management; Types of Requirements; Traceability
  Analysis and Design: Software Architecture; Analysis Mechanisms; Web Architecture Patterns

Page 25

Additional Process Element: Guidelines

These are rules, recommendations, and heuristics that support activities and their steps. They:

  Describe specific techniques: transformations from one artifact to another, use of UML.
  Are attached to the relevant discipline.
  Are kept short and to the point.
  Describe well-formed artifacts and focus on qualities.
  Are also used to assess the quality of artifacts.
  Are tailored for the project.

Page 26

Additional Process Element: Tool Mentors

Attached to the relevant activity
Explain how to use a specific tool to perform an activity, or steps in an activity
Linked by default to Rational tools:
  RequisitePro: requirements management
  Rational Rose: visual modeling, using UML
  SoDA: documentation generation
  ClearQuest: change management, defect tracking
  ...and more

Page 27

Additional Process Element: Templates

Attached to the relevant document type
Predefined artifacts, prototypes:
  Documents (Microsoft® Word™, Adobe® FrameMaker™)
  MS Project
  HTML
Tailored for the process

Page 28

Additional Process Element: Roadmap

Roadmaps are used to:
  Apply the general-purpose process to solve specific types of problems.
  Describe process variants using phases.
  Provide a mechanism for extending and adapting the process.
  Highlight certain process features to achieve a particular goal.

Page 29

Test Discipline

Page 30

Test: Discipline

Purpose: testing focuses primarily on the evaluation or assessment of quality, realized through a number of core practices:

  Finding and documenting defects in software quality.
  Generally advising about perceived software quality.
  Proving the validity of the assumptions made in design and requirement specifications through concrete demonstration.
  Validating that the software product functions as designed.
  Validating that the requirements have been implemented appropriately.

In many respects, the Test discipline acts as a service provider to the other disciplines.

Page 31

Test: Guidelines

  Test Case
  Test Data
  Test Ideas for Booleans and Boundaries
  Test Ideas for Method Calls
  Test Ideas for Statechart and Flow Diagrams
  Test Plan
  Test Script
  Unit Test
  Workload Analysis Model

Page 32

Test: Concepts

  Acceptance Testing
  Exploratory Testing
  Key Measures of Test
  Performance Testing
  Product Quality
  Quality Dimensions
  Stages of Test
  Structure Testing
  Test Automation and Tools
  Test-Ideas Catalog
  Test-Ideas List
  Test Strategy
  The Lifecycle of Testing
  Types of Test
  Usability Testing

Page 33

Test: Concepts: Types of Test

  Functionality: function test, security test, volume test
  Usability: usability test
  Reliability: integrity test, structure test, stress test
  Performance: benchmark test, contention test, load test, performance profile (see the sketch after this list)
  Supportability: configuration test, installation test
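To make the Performance category concrete, here is a minimal load-test sketch in Python (the target URL, worker count, and request count are hypothetical; tools such as JMeter or RPT do this at scale):

```python
# Minimal load-test sketch: concurrent workers hit one URL and
# record response times. The target URL is a hypothetical example.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "http://localhost:8080/"  # hypothetical system under test

def one_request(_):
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(TARGET, timeout=10):
            pass
        return time.perf_counter() - start
    except OSError:
        return None  # failed sample

with ThreadPoolExecutor(max_workers=10) as pool:
    samples = list(pool.map(one_request, range(100)))

ok = [s for s in samples if s is not None]
print(f"{len(ok)}/{len(samples)} requests succeeded")
if ok:
    print(f"mean latency: {sum(ok) / len(ok):.3f} s")
```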

Page 34

Test: Activities and Roles

Page 35

Test: Artifacts and Roles

Page 36

Test: Workflow

Page 37

Test: Define Test Mission

Identifying the objectives for the testing effort and the deliverable artifacts.
Identifying a good resource-utilization strategy.
Defining the appropriate scope and boundaries for the test effort.
Outlining the approach that will be used, including tool automation.
Defining how progress will be monitored and assessed.

Page 38

Test: Verify Test Approach

Verifying early that the intended Test Approach will work and that it produces results of value.
Establishing the basic infrastructure to enable and support the Test Approach.
Obtaining commitment from the development team to provide and support the testability required to achieve the Test Approach.
Identifying the scope, boundaries, limitations, and constraints of each tool and technique.

Page 39

Test: Validate Build Stability (Smoke Test)

Making an assessment of the stability and testability of the build: can you install it, load it, and start it?
Gaining an initial understanding, or confirming the expectation, of the development work delivered in the build: what was effectively integrated into this build?
Making a decision, guided by the evaluation mission, to accept the build as suitable for further testing, or to conduct further testing against a previous build. Not all builds are suitable for a test cycle, and there is no point wasting testing time and effort on an unsatisfactory build.
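As an illustration of the "can you install it, load it, and start it?" question, here is a minimal smoke-check sketch in Python (the health-check URL is a hypothetical example, not from the deck):

```python
# Minimal build-stability (smoke) check: does the deployed build
# start and answer at all? The URL is a hypothetical example.
import sys
import urllib.request

HEALTH_URL = "http://localhost:8080/health"  # hypothetical endpoint

try:
    with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
        ok = resp.status == 200
except OSError as exc:
    print(f"smoke test FAILED: {exc}")
    sys.exit(1)

print("smoke test passed" if ok else "smoke test FAILED: bad status")
sys.exit(0 if ok else 1)
```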

Page 40

Test: Test and Evaluate

Providing ongoing evaluation and assessment of the Target Test Items.

Recording the appropriate information necessary to diagnose and resolve any identified issues.

Achieving suitable breadth and depth in the test and evaluation work.

Providing feedback on the most likely areas of potential quality risk.

Page 41

Test: Achieve an Acceptable Mission

Actively prioritizing the minimal set of tests that must be conducted to achieve the Evaluation Mission.
Advocating the resolution of important issues that have a significant negative impact on the Evaluation Mission.
Advocating appropriate product quality.
Identifying regressions in quality introduced between test cycles.
Where appropriate, revising the Evaluation Mission in light of the evaluation findings, so as to provide useful evaluation information to the project team.

Page 42

Test: Improve Test Assets

Adding the minimal set of additional tests needed to validate the stability of subsequent builds.
Assembling Test Scripts into additional appropriate Test Suites.
Removing test assets that no longer serve a useful purpose or have become uneconomic to maintain.
Maintaining Test Environment Configurations and Test Data sets.
Exploring opportunities for reuse and productivity improvements.
Conducting general maintenance of, and improving the maintainability of, test automation assets.
Documenting lessons learned, both good and bad practices discovered during the test cycle; this should be done at least at the end of each iteration.

Page 43

Test: Workflow Detail: Test and Evaluate

Page 44

Test: Simple Workflow

Role           Activity   Artifact
Test Manager   Plan       Test Plan
Test Designer  Model      Test Case
Tester         Code       Test Scripts
Tester         Run        Test Log
Test Analyst   Report     Defect Log

Supporting tools: Test Management Tool, Test Script Tool

Page 45

Test: Tool Support

Rational TestStudio®: Robot, LogViewer, TestManager
Rational Test RealTime®
Rational XDE Tester®
Rational PerformanceStudio®
Rational PurifyPlus®: Rational Purify®, Rational PureCoverage®, Rational Quantify®

Page 46

Performance Evaluation of Information Systems

MSc. Luiz Barboza

[email protected]

http://barbozaluiz.blogspot.com/