
Performance Evaluation of Information Systems (Avaliação de Desempenho de Sistemas de Informação)

MSc. Luiz Barboza

luiz.barboza@gmail.com

http://barbozaluiz.blogspot.com/

About me... Master in Computer Science with 10 years of industry experience working as a Software Architect: 4 of those years in the financial sector, and the last 3 focused on Quality Assurance for the telecommunications sector.

Qualifications
Master's in Software Engineering from IPT/USP
MBA in Business Management from FGV
Specialist in IT Management from FIAP
Bachelor's in Computer Science from UFPE

Certifications
SCEA - Sun Certified Enterprise Architect
TIBCO Enterprise Message Service Certified
ITIL - ITIL Foundation Certified Professional
IBM/Rational Specialist for Rational Requirements Management with Use Cases (+ReqPro)
IBM/Rational Certified Solution Designer - IBM Rational Unified Process V7.0
IBM/Rational Solution Designer - Object Oriented Analysis and Design (+Rose)
SCWCD - Sun Certified Web Component Developer for the J2EE Platform
SCPJ - Sun Certified Programmer for Java 2 Platform

Course Program: Syllabus

Covers the notion of workload and its characterization; system modeling techniques; tools and methodologies for collecting system data; performance measures, both user-oriented and system-oriented; performance measurement tools, including hardware and software monitors; benchmarks, system simulation, and case studies.
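To make the two measurement viewpoints concrete, here is a minimal sketch, assuming a hypothetical request log (the numbers are illustrative, not course data): it computes a user-oriented measure (response time) and a system-oriented measure (throughput) from the same observations.

    # User-oriented vs. system-oriented performance measures.
    # The request log below is hypothetical, for illustration only.
    from statistics import mean, quantiles

    # (start_time_s, response_time_s) for each completed request
    requests = [(0.0, 0.21), (0.4, 0.35), (0.9, 0.18), (1.1, 0.50), (1.6, 0.27)]

    response_times = [rt for _, rt in requests]
    # Approximate measurement window: first to last request start
    window = max(s for s, _ in requests) - min(s for s, _ in requests)

    # User-oriented: how long does an individual request take?
    print(f"mean response time: {mean(response_times):.3f} s")
    print(f"p95 response time:  {quantiles(response_times, n=20)[18]:.3f} s")

    # System-oriented: how much work is completed per unit of time?
    print(f"throughput: {len(requests) / window:.2f} req/s")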

Objectives

Study concepts and techniques for sizing and evaluating information systems. Apply tools to evaluate these systems and use the results of performance analysis to optimize them.

Bibliography

CHWIF, L.; MEDINA, A. C. Modelagem e Simulação de Eventos Discretos: Teoria e Aplicações. 1st ed. São Paulo: Bravarte, 2006.

TARDUGNO, A. F.; DiPASQUALE, T. R.; MATTHEWS, R. E. IT Services: Costs, Metrics, Benchmarking and Marketing. Prentice Hall, 2000.

LOUKIDES, M.; MUSUMECI, G. D. System Performance Tuning. 2nd ed. O'Reilly, 2002.

JAIN, R. The Art of Computer Systems Performance Analysis. New York: John Wiley & Sons, 1991.

Assessment: two individual written (essay) exams.

Agenda

Principles of performance analysis
Planning and preparation for performance testing
Stress testing
Application performance monitoring
Network traffic performance analysis
Performance analysis and tuning at the Web tier
Code performance analysis
Data tier analysis
Capacity estimation
Performance modeling

Test: Simple Workflow

[Figure: the simple test workflow — each role performs an activity that produces an artifact]

Role          | Activity | Artifact
Test Manager  | Plan     | Test Plan
Test Designer | Model    | Test Case
Tester        | Code     | Test Scripts
Tester        | Run      | Test Log
Test Analyst  | Report   | Defect Log

Supporting tools shown: Test Management Tool, Test Script Tool. Inputs shown: Workload Model, Performance Requirements. Load-test tools cited: ACT, JMeter, RPT.
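As an illustration of how a workload model drives load generation, here is a minimal closed-loop load-generator sketch. It does not reflect how ACT, JMeter, or RPT work internally, and the target URL and user counts are hypothetical; real tools add ramp-up, think times, assertions, and reporting.

    # Minimal closed-loop load generator: each virtual user issues
    # requests in a loop and records response times. Illustrative only.
    import threading, time, urllib.request

    TARGET_URL = "http://localhost:8080/"   # hypothetical system under test
    VIRTUAL_USERS = 5
    REQUESTS_PER_USER = 10
    samples = []
    lock = threading.Lock()

    def virtual_user():
        for _ in range(REQUESTS_PER_USER):
            start = time.perf_counter()
            try:
                urllib.request.urlopen(TARGET_URL, timeout=5).read()
                elapsed = time.perf_counter() - start
                with lock:
                    samples.append(elapsed)
            except OSError:
                pass  # a real test would count errors separately

    threads = [threading.Thread(target=virtual_user) for _ in range(VIRTUAL_USERS)]
    for t in threads: t.start()
    for t in threads: t.join()

    if samples:
        print(f"{len(samples)} ok, mean {sum(samples)/len(samples):.3f} s")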

RUP Overview

Focused on the Test Discipline

RUP Structure

Organization along time: lifecycle structure (phases, iterations); process enactment (planning, executing); activity management and project control.

Organization based on content: disciplines, roles, artifacts, activities; process configuration and process enhancement.

Organization Along Time

Phases and Iterations

[Figure: the RUP lifecycle along time — the phases Inception, Elaboration, Construction, and Transition, each composed of iterations: a preliminary iteration, architectural iterations, development iterations, and transition iterations]

Phase ends are planned (business) decision points; iteration ends are planned (technical) visibility points:

Inception (understand the problem): commit resources for the elaboration phase.
Elaboration (understand the solution): commit resources for construction.
Construction (have a solution): product sufficiently mature for customers to use.
Transition: acceptance or end of life.

Major Milestones: Business Decision Points

Inception -> Elaboration -> Construction -> Transition (along time)

Phase        | Milestone                                | Business decision
Inception    | Lifecycle Objective Milestone            | Commit resources for the elaboration phase
Elaboration  | Lifecycle Architecture Milestone         | Commit resources for construction
Construction | Initial Operational Capability Milestone | Product sufficiently mature for customers
Transition   | Product Release                          | Customer acceptance or end of life

Key RUP Elements: Roles, Activities, Artifacts

[Figure: a Role (here, Designer) performs an Activity (Use-Case Analysis) and is responsible for an Artifact (Use-Case Realizations)]

Roles Perform Activities and Produce Artifacts
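To make the Role-Activity-Artifact triad explicit, here is a sketch that models the three elements as plain data types. The structure below is an illustrative assumption for this deck, not part of RUP itself; only the element names come from the figure.

    # Hypothetical modeling of the RUP triad: a Role performs Activities,
    # and each Activity produces or refines Artifacts.
    from dataclasses import dataclass, field

    @dataclass
    class Artifact:
        name: str

    @dataclass
    class Activity:
        name: str
        produces: list = field(default_factory=list)

    @dataclass
    class Role:
        name: str
        performs: list = field(default_factory=list)

    designer = Role("Designer", performs=[
        Activity("Use-Case Analysis",
                 produces=[Artifact("Use-Case Realizations")]),
    ])

    for activity in designer.performs:
        for artifact in activity.produces:
            print(f"{designer.name} performs {activity.name} -> {artifact.name}")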

Example: Requirements -> Workflow Detail -> Define the System

[Figure: the System Analyst performs the activities Capture a Common Vocabulary, Find Actors and Use Cases, Manage Dependencies, and Develop Vision, producing or refining artifacts such as the Use-Case Model, Use-Case Modeling Guidelines, Supplementary Specifications, Glossary, Stakeholder Requests, Requirements Management Plan, Requirements Attributes, Vision, Business Use-Case Model, Business Object Model, Business Rules, and outlined Use Cases]

Key RUP Element: Role

A Role defines the behavior and responsibilities of an individual, or of a set of individuals working together as a team.

Team members can "wear different hats"; that is, each member can play more than one Role.

Key RUP Element: Activity

A piece of work a Role is responsible for, and that the Role may be asked to perform.

Granularity: a few hours to a few days. Repeated, as necessary, in each iteration.

Key RUP Element: Artifact

A document or model produced, evolved, or used by a process.

The responsibility of a Role. Likely to be subject to configuration control. May contain other artifacts.

Key RUP Element: Workflow

The conditional flow of high-level activities (Workflow Details) that produces a result of observable value.

Workflow Details

Example: Workflow Detail: Prepare Environment for Project

Example: Environment: Workflow

Summary of Major Artifacts

Additional Process Element: Concepts

Attached to the relevant Discipline; an explanation of key ideas.

Examples of Concepts:
Requirements: Requirements Management; Types of Requirements; Traceability.
Analysis and Design: Software Architecture; Analysis Mechanisms; Web Architecture Patterns.

Additional Process Element: Guidelines

Rules, recommendations, and heuristics that support activities and their steps. They:

Describe specific techniques: transformations from one artifact to another; use of UML.
Are attached to the relevant discipline.
Are kept short and to the point.
Describe well-formed artifacts and focus on their qualities.
Are also used to assess the quality of artifacts.
Are tailored for the project.

Additional Process Element: Tool Mentors

Attached to the relevant activity. Explain how to use a specific tool to perform an activity, or steps within an activity. Linked by default to Rational tools:

RequisitePro: requirements management
Rational Rose: visual modeling, using UML
SoDA: documentation generation
ClearQuest: change management, defect tracking
...and more

Additional Process Element: Templates

Attached to the relevant document type. Predefined artifacts and prototypes:

Documents (Microsoft® Word™, Adobe® FrameMaker™)
MS Project
HTML

Tailored for the process.

Additional Process Element: Roadmap

Roadmaps are used to:

Apply the general-purpose process to solve specific types of problems.
Describe process variants using phases.
Provide a mechanism for extending and adapting the process.
Highlight certain process features to achieve a particular goal.

Test Discipline

Test: Discipline

Purpose: testing focuses primarily on the evaluation or assessment of quality, realized through a number of core practices:

Finding and documenting defects in software quality.
Generally advising about perceived software quality.
Proving the validity of the assumptions made in design and requirement specifications through concrete demonstration.
Validating that the software product functions as designed.
Validating that the requirements have been implemented appropriately.

The Test discipline acts in many respects as a service provider to the other disciplines.

Test: Guidelines

Test Case
Test Data
Test Ideas for Booleans and Boundaries (a boundary-value sketch follows this list)
Test Ideas for Method Calls
Test Ideas for Statechart and Flow Diagrams
Test Plan
Test Script
Unit Test
Workload Analysis Model
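A minimal sketch of the boundary-value idea above: exercise each boundary, one step inside it, and one step outside it. The accepts_age rule (valid for ages 18 to 65 inclusive) is a hypothetical example, not part of the guideline.

    # Boundary-value test ideas for a hypothetical validity rule.
    import unittest

    def accepts_age(age: int) -> bool:   # hypothetical rule under test
        return 18 <= age <= 65

    class BoundaryTestIdeas(unittest.TestCase):
        def test_boundaries(self):
            # each boundary, one step inside, one step outside
            for age, expected in [(17, False), (18, True), (19, True),
                                  (64, True), (65, True), (66, False)]:
                with self.subTest(age=age):
                    self.assertEqual(accepts_age(age), expected)

    if __name__ == "__main__":
        unittest.main()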

Test: Concepts

Acceptance Testing
Exploratory Testing
Key Measures of Test
Performance Testing
Product Quality
Quality Dimensions
Stages of Test
Structure Testing
Test Automation and Tools
Test-Ideas Catalog
Test-Ideas List
Test Strategy
The Lifecycle of Testing
Types of Test
Usability Testing

Test: Concepts: Types of Test

Functionality: function test, security test, volume test.
Usability: usability test.
Reliability: integrity test, structure test, stress test.
Performance: benchmark test, contention test, load test, performance profile (a micro-benchmark sketch follows this list).
Supportability: configuration test, installation test.
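To illustrate the benchmark-test idea in the Performance group above, here is a micro-benchmark sketch: run each candidate many times under identical conditions and compare best-of-N timings. The two workloads are hypothetical stand-ins for competing implementations of the same operation.

    # Micro-benchmark sketch using timeit; workloads are hypothetical.
    import timeit

    def join_concat(n=1000):
        s = ""
        for i in range(n):
            s += str(i)          # repeated string concatenation
        return s

    def join_builtin(n=1000):
        return "".join(str(i) for i in range(n))   # single join

    for fn in (join_concat, join_builtin):
        # best-of-5 runs of 100 calls each, to reduce scheduling noise
        best = min(timeit.repeat(fn, repeat=5, number=100))
        print(f"{fn.__name__}: {best:.4f} s per 100 calls")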

Test: Activities and Roles

Test: Artifacts and Roles

Test: Workflow

Test: Define Test Mission

Identifying the objectives for the testing effort and the deliverable artifacts.
Identifying a good resource-utilization strategy.
Defining the appropriate scope and boundaries for the test effort.
Outlining the approach that will be used, including tool automation.
Defining how progress will be monitored and assessed.

Test: Verify Test Approach

Verifying early that the intended Test Approach will work and that it produces results of value.
Establishing the basic infrastructure to enable and support the Test Approach.
Obtaining commitment from the development team to provide and support the testability the Test Approach requires.
Identifying the scope, boundaries, limitations, and constraints of each tool and technique.

Test: Validate Build Stability (Smoke Test)

Making an assessment of the stability and testability of the build: can you install it, load it, and start it? (A minimal automated check is sketched after this list.)
Gaining an initial understanding, or confirming the expectation, of the development work delivered in the build: what was actually integrated into this build?
Deciding whether to accept the build as suitable for use in further testing, guided by the evaluation mission, or to conduct further testing against a previous build. Not all builds are suitable for a test cycle, and there is no point wasting testing time and effort on an unsatisfactory build.
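A minimal automated form of the "can you install it, load it, and start it?" check, assuming a hypothetical HTTP service with a health endpoint (both the URL and the endpoint are assumptions for this sketch):

    # Smoke-test sketch for build validation: is the system under test
    # reachable, and does its health endpoint answer? URL is hypothetical.
    import sys, urllib.request
    from urllib.error import URLError

    HEALTH_URL = "http://localhost:8080/health"   # hypothetical endpoint

    def smoke_test() -> bool:
        try:
            with urllib.request.urlopen(HEALTH_URL, timeout=10) as resp:
                return resp.status == 200
        except URLError:
            return False

    if __name__ == "__main__":
        ok = smoke_test()
        print("build accepted for test cycle" if ok else "build rejected")
        sys.exit(0 if ok else 1)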

Test: Test and Evaluate

Providing ongoing evaluation and assessment of the Target Test Items.

Recording the appropriate information necessary to diagnose and resolve any identified issues.

Achieving suitable breadth and depth in the test and evaluation work.

Providing feedback on the most likely areas of potential quality risk.

Test: Achieve an Acceptable Mission

Actively prioritizing the minimal set of necessary tests that must be conducted to achieve the Evaluation Mission.
Advocating the resolution of important issues that have a significant negative impact on the Evaluation Mission.
Advocating appropriate product quality.
Identifying regressions in quality introduced between test cycles.
Where appropriate, revising the Evaluation Mission in light of the evaluation findings so as to provide useful evaluation information to the project team.

Test: Improve Test Assets

Adding the minimal set of additional tests to validate the stability of subsequent builds.
Assembling Test Scripts into additional appropriate Test Suites (a suite-assembly sketch follows this list).
Removing test assets that no longer serve a useful purpose or have become uneconomic to maintain.
Maintaining Test Environment Configurations and Test Data sets.
Exploring opportunities for reuse and productivity improvements.
Conducting general maintenance of, and making improvements to the maintainability of, test-automation assets.
Documenting lessons learned, both good and bad practices discovered during the test cycle; this should be done at least at the end of each iteration.
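A minimal sketch of assembling existing test scripts into a named suite, using Python's unittest; the test-case classes below are hypothetical placeholders, not scripts from this deck.

    # Group existing test scripts into a named suite so a chosen subset
    # (here, a smoke suite) can run on every build.
    import unittest

    class LoginTests(unittest.TestCase):
        def test_login_page_loads(self):
            self.assertTrue(True)   # placeholder for a real script

    class CheckoutTests(unittest.TestCase):
        def test_checkout_total(self):
            self.assertTrue(True)   # placeholder for a real script

    def smoke_suite() -> unittest.TestSuite:
        suite = unittest.TestSuite()
        suite.addTest(LoginTests("test_login_page_loads"))
        suite.addTest(CheckoutTests("test_checkout_total"))
        return suite

    if __name__ == "__main__":
        unittest.TextTestRunner(verbosity=2).run(smoke_suite())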

Test: Workflow Detail: Test and Evaluate

Test: Simple Workflow

[Figure: the simple test workflow again — each role performs an activity that produces an artifact]

Role          | Activity | Artifact
Test Manager  | Plan     | Test Plan
Test Designer | Model    | Test Case
Tester        | Code     | Test Scripts
Tester        | Run      | Test Log
Test Analyst  | Report   | Defect Log

Supporting tools shown: Test Management Tool, Test Script Tool.

Test: Tool Support

Rational TestStudio®: Robot, LogViewer, TestManager
Rational Test RealTime®
Rational XDE Tester®
Rational PerformanceStudio®
Rational PurifyPlus®: Rational Purify®, Rational PureCoverage®, Rational Quantify®

Performance Evaluation of Information Systems (Avaliação de Desempenho de Sistemas de Informação)

MSc. Luiz Barboza

luiz.barboza@gmail.com

http://barbozaluiz.blogspot.com/
