ISACA’s COBIT® Assessment Programme
Presented by:
An understanding of the new COBIT assessment programme
An understanding of the relationship to ISO/IEC 15504 and why ISACA selected this standard
A walkthrough of one of the key COBIT 4.1 processes: DS1 Define and manage service levels
Session Objectives
Copyright ISACA 2012. All rights reserved Slide 2
ISO/IEC 15504-4 identifies process assessment as an activity that can be performed either as part of a process improvement initiative or as part of a capability determination approach
The purpose of process improvement is to continually improve the enterprise’s effectiveness and efficiency
The purpose of process capability determination is to identify the strengths, weaknesses and risks of selected processes with respect to a particular specified requirement, through the processes used and their alignment with the business need
Process assessment provides an understandable, logical, repeatable, reliable and robust methodology for assessing the capability of IT processes
What is a Process Assessment?
The COBIT Assessment Programme includes:
• COBIT Process Assessment Model (PAM): Using COBIT 4.1
• COBIT Assessor Guide: Using COBIT 4.1
• COBIT Self Assessment Guide: Using COBIT 4.1
The COBIT PAM brings together two proven heavyweights in the IT arena, ISO and ISACA
The COBIT PAM adapts the existing COBIT 4.1 content into an ISO 15504 compliant process assessment model
What is the new COBIT Assessment Programme?
But don’t we already have maturity models for COBIT 4.1 processes?
The new COBIT assessment programme is:
• A robust assessment process based on ISO 15504
• An alignment of COBIT’s maturity model scale with the international standard
• A new capability-based assessment model which includes:
  • Specific process requirements derived from COBIT 4.1
  • Ability to achieve process attributes based on ISO 15504
  • Evidence requirements
  • Assessor qualifications and experiential requirements
Results in a more robust, objective and repeatable assessment
Assessment results will likely vary from existing COBIT maturity models!
What’s different?
The COBIT 4.1 PAM uses a measurement framework that is similar in terminology to the existing maturity models in COBIT 4.1
While the words are similar, the scales are NOT the same:
• The COBIT PAM uses the capability scale from ISO/IEC 15504, whereas the existing COBIT maturity models use a scale derived from SEI/CMM
• A PAM level 3 is NOT the same as a CMM level 3
• Assessments done under the PAM are likely to result in ‘lower’ scores
• PAM assessments are based on more fully defined and defensible attributes
Differences From the COBIT Maturity Models
COBIT 4.1 Process Maturity Level | ISO/IEC 15504 Process Capability Level | Attributes
5 Optimised | 5 Optimizing | PA 5.1 Process innovation; PA 5.2 Process optimization
4 Managed and measurable | 4 Predictable | PA 4.1 Process measurement; PA 4.2 Process control
3 Defined | 3 Established | PA 3.1 Process definition; PA 3.2 Process deployment
2 Repeatable but intuitive | 2 Managed | PA 2.1 Performance management; PA 2.2 Work product management
1 Initial/ad hoc | 1 Performed | PA 1.1 Process performance
0 Non-existent | 0 Incomplete | (none)
Assessment Overview
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
Process Assessment Model
Assessment Process
Process Reference Model
Process purpose: The high-level measurable objectives of performing the process and the likely outcomes of effective implementation of the process
Process Reference Model
Base practices: The activities that, when consistently performed, contribute to achieving the process purpose
Work products: The artefacts associated with the execution of a process, defined in terms of process ‘inputs’ and process ‘outputs’
Outcome: An observable result of a process: an artefact, a significant change of state or the meeting of specified constraints
PRM Based on COBIT 4.1

Process ID: DS1
Process Name: Define and Manage Service Levels
Purpose: Satisfy the business requirement of ensuring the alignment of key IT services with the business needs.

Outcomes (Os):
DS1-O1 A service management framework is in place to define the organisational structure for service level management, covering the base definitions of services, roles, tasks and responsibilities of internal and external service providers and customers.
DS1-O2 Internal and external SLAs are formalised in line with customer requirements and delivery capabilities.
DS1-O3 Operating level agreements (OLAs) are developed to specify the technical processes required to support SLAs.
DS1-O4 Processes are in place to monitor (and periodically review) SLAs and achievements.

Base Practices (BPs):
DS1-BP1 Create a framework for defining IT services. (Supports DS1-O1)
DS1-BP2 Build an IT service catalogue. (Supports DS1-O1, O2)
DS1-BP3 Define SLAs for critical IT services. (Supports DS1-O2)
DS1-BP4 Define OLAs for meeting SLAs. (Supports DS1-O3)
DS1-BP5 Monitor and report end-to-end service level performance. (Supports DS1-O4)
DS1-BP6 Review SLAs and underpinning contracts. (Supports DS1-O4)
DS1-BP7 Review and update the IT service catalogue. (Supports DS1-O1)
DS1-BP8 Create a service improvement plan. (Supports DS1-O1)

Work Products (WPs):
Inputs:
PO1-WP1 Strategic IT plan (Supports DS1-O1, O2, O3, O4)
PO1-WP4 IT service portfolio (Supports DS1-O1, O2, O3, O4)
PO2-WP5 Assigned data classifications (Supports DS1-O1)
PO5-WP3 Updated IT service portfolio (Supports DS1-O4)
AI2-WP4 Initial planned SLAs (Supports DS1-O3)
AI3-WP7 Initial planned OLAs (Supports DS1-O3)
DS4-WP5 Disaster service requirements, including roles and responsibilities (Supports DS1-O1)
ME1-WP1 Performance input to IT planning (Supports DS1-O1, O2)

Outputs:
DS1-WP1 Contract review report (Input to DS2; supports DS1-O1, O4)
DS1-WP2 Process performance reports (Input to ME1; supports DS1-O4)
DS1-WP3 New/updated service requirements (Input to PO1; supports DS1-O2, O3)
DS1-WP4 SLAs (Input to AI1, DS2, DS3, DS4, DS6, DS8, DS13; supports DS1-O2)
DS1-WP5 OLAs (Input to DS4 to DS8, DS11, DS13; supports DS1-O3)
DS1-WP6 Updated IT service portfolio (Input to PO1; supports DS1-O1, O4)
Assessment Overview
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
Process Capability Levels

Level 0 Incomplete: The process is not implemented or fails to achieve its purpose
Level 1 Performed (PA 1.1 Process performance): The process is implemented and achieves its process purpose
Level 2 Managed (PA 2.1 Performance management; PA 2.2 Work product management): The process is managed and work products are established, controlled and maintained
Level 3 Established (PA 3.1 Process definition; PA 3.2 Process deployment): A defined process is used, based on a standard process
Level 4 Predictable (PA 4.1 Process measurement; PA 4.2 Process control): The process is enacted consistently within defined limits
Level 5 Optimizing (PA 5.1 Process innovation; PA 5.2 Process optimization): The process is continuously improved to meet relevant current and projected business goals
The COBIT assessment process measures the extent to which a given process achieves specific attributes defined for that process (‘process attributes’)
The COBIT assessment process defines nine process attributes (based on ISO/IEC 15504-2):
• PA 1.1 Process performance
• PA 2.1 Performance management
• PA 2.2 Work product management
• PA 3.1 Process definition
• PA 3.2 Process deployment
• PA 4.1 Process measurement
• PA 4.2 Process control
• PA 5.1 Process innovation
• PA 5.2 Process optimisation
Measurement Framework
PA 1.1 Process performance
• The process performance attribute is a measure of the extent to which the process purpose is achieved.
• As a result of full achievement of this attribute, the process achieves its defined outcomes.
Process Attributes (example)
PA 2.1 Performance management
• A measure of the extent to which the performance of the process is managed. As a result of full achievement of this attribute:
a. Objectives for the performance of the process are identified.
b. Performance of the process is planned and monitored.
c. Performance of the process is adjusted to meet plans.
d. Responsibilities and authorities for performing the process are defined, assigned and communicated.
e. Resources and information necessary for performing the process are identified, made available, allocated and used.
f. Interfaces between the involved parties are managed to ensure effective communication and clear assignment of responsibility.
PA 2.2 Work product management
• A measure of the extent to which the work products produced by the process are appropriately managed. As a result of full achievement of this attribute:
a. Requirements for the work products of the process are defined.
b. Requirements for documentation and control of the work products are defined.
c. Work products are appropriately identified, documented and controlled.
d. Work products are reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements.
Process Attributes (example)
The COBIT assessment process measures the extent to which a given process achieves the ‘process attributes’
Process Attribute Rating Scale
N Not achieved (0 to 15% achievement): There is little or no evidence of achievement of the defined attribute in the assessed process
P Partially achieved (>15% to 50% achievement): There is some evidence of an approach to, and some achievement of, the defined attribute in the assessed process. Some aspects of achievement of the attribute may be unpredictable
L Largely achieved (>50% to 85% achievement): There is evidence of a systematic approach to, and significant achievement of, the defined attribute in the assessed process. Some weakness related to this attribute may exist in the assessed process
F Fully achieved (>85% to 100% achievement): There is evidence of a complete and systematic approach to, and full achievement of, the defined attribute in the assessed process. No significant weaknesses related to this attribute exist in the assessed process
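The N/P/L/F scale above can be sketched as a small helper. This is an illustrative sketch only: the function name is hypothetical, and the handling of the exact boundary values (15%, 50%, 85%) is an assumption based on the ranges stated on this slide.

```python
def attribute_rating(achievement_pct: float) -> str:
    """Map a percentage of attribute achievement to the ISO/IEC 15504
    rating scale: N (0-15%), P (>15-50%), L (>50-85%), F (>85-100%)."""
    if not 0 <= achievement_pct <= 100:
        raise ValueError("achievement must be between 0 and 100 percent")
    if achievement_pct <= 15:
        return "N"   # Not achieved
    if achievement_pct <= 50:
        return "P"   # Partially achieved
    if achievement_pct <= 85:
        return "L"   # Largely achieved
    return "F"       # Fully achieved
```

For example, an attribute judged 70% achieved would be rated L (Largely achieved).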
The rating of the process attributes determines the capability level achieved:

Level 0 Incomplete: PA 1.1 not achieved or only partially achieved
Level 1 Performed: PA 1.1 rated Largely or Fully
Level 2 Managed: PA 1.1 Fully; PA 2.1 and PA 2.2 Largely or Fully
Level 3 Established: PA 1.1 to PA 2.2 Fully; PA 3.1 and PA 3.2 Largely or Fully
Level 4 Predictable: PA 1.1 to PA 3.2 Fully; PA 4.1 and PA 4.2 Largely or Fully
Level 5 Optimizing: PA 1.1 to PA 4.2 Fully; PA 5.1 and PA 5.2 Largely or Fully

L/F = Largely or Fully; F = Fully
Process Attribute Ratings and Capability Levels
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
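The ISO/IEC 15504 level-rating rule (a process reaches a capability level when every attribute below that level is Fully achieved and the attributes at that level are rated at least Largely achieved) can be sketched as a hypothetical helper; the names `LEVEL_ATTRS` and `capability_level` are illustrative, not part of the PAM.

```python
# Attributes required at each capability level (per ISO/IEC 15504-2).
LEVEL_ATTRS = {
    1: ["PA 1.1"],
    2: ["PA 2.1", "PA 2.2"],
    3: ["PA 3.1", "PA 3.2"],
    4: ["PA 4.1", "PA 4.2"],
    5: ["PA 5.1", "PA 5.2"],
}

def capability_level(ratings: dict) -> int:
    """Return the achieved capability level, given a mapping of
    attribute names to ratings ('N', 'P', 'L' or 'F')."""
    level = 0
    for lvl in range(1, 6):
        # Attributes at this level must be at least Largely achieved.
        if not all(ratings.get(a, "N") in ("L", "F") for a in LEVEL_ATTRS[lvl]):
            break
        # All lower-level attributes must be Fully achieved.
        lower_fully = all(
            ratings.get(a, "N") == "F"
            for l2 in range(1, lvl)
            for a in LEVEL_ATTRS[l2]
        )
        if lvl == 1 or lower_fully:
            level = lvl
        else:
            break
    return level
```

For example, a process rated PA 1.1 = F, PA 2.1 = L, PA 2.2 = F achieves level 2, whereas PA 1.1 = L with PA 2.1 and PA 2.2 at L stays at level 1, because PA 1.1 is not yet Fully achieved.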
COBIT Assessment Process Overview
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
Process Attributes and Capability Levels
[Figure: the capability levels Incomplete, Performed, Managed, Established, Predictable and Optimizing, mapped to the 9 process attributes and their Process Attribute Indicators (PAIs), showing which elements derive from COBIT and which from ISO]
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
Process Attribute Rating
Assessment indicators in the PAM are used to support the assessors’ judgement in rating process attributes:
• They provide the basis for repeatability across assessments
A rating is assigned based on objective, validated evidence for each process attribute
Traceability needs to be maintained between an attribute rating and the objective evidence used in determining that rating
Example from COBIT 4.1: DS1 Define and manage service levels
Process Reference Model Example DS1
(The DS1 process reference model shown earlier, with its purpose, outcomes DS1-O1 to DS1-O4, base practices DS1-BP1 to DS1-BP8 and work products, is repeated here as the basis for the walkthrough.)
Does the process achieve its defined outcomes (PA 1.1)?
As evidenced by:
• Production of an object
• A significant change of state
• Meeting of specified constraints, e.g., requirements, goals

N Not achieved: 0 to 15% achievement
P Partially achieved: >15% to 50% achievement
L Largely achieved: >50% to 85% achievement
F Fully achieved: >85% to 100% achievement
Assessing Process Capability
Figure 6—PA1.1 Process Performance
Result of Full Achievement of the Attribute: The process achieves its defined outcomes.
Base Practices (BPs): BP 1.1.1 Achieve the process outcomes. There is evidence that the intent of the base practice is being performed.
Work Products (WPs): Work products are produced that provide evidence of process outcomes, as outlined in section 3.
Assessing Process Capability
PA 2.1 Performance management
a. Have objectives for the performance of the process been identified?
b. Is performance of the process planned and monitored?
c. Is performance of the process adjusted to meet plans?
d. Are responsibilities and authorities for performing the process defined, assigned and communicated?
e. Are resources and information necessary for performing the process identified, made available, allocated and used?
f. Are interfaces between the involved parties managed to ensure effective communication and clear assignment of responsibility?
N Not achieved: 0 to 15% achievement
P Partially achieved: >15% to 50% achievement
L Largely achieved: >50% to 85% achievement
F Fully achieved: >85% to 100% achievement
Assessing Process Capability
PA 2.2 Work product management
a. Have requirements for the work products of the process been defined?
b. Have requirements for documentation and control of the work products been defined?
c. Are work products appropriately identified, documented and controlled?
d. Are work products reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements?
N Not achieved: 0 to 15% achievement
P Partially achieved: >15% to 50% achievement
L Largely achieved: >50% to 85% achievement
F Fully achieved: >85% to 100% achievement
[Figure: rating grid in which each of the nine process attributes (PA 1.1 Process performance through PA 5.2 Optimisation) is rated as Not, Partially, Largely or Fully achieved]
Assessing Attribute Achievement
Level 0 Incomplete: PA 1.1 not achieved or only partially achieved
Level 1 Performed: PA 1.1 rated Largely or Fully
Level 2 Managed: PA 1.1 Fully; PA 2.1 and PA 2.2 Largely or Fully
Level 3 Established: PA 1.1 to PA 2.2 Fully; PA 3.1 and PA 3.2 Largely or Fully
Level 4 Predictable: PA 1.1 to PA 3.2 Fully; PA 4.1 and PA 4.2 Largely or Fully
Level 5 Optimising: PA 1.1 to PA 4.2 Fully; PA 5.1 and PA 5.2 Largely or Fully
L/F = Largely or Fully; F = Fully
Assessing Process Capability Levels
Overview
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
1 Initiation
2 Planning the assessment
3 Briefing
4 Data collection
5 Data validation
6 Process attributes rating
7 Reporting the results
Assessment Process Activities
Identify the sponsor and define the purpose of the assessment:
• Why is it being carried out?
Define the scope of the assessment:
• Which processes are being assessed?
• What constraints, if any, apply to the assessment?
Identify any additional information that needs to be gathered
Select the assessment participants and the assessment team, and define the roles of team members
Define assessment inputs and outputs:
• Have them approved by the sponsor
1. Initiation
SCOPING GUIDANCE
Process Assessment Model Walkthrough
The aim of scoping, as part of assessment initiation, is to focus the assessment on the business needs of the enterprise. This reduces the overall effort involved in the assessment
One of the benefits of using COBIT 4.1 as the process reference model is that it has extensive validated mappings from business goals to IT goals and IT processes [COBIT 4.1 Appendix 1]. These are available in the tool kit
There is a six-step selection process:
Step 1 Identify relevant business drivers for the IT process assessment
Step 2 Prioritise the enterprise’s IT processes that may be included within the scope of the assessment
Step 3 Perform a preliminary selection of target processes for inclusion in the assessment, based on the above prioritisation
Step 4 Confirm the preliminary selection of target processes with the project sponsor and key stakeholders of the process assessment
Step 5 Finalise the processes to be included in the assessment
Step 6 Document the scoping methodology in the assessment records
Scoping (1)
Available Mappings:
• Linking Business Goals to IT Goals
• Linking IT Goals to IT processes
• Mapping IT processes to IT governance focus areas and COSO
• US Sarbanes-Oxley Act
• Cloud Computing Self Diagnostic
Scoping (2)
Available Mappings:
• Linking Business Goals to IT Goals
• Linking IT Goals to IT processes
Scoping (3)
An assessment plan describing all activities performed in conducting the assessment is developed and documented, together with an assessment schedule
Identify the project scope
Secure the necessary resources to perform the assessment
Determine the method of collating, reviewing, validating and documenting the information required for the assessment
Co-ordinate assessment activities with the organisational unit being assessed
2. Planning the Assessment
The assessment team leader ensures that the assessment team understands the assessment:
• Input
• Process
• Output
Brief the organisational unit on the performance of the assessment:
• PAM, assessment scope, scheduling, constraints, roles and responsibilities, resource requirements, etc.
3. Briefing
The assessor obtains (and documents) an understanding of the process(es) including process purpose, inputs, outputs and work products, sufficient to enable and support the assessment
Data required for evaluating the processes within the scope of the assessment are collected in a systematic manner
The strategy and techniques for the selection, collection, analysis of data and justification of the ratings are explicitly identified and demonstrable
Each process identified in the assessment scope is assessed on the basis of objective evidence:
• The objective evidence gathered for each attribute of each process assessed must be sufficient to meet the assessment purpose and scope
• Objective evidence that supports the assessors’ judgement of process attribute ratings is recorded and maintained in the assessment record
• This record provides evidence to substantiate the ratings and to verify compliance with the requirements
4. Data Collection
Actions are taken to ensure that the data are accurate and sufficiently cover the assessment scope, including:
• Seeking information from firsthand, independent sources
• Using past assessment results
• Holding feedback sessions to validate the information collected
Some data validation may occur as the data is being collected
5. Data Validation
For each process assessed, a rating is assigned for each process attribute, up to and including the highest capability level defined in the assessment scope
The rating is based on data validated in the previous activity
Traceability must be maintained between the objective evidence collected and the process attribute ratings assigned
For each process attribute rated, the relationship between the indicators and the objective evidence is recorded
6. Process Attribute Rating
The results of the assessment are analysed and presented in a report
The report also covers any key issues raised during the assessment, such as:
• Observed areas of strength and weakness
• Findings of high risk, i.e., the magnitude of the gap between assessed capability and desired/required capability
7. Reporting the Results
[Figure: example for three processes (A, B and C), each with a target capability level between level 1 and level 3 and assessed ratings (L or F) for PA 1.1 through PA 3.2, highlighting where assessed capability falls short of the target]
Target Process Capabilities (example)
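The target-versus-assessed comparison illustrated above can be sketched as a small helper; the function name and message wording here are hypothetical, used only to show the idea of flagging a capability gap against the level agreed during scoping.

```python
def capability_gap(target: int, assessed: int) -> str:
    """Compare an assessed capability level (0-5) against the target
    level agreed during assessment scoping."""
    if assessed >= target:
        return "target met"
    return f"gap of {target - assessed} level(s): improvement action needed"
```

For example, a process with a target of level 3 assessed at level 1 shows a gap of 2 levels, which the following slides relate to consequence and risk.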
Figure A.3—Consequence of Gaps at Various Capability Levels
Consequence of Capability Gaps
This figure is reproduced from ISO 15504-4 2006 with the permission of ISO at www.iso.org. Copyright remains with ISO.
Figure A.4—Risk Associated With Each Capability Level
Capability Gaps and Risk
This figure is reproduced from ISO 15504-4 2006 with the permission of ISO at www.iso.org. Copyright remains with ISO.
COBIT process assessment roles:
• Lead assessor: a ‘competent’ assessor responsible for overseeing the assessment activities
• Assessor: an individual, developing assessor competencies, who performs the assessment activities
Assessor competencies:
• Knowledge, skills and experience:
  • With the process reference model; the process assessment model, methods and tools; and rating processes
  • With the processes/domains being assessed
• Personal attributes that contribute to effective performance
A training and certification scheme is being developed for COBIT 4.1 and COBIT 5
Assessor Certification
COBIT Assessment Programme: www.isaca.org/cobit-assessment-programme
Contact Information: [email protected]
And so Goodbye . . .
Questions?