Presented by David Boone on July 19, 2012.
MEASURE Evaluation Data Quality Assessment Methodology and Tools
USAID, Washington, D.C.
July 19, 2012
Background

■ National Programs and donor-funded projects are working towards achieving ambitious goals in the fight against HIV, TB and malaria.
■ Measuring success and improving management of these initiatives is predicated on strong M&E systems that produce quality data regarding program implementation.
■ In the spirit of the “Three Ones”, the “Stop TB Strategy” and the “RBM Global Strategic Plan”, a multi-partner project* was launched in 2006 to develop a joint Data Quality Audit (DQA) Tool.
■ The objective of this initiative is to provide a common approach for assessing and improving data quality. A single tool ensures that standards are harmonized and allows for joint implementation (between partners and with National Programs).

* Partners most directly involved include PEPFAR, USAID, WHO, Stop TB, the Global Fund and MEASURE Evaluation.
Outline

Auditing (DQA)
- Evolution
- Methods and tools
- How and where applied

Routine Data Quality Assurance and Capacity Building (RDQA)
- Purpose
- Methods and tools
- How and where applied
- Integration into routine practice
Objective of the Data Quality Audit (DQA) Tool

The Data Quality Audit (DQA) Tool is designed to:
- verify the quality of reported data for key indicators at selected sites; and
- assess the ability of data-management systems to collect, manage and report quality data.
Conceptual Framework
Generally, the quality of reported data is dependent on the underlying data management and reporting systems; stronger systems should produce better quality data.
Reporting levels: Service Points → Intermediate Aggregation Levels (e.g., Districts, Regions) → M&E Unit.
Dimensions of Quality

QUALITY DATA: Accuracy, Completeness, Reliability, Timeliness, Confidentiality, Precision, Integrity.

Functional Components of a Data Management System Needed to Ensure Data Quality:
I. M&E Structure, Functions and Capabilities
II. Indicator Definitions and Reporting Guidelines
III. Data-collection and Reporting Forms / Tools
IV. Data Management Processes
V. Links with National Reporting System
Data quality mechanisms and controls
DQA METHODOLOGY – 2 Protocols

■ The methodology for the DQA includes two (2) protocols:

1. Assessment of Data Management Systems (Protocol 1): a qualitative assessment of the strengths and weaknesses of the data-collection and reporting system.
2. Data Verifications (Protocol 2): a quantitative comparison of recounted to reported data, and a review of the timeliness, completeness and availability of reports.
Typical Timeline

■ The DQA is implemented chronologically in six phases:
- Phase 1: Preparation and Initiation (multiple locations)
- Phase 2: M&E Management Unit
- Phase 3: Service Delivery Sites / Organizations
- Phase 4: Intermediate Aggregation Levels (e.g., District, Region)
- Phase 5: M&E Management Unit
- Phase 6: Completion (multiple locations)

Steps:
1. Select Indicators and Reporting Period
2. Notify the Program and conduct a desk review of Program documentation
3. Assess Data Management and Reporting Systems (Part 1)
4. Select/Confirm Service Delivery Points to be visited
5. Trace and Verify Reported Results (Part 2)
6. Draft initial findings and conduct close-out meeting
7. Draft and discuss the Audit Report
8. Initiate follow-up of recommended actions

■ Assessments and verifications take place at every stage of the reporting system:
- M&E Management Unit
- Intermediate Aggregation Levels (Districts, Regions)
- Service Delivery Sites
PROTOCOL 1: Assessment of Data Management Systems

(Timeline step 3: the data management and reporting systems are assessed at the M&E Management Unit, at the Intermediate Aggregation Levels (e.g., District, Region) and at the Service Delivery Sites/Organizations.)
■ PURPOSE: Identify potential risks to data quality created by the data-management and reporting systems at:
- the M&E Management Unit;
- the Service Delivery Points;
- any Intermediate Aggregation Level (District or Region).

■ The DQA assesses both (1) the design and (2) the implementation of the data-management and reporting systems.

■ The assessment covers five functional areas (HR, training, data management processes, etc.), as summarized in the table below.
SYSTEMS ASSESSMENT QUESTIONS BY FUNCTIONAL AREA

I. M&E Structure, Functions and Capabilities
1. Are key M&E and data-management staff identified with clearly assigned responsibilities?
2. Have the majority of key M&E and data-management staff received the required training?

II. Indicator Definitions and Reporting Guidelines
3. Has the Program/Project clearly documented (in writing) what is reported to whom, and how and when reporting is required?
4. Are there operational indicator definitions meeting relevant standards, and are they consistently followed by all service points?

III. Data-collection and Reporting Forms and Tools
5. Are there standard data-collection and reporting forms that are systematically used?
6. Are data recorded with sufficient precision/detail to measure relevant indicators?
7. Are data maintained in accordance with international or national confidentiality guidelines?
8. Are source documents kept and made available in accordance with a written policy?

IV. Data Management Processes
9. Does clear documentation of collection, aggregation and manipulation steps exist?
10. Are data quality challenges identified and are mechanisms in place for addressing them?
11. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
12. Are there clearly defined and followed procedures to periodically verify source data?

V. Links with National Reporting System
13. Does the data collection and reporting system of the Program/Project link to the National Reporting System?
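The dashboards later in this deck plot each functional area on a 0–3 scale. A minimal sketch of one way responses to the summary questions above could be rolled up into per-area scores (the response coding and all names are assumptions for illustration, not the tool's own specification):

```python
# Systems-assessment scoring sketch: average the coded responses to each
# functional area's summary questions on a 0-3 scale.
# The response coding below is an assumed example, not the DQA's own scheme.

RESPONSE_SCORES = {"no": 0, "partly": 1, "mostly": 2, "yes, completely": 3}

def area_scores(responses: dict[str, list[str]]) -> dict[str, float]:
    """Map each functional area to the mean score of its question responses."""
    return {area: sum(RESPONSE_SCORES[r.lower()] for r in answers) / len(answers)
            for area, answers in responses.items()}

print(area_scores({
    "I. M&E Structure, Functions and Capabilities": ["yes, completely", "mostly"],
    "IV. Data Management Processes": ["partly", "mostly", "partly", "no"],
}))
# {'I. M&E Structure, Functions and Capabilities': 2.5,
#  'IV. Data Management Processes': 1.0}
```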
PROTOCOL 2: Data Verifications

■ PURPOSE: Assess on a limited scale whether Service Delivery Points and Intermediate Aggregation Sites are collecting and reporting data accurately and on time.

■ The trace and verification exercise takes place in two stages:
- In-depth verifications at the Service Delivery Points; and
- Follow-up verifications at the Intermediate Aggregation Levels (Districts, Regions) and at the M&E Unit.

(Timeline step 5: reported results are traced and verified at the Service Delivery Sites, the Intermediate Aggregation Levels and the M&E Management Unit.)
DQA Data Verifications – ILLUSTRATION

Number of patients on ARVs ("ARV Nb.") traced from source documents through monthly reports:

Service Delivery Site monthly reports (each based on its Source Document 1):
- SDS 1: 45
- SDS 2: 20
- SDS 3: 45
- SDS 4: 75
- SDS 5: 50
- SDS 6: 200

District monthly reports:
- District 1: SDS 1 = 45, SDS 2 = 20, TOTAL = 65
- District 2: SDS 3 = 45, TOTAL = 45
- District 3: SDS 4 = 75, TOTAL = 75
- District 4: SDP 5 = 50, SDP 6 = 200, TOTAL = 250

M&E Unit/National monthly report:
- District 1 = 65, District 2 = 45, District 3 = 75, District 4 = 250, TOTAL = 435
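The trace-and-verify exercise compares the recounted value to the reported value as a ratio, the verification factor. A worked example against the illustration above (the Site 5 recount of 45 is hypothetical, added for illustration):

$$\mathrm{VF} = \frac{\text{recounted value from source documents}}{\text{reported value}}$$

If recounting Service Delivery Site 5's source documents yields 45 against its reported 50, then VF = 45/50 = 0.90, i.e. over-reporting; a VF above 1.00 would indicate under-reporting. The same check applies at each aggregation step: the district totals 65 + 45 + 75 + 250 = 435 match the national report's TOTAL of 435.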
SERVICE DELIVERY POINT – 5 TYPES OF DATA VERIFICATIONS

Verification no. 1: Description (in all cases)
Describe the connection between the delivery of services/commodities and the completion of the source document that records that service delivery.

Verification no. 2: Documentation Review (in all cases)
Review availability and completeness of all indicator source documents for the selected reporting period.

Verification no. 3: Trace and Verification (in all cases)
Trace and verify reported numbers: (1) recount the reported numbers from available source documents; (2) compare the verified numbers to the site-reported numbers; (3) identify reasons for any differences.

Verification no. 4: Cross-checks (if feasible)
Perform “cross-checks” of the verified report totals with other data sources (e.g., inventory records, laboratory reports, etc.).

Verification no. 5: Spot checks (if feasible)
Perform “spot checks” to verify the actual delivery of services or commodities to the target populations.
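A minimal sketch of the arithmetic behind Verification no. 3 (the field names and the 5% tolerance are illustrative assumptions, not DQA rules):

```python
# Trace-and-verify sketch (Verification no. 3): recount from source documents,
# compare with the site-reported figure, and flag discrepancies for follow-up.

def verify_site(reported: int, source_counts: list[int], tolerance: float = 0.05):
    recounted = sum(source_counts)                     # (1) recount source documents
    vf = recounted / reported if reported else None    # verification factor
    flagged = vf is None or abs(vf - 1.0) > tolerance  # (2)-(3) compare and flag
    return {"reported": reported, "recounted": recounted,
            "verification_factor": vf, "needs_explanation": flagged}

# Example: a site reported 50 clients, but its registers support only 45.
print(verify_site(50, [20, 15, 10]))
# {'reported': 50, 'recounted': 45, 'verification_factor': 0.9, 'needs_explanation': True}
```

Sites flagged this way would then be asked for the reasons behind the difference, as the protocol requires.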
Summary Statistics of Data Verifications

[Chart: Total and Adjusted District Verification Factors from DQA, shown for IAL #1–#4 and the total verification factor; plotted values: 0.95, 1.07, 1.03, 1.22 and 1.07.]

[Chart: % Available Reports from DQA, shown for IAL #1–#4 and the M&E Unit; plotted values: 0.80, 0.50, 0.92, 0.83 and 0.75.]

[Chart: % On Time Reports from DQA, shown for IAL #1–#4 and the M&E Unit; plotted values: 0.80, 0.47, 0.88, 0.73 and 0.75.]

[Chart: % Complete Reports from DQA, shown for IAL #1–#4 and the M&E Unit; plotted values: 0.70, 0.60, 0.72, 0.67 and 0.50.]
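A minimal sketch of how the available, on-time and complete percentages above can be computed from report metadata (the record structure and field names are assumptions for illustration):

```python
# Reporting-performance sketch: share of expected reports that were available,
# on time, and complete at an aggregation level. Field names are assumed.
from dataclasses import dataclass

@dataclass
class Report:
    received: bool   # report was available for review
    on_time: bool    # received by the reporting deadline
    complete: bool   # all required data fields were filled in

def reporting_performance(expected: int, reports: list[Report]) -> dict[str, float]:
    return {
        "% available": sum(r.received for r in reports) / expected,
        "% on time":   sum(r.received and r.on_time for r in reports) / expected,
        "% complete":  sum(r.received and r.complete for r in reports) / expected,
    }

# Example: 12 monthly reports expected from a district; 9 arrived,
# 7 of them on time and 8 of them complete.
reports = [Report(True, i < 7, i < 8) for i in range(9)]
print(reporting_performance(12, reports))
# {'% available': 0.75, '% on time': 0.583..., '% complete': 0.667...}
```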
Outputs of the DQA Implementation
■ Completed protocols and templates related to:
- Protocol 1: System Assessment; and
- Protocol 2: Data Verification.

■ Write-ups of observations, interviews, and conversations with key staff.

■ Preliminary findings and recommendation notes.

■ Final DQA Report, including:
- Summary of the evidence collected by the DQA team;
- Specific findings or gaps related to that evidence;
- Recommendations to improve data-management systems and overall data quality.

The Final DQA Report should also include summary statistics on:
1. Strengths and weaknesses of data-collection and reporting systems;
2. The recounting exercise performed on primary records and aggregated reports;
3. Available, on-time and complete reports.
Use of the DQA since 2008

■ Biggest user is the Global Fund:
- Part of the grant renewal process
- Random and purposive selection of grants
- External audit teams – DQA IQC (1st IQC 2008–2010; 2nd IQC 2011–2012)
- Standardized output
- GF reorganization
Global Fund DQAs 2008–2012

2008 Q4 – HIV/AIDS: Mali, Belarus; Malaria: Philippines
2009 Q1 – HIV/AIDS: Comoros Islands; TB: China, Yemen; Malaria: Rwanda, China, LAC Multi-country (Americas – Andean Region)
2009 Q2 – HIV/AIDS: Haiti, Burundi; TB: Niger; Malaria: Ghana, Indonesia
2009 Q3 – HIV/AIDS: Mozambique, Dominican Republic, Tanzania; Malaria: Gambia, Vietnam, Sri Lanka, Nigeria, Zambia+
2009 Q4 – HIV/AIDS: Peru, Swaziland, South Africa, Thailand, Guyana; TB: Tajikistan; Malaria: Kenya, Cote d’Ivoire, Madagascar
2010 Q1 – HIV/AIDS: Ethiopia, Paraguay, Malawi, C.A.R.; TB: Uzbekistan; Malaria: Uganda, Cameroon
2010 Q2 – HIV/AIDS: Russia, Guinea-Conakry, Eritrea; TB: Pakistan, Paraguay; Malaria: Papua New Guinea, Pakistan, South Sudan, Madagascar
2011 Q2 – HIV/AIDS: Zambia; TB: Kyrgyzstan; Malaria: North Sudan
2011 Q4 – HIV/AIDS: Angola, Cambodia, Namibia, Guyana, India, Myanmar, Ukraine; TB: North Korea; Malaria: Senegal, South Sudan, DRC
GF Audits by Indicator (2008–2010)

Disease – Program Area: Number of Audits
- Malaria – Malaria Treatment: 13
- HIV – Current on ART: 12
- Malaria – LLIN: 10
- HIV – C&T HIV: 9
- HIV – Care and support: 6
- TB – TB Detection: 5
- HIV – PMTCT (pregnant women on prophylaxis): 5
- Malaria – Malaria Training: 4
- TB – TB Treatment: 4
- HIV – HIV Education: 4
- Malaria – IPT: 3
- HIV – Condoms: 3
- Malaria – Malaria Treatment (Community Based): 2
- Malaria – Malaria Testing: 2
- HIV – OVC: 2
- HIV – HIV Training: 1
- TB – MDR TB Treatment: 1
- HIV – TB/HIV: 1
Total: 87
Use of DQA (continued)

■ PEPFAR countries (2009 Country Operational Plan Guidance): Mozambique, Ukraine, DRC, Uganda, Nigeria, Cote d’Ivoire, Dominican Republic
■ Ad hoc use by partners and other agencies: UNICEF in India
■ Adaptation for local use: South Africa (ESI)
New DQA Dashboards

National Level Performance Table:
- Sample Accuracy: 77%
- OSDV Accuracy: 92%
- Availability: 74%
- Timeliness: 52%
- Completeness: 84%
- Cross-Checks: 79%
- Service Site Documentation Review: 65%

Qualitative systems-assessment scores (entered by the assessor and rated from Very Poor to Excellent):
- M&E Structure, Functions and Capabilities: 2.40
- Indicator Definitions and Reporting Guidelines: 2.50
- Data-collection and Reporting Forms / Tools: 2.30
- Data Management Processes: 1.90
- Links with National Reporting System: 2.80

DQA Performance Analysis (values provided automatically). Performance scores (PS) are banded: PS < 60.0; PS 61.0–70.0; PS 71.0–80.0; PS 81.0–90.0; PS 91.0–100.0.

[Chart: Performance Scores, plotted on a 0–100% scale.]

Weighted Final Audit Performance Score: 76.5%
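The deck does not state the weighting scheme behind this 76.5% figure; as a sketch, a weighted performance score of this kind is the weighted average of component scores $s_i$ with weights $w_i$:

$$\text{Weighted PS} = \frac{\sum_i w_i \, s_i}{\sum_i w_i}$$

With equal weights over the seven performance-table entries above (77, 92, 74, 52, 84, 79, 65), the simple average would be about 74.7%, so the published 76.5% implies either unequal weights or a different component set.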
Evolution of RDQA tool

- Need expressed for a tool for programs/projects to use for self-assessment, improved data quality, capacity building and preparation for external audits
- New generic, simplified tool adapted to the country, program and/or project
- Routine use: quantify and track data quality performance over time; identify and respond to data quality problems
Use of the Auditing and Self-Assessment/Routine Versions

Auditing version…
- Assessment by funding agency
- Standardized approach to auditing
- Conducted by an external audit team
- Relatively expensive
- Limited opportunities for capacity building

Capacity Building version…
- Simplified from the auditing version
- Flexible use by program
- Self-assessment
- Focus is on identifying weaknesses for system strengthening
- Should be used routinely, with data quality performance indicators monitored over time
RDQA Modifications

- No implied sampling methodology
- No aggregate sample statistics
- Quantitative and qualitative assessments merged into the same tool
- Dashboards by site and level to facilitate data use
- Easy extraction of data to link with data systems
- Action plans for system strengthening
Objectives of the RDQA

OBJECTIVES:
- VERIFY the quality of reported data for key indicators at selected sites, and the ability of data-management systems to collect, manage and report quality data.
- IDENTIFY weaknesses in the data management system and interventions for system strengthening.
- MONITOR data quality performance over time and capacity to produce good quality data.
Uses of the RDQA

The RDQA is designed to be flexible in use and can serve multiple purposes:
- Routine data quality checks as part of on-going supervision
- Initial and follow-up assessments of data management and reporting systems, to measure performance improvement over time
- Strengthening program staff’s capacity in data management and reporting
- Preparation for a formal data quality audit
- External assessment by partners and other stakeholders
RDQA Outputs

1. Strength of the M&E System: an evaluation based on a review of the Program/project’s data management and reporting system, including responses to overall summary questions on how well the system is designed and implemented.
2. Verification Factors: generated from the trace-and-verify recounting exercise performed on primary records and/or aggregated reports (i.e., the ratio of the recounted value of the indicator to the reported value).
3. Available, On-time and Complete Reports: percentages calculated at the Intermediate Aggregation Levels and the M&E Unit.
4. Action Plan for System Strengthening: for each site and level assessed.
RDQA Dashboards
RDQA Summary Statistics – Level Specific Dashboard
Part 4: DASHBOARD: National Level - M&E Unit
[Chart: Data Management Assessment – National Level M&E Unit. The five functional areas (I. M&E Structure, Functions and Capabilities; II. Indicator Definitions and Reporting Guidelines; III. Data-collection and Reporting Forms and Tools; IV. Data Management Processes; V. Links with National Reporting System) scored on a 0–3 scale.]
[Chart: Data Verifications – National Level M&E Unit: ANC Registered 94%; ANC Counseled and Tested 106%; ANC Received Results 102%; ANC Tested Positive 105%.]
[Chart: Reporting Performance – National Level M&E Unit: % Available 92%; % On Time 41%; % Complete 82%.]
RDQA Summary Statistics – Global Dashboard
[Chart: Data Management Assessment – Overall Average. The five functional areas scored on a 0–3 scale.]
[Chart: Data Verifications – Overall Average by Indicator: ANC Registered 90%; ANC Counseled and Tested 112%; ANC Received Results 112%; ANC Tested Positive 124%.]
[Chart: System Assessment Results by Level of the Reporting System, scored 0–3 for each of the five functional areas; levels include the National M&E Unit – Cotonou.]
[Chart: Data Verifications – Verification Factors by Level of the Reporting System (0–1.4), for the Service Site Average, District Average, Regional Average and National M&E Unit – Cotonou; indicators: ANC Registered, ANC Counseled and Tested, ANC Received Results, ANC Tested Positive.]
[Chart: Reporting Performance – Overall Average: Availability 86%; Timeliness 51%; Completeness 132%.]
[Chart: Reporting Performance by Reporting Level (0–1), for the District Average, Regional Average and National M&E Unit – Cotonou; series: Availability, Timeliness, Completeness.]
RDQA Summary Statistics – Site-specific Qualitative Results
RDQA Output - System Strengthening Action Plan
Data Export Tab – Easily Integrate Results from different workbooks
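As an illustration of the kind of integration the export tab supports, a sketch that pools per-site exports into one table for analysis (the paths, CSV format and column names are hypothetical; the actual RDQA export layout is not described here):

```python
# Sketch: pool RDQA results exported from several site workbooks and
# summarize verification factors by reporting level. Names are assumed.
import glob
import pandas as pd

frames = [pd.read_csv(path) for path in glob.glob("rdqa_exports/*.csv")]
combined = pd.concat(frames, ignore_index=True)

# e.g., average verification factor by reporting level (assumed columns
# "level" and "verification_factor" in the exported files)
print(combined.groupby("level")["verification_factor"].mean())
```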
Growing family of RDQA tools:
- RDQA traditional version: one indicator per workbook, cross-sectional
- RDQA multi-indicator version: up to four indicators for the same program area in cross-section
- RDQA longitudinal version: one indicator for up to four points in time
- RDQA training materials
- PMTCT and TB RDQA
[Chart: Data Verifications – Service Site Summary: trend in verification ratio (percent accuracy) by indicator, Q1–Q4, by health facility: CS Parakou; Maternite de bonne saveur Parakou; CS Immacule Conception de Kalale; Hopital St. Louis de Kalale; Hopital Regionale de Savalou; CS de Zongo – Savalou; Centre Hopitalier de Grand Quelqu'un d'Abomey; CS Behanzin de Abomey.]
Countries where RDQA has been used or is currently being implemented:

Nigeria, Kenya, Tanzania, Cote d’Ivoire, Guyana, Haiti, Mozambique, Zimbabwe, Uganda, Dominican Republic, Peru, Paraguay, South Africa, Lesotho, Swaziland, DRC, India, Botswana.

Also: Global Fund On Site Data Verification (OSDV) by LFAs in many countries.
Now we know the quality of the data. Now what?

■ Standardization: indicators; data collection and reporting tools; data management and indicator compilation practices
■ Capacity building to address identified needs: MEval data quality assurance curriculum
■ Formalization of RDQA into routine practice: monitor data quality performance indicators over time. What is the trend?
Integrating Routine Data Quality Assurance into Standard Operating Procedures
Explicitly state:
- Who will do what?
- When will they do it?
- What is the output?
- Where will it happen?
- What is the required follow-up?
- Who will verify that it is happening?

Formalize in official Program documentation. Incorporate into pre-service/in-service training.

Integration of RDQA into SOPs: Example from Botswana
Key Success Factors for Data Quality
1. Functioning information systems
2. Clear definition of indicators consistently used at all levels
3. Description of roles and responsibilities at all levels
4. Specific reporting timelines
5. Standard/compatible data-collection and reporting forms/tools with clear instructions
6. Documented data review procedures to be performed at all levels
7. Steps for addressing data quality challenges (missing data, double-counting, lost to follow up, …)
8. Storage policy and filing practices that allow retrieval of documents for auditing purposes (leaving an audit trail)
MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) through Cooperative Agreement
GPO-A-00-03-00003-00 and is implemented by the Carolina Population Center at the University of North Carolina in partnership
with Futures Group, ICF International, John Snow, Inc., Management Sciences for Health, and Tulane University.
Visit us online at http://www.cpc.unc.edu/measure.