TAB 6


REFERENCE MATERIALS


1. USAID Assessing and Learning, Automated Directives System (ADS) 203, http://www.usaid.gov/policy/ads/200/203.pdf

2. USAID Checklist for Assessing USAID Evaluation Reports, http://www.usaid.gov/policy/evalweb/ChecklistforAssessingUSAIDEvaluationReportsn.docx

3. USAID Checklist for Reviewing Scopes of Work for Performance Evaluations, http://www.usaid.gov/policy/evalweb/EvaluationSOWChecklist.doc

4. USAID “F” Checklist for Reviewing Foreign Assistance Evaluations, Annex 2 of A Meta Evaluation of Foreign Assistance Evaluations, Krishna Kumar and John Eriksson, Office of the Director of U.S. Foreign Assistance, Washington, DC, June 1, 2011, http://pdf.usaid.gov/pdf_docs/PCAAC273.pdf

5. USAID Evaluation Policy, Evaluation: Learning from Experience, January 2011, http://www.usaid.gov/evaluation/USAIDEvaluationPolicy.pdf

6. USAID Glossary of Evaluation Terms, March 2009, http://pdf.usaid.gov/pdf_docs/PNADO820.pdf

7. USAID Handbook of Democracy and Governance Program Indicators, Center for Democracy and Governance, August 1998, http://www.usaid.gov/our_work/democracy_and_governance/publications/pdfs/pnacr211.pdf

8. USAID Introduction to Programming Policy, ADS 200, http://www.usaid.gov/policy/ads/200/200.pdf

9. USAID Performance Management Toolkit: A Guide to Developing and Implementing Performance Management Plans, IBM Business Consulting Services.

10. http://www.usaid.gov/policy/ads/200/200sbn.pdf. This includes the original Performance Indicator Reference Sheet (PIRS), Worksheet 6, for USAID Performance Management Plans.

11. USAID Planning, ADS 201, http://www.usaid.gov/policy/ads/200/201.pdf

12. USAID Project Design Guidance, December 9, 2011, http://www.usaid.gov/our_work/policy_planning_and_learning/documents/PD_Guidance_Final.pdf


13. USAID Monitoring and Evaluation TIPS provide practical advice and suggestions to USAID managers and partners on issues related to performance monitoring and evaluation. These publications are supplemental references to the Automated Directives System (ADS), http://www.usaid.gov/performance/program-performance/TIPS/:

TIPS 1: Conducting a Participatory Evaluation. Participatory evaluation provides for active involvement in the evaluation process of those with a stake in the program: providers, partners, customers (beneficiaries), and any other interested parties. Participation typically takes place throughout all phases of the evaluation: planning and design; gathering and analyzing the data; identifying the evaluation findings, conclusions, and recommendations; disseminating results; and preparing an action plan to improve program performance.

TIPS 2: Conducting Key Informant Interviews. These are qualitative, in-depth interviews of 15 to 35 people selected for their first-hand knowledge about a topic of interest. The interviews are loosely structured, relying on a list of issues to be discussed. Key informant interviews resemble a conversation among acquaintances, allowing a free flow of ideas and information. Interviewers frame questions spontaneously, probe for information, and take notes, which are elaborated on later.


TIPS 3: Preparing an Evaluation Scope of Work. The statement of work (SOW) is viewed as the single most critical document in the development of a good evaluation. The SOW states (1) the purpose of the evaluation, (2) the questions that must be answered, (3) the expected quality of the evaluation results, (4) the expertise needed to do the job, and (5) the time frame and budget available to support the task.


TIPS 4: Using Direct Observation Techniques. Most evaluation teams conduct some fieldwork, observing what's actually going on at assistance activity sites. Often this is done informally, without much thought to the quality of data collection. Direct observation techniques allow for a more systematic, structured process, using well-designed observation record forms.


TIPS 5: Using Rapid Appraisal Methods. Rapid Appraisal (RA) is an approach that draws on multiple evaluation methods and techniques to quickly, yet systematically, collect data when time in the field is limited. RA practices are also useful when there are budget constraints or limited availability of reliable secondary data. For example, time and budget limitations may preclude the option of using representative sample surveys.


TIPS 6: Selecting Performance Indicators. Performance indicators define a measure of change for the results identified in a Results Framework (RF). When well-chosen, they convey whether key objectives are achieved in a meaningful way for performance management. While a result (such as an Assistance Objective or an Intermediate Result) identifies what we hope to accomplish, indicators tell us by what standard that result will be measured. Targets define whether there will be an expected increase or decrease, and by what magnitude.


TIPS 7: Preparing a PMP. This TIPS provides the reader with an overview of the purpose and content of a Performance Management Plan (PMP). It reviews key concepts for effective performance management and outlines practical steps for developing a PMP.


TIPS 8: Baselines and Targets. The achievement of planned results is at the heart of USAID's performance management system. In order to understand where we, as project managers, are going, we need to understand where we have been. Establishing quality baselines and setting ambitious, yet achievable, targets are essential for the successful management of foreign assistance programs.


TIPS 9: Conducting Customer Service Assessments. A customer service assessment is a management tool for understanding USAID's programs from the customer's perspective. Most often these assessments seek feedback from customers about a program's service delivery performance. The Agency seeks views from both ultimate customers (the end-users, or beneficiaries, of USAID activities, usually disadvantaged groups) and intermediate customers (persons or organizations using USAID resources, services, or products to serve the needs of the ultimate customers).


TIPS 10: Conducting Focus Group Interviews. A focus group interview is an inexpensive, rapid appraisal technique that can provide managers with a wealth of qualitative information on the performance of development activities, services, and products, or other issues. A facilitator guides 7 to 11 people in a discussion of their experiences, feelings, and preferences about a topic. The facilitator raises issues identified in a discussion guide and uses probing techniques to solicit views, ideas, and other information.


TIPS 11: Introduction to Evaluations at USAID. This TIPS provides the reader with a general introduction to the purpose, role, and function of evaluation in the USAID program and project design and implementation cycle. It provides background on why evaluation has become an important part of the effort to improve the effectiveness of foreign assistance programming. It also provides links to other TIPS with more detailed guidance on when and why to evaluate, how to evaluate, uses of evaluation data, how to address common problems, and how to structure the evaluation's findings, conclusions, and recommendations.


TIPS 12: Data Quality Standards. Data quality is one element of a larger interrelated performance management system. Data quality flows from a well-designed and logical strategic plan where Assistance Objectives (AOs) and Intermediate Results (IRs) are clearly identified. If a result is poorly defined, it is difficult to identify quality indicators, and without quality indicators the resulting data will often have data quality problems.


TIPS 13: Building a Results Framework. The Results Framework (RF) is a graphic representation of a strategy to achieve a specific objective that is grounded in cause-and-effect logic. The RF includes the Assistance Objective (AO) and the Intermediate Results (IRs), whether funded by USAID or partners, necessary to achieve the objective (see Figure 1 of that TIPS for an example). The RF also includes the critical assumptions that must hold true for the strategy to remain valid.


TIPS 14: Monitoring the Policy Reform Process. The discussion and examples in this paper are organized around the issues and challenges that USAID's development professionals and their clients/partners face when designing and implementing systems to monitor the policy reform process.


TIPS 15: Measuring Institutional Capacity. This TIPS gives USAID managers information on measuring institutional capacity, including some tools that measure the capacity of an entire organization as well as others that look at individual components or functions of an organization. The discussion concentrates on the internal capacities of individual organizations, rather than on the entire institutional context in which organizations function. This TIPS is not about how to actually strengthen an institution, nor is it about how to assess the eventual impact of an organization's work. Rather, it is limited to a specific topic: how to measure an institution's capacities.


TIPS 16: Mixed Methods. This TIPS provides guidance on using a mixed-methods approach for evaluation research. Frequently, evaluation statements of work specify that a mix of methods be used to answer evaluation questions. This TIPS includes the rationale for using a mixed-method evaluation design, guidance for selecting among methods (with an example from an evaluation of a training program), and examples of techniques for analyzing data collected with several different methods (including “parallel analysis”).


TIPS 17: Constructing an Evaluation Report. This TIPS has three purposes. First, it provides guidance for evaluators on the structure, content, and style of evaluation reports. Second, it offers USAID officials who commission evaluations ideas on how to define the main deliverable. Third, it provides USAID officials with guidance on reviewing and approving evaluation reports.


TIPS 18: Conducting Data Quality Assessments. Data quality assessments (DQAs) help managers to understand how confident they should be in the data used to manage a program and report on its success. USAID's ADS notes that the purpose of the Data Quality Assessment is to “…ensure that the USAID Mission/Office and Assistance Objective (AO) Team are aware of the strengths and weaknesses of the data, as determined by applying the five data quality standards…and are aware of the extent to which the data integrity can be trusted to influence management decisions.”

TIPS 19: Impact Evaluations. Rigorous impact evaluations are useful for determining the effects of USAID programs on outcomes. This type of evaluation allows managers to test development hypotheses by comparing changes in one or more specific outcomes to the changes that would occur in the absence of the program; evaluators term the latter the counterfactual. Rigorous impact evaluations typically use comparison groups, composed of individuals or communities that do not participate in the program. The comparison group is examined in relation to the treatment group to determine the effects of the USAID program or project (a minimal worked example appears at the end of this TIPS list).


Un-numbered TIPS: Constructing an Evaluation Report (Management Systems International (MSI), April 2006). This TIPS provides a general guide to the preparation of an evaluation report. For evaluators, the guide sets out an annotated outline of a generic report, identifying the order of presentation and the types of information most USAID readers expect to find in each section. For general USAID readers, the guide will be helpful in developing an evaluation statement of work.
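
To make the counterfactual logic of TIPS 19 concrete, the short Python sketch below computes a simple difference-in-differences estimate from baseline and endline means. It is a minimal illustration only; the outcome, groups, and numbers are hypothetical and are not drawn from any of the publications listed here.

    # Minimal difference-in-differences sketch for an impact evaluation.
    # All figures are hypothetical; in practice the means would come from
    # baseline and endline surveys of the treatment and comparison groups.

    def diff_in_diff(treat_before, treat_after, comp_before, comp_after):
        """Change in the treatment group minus change in the comparison group.

        The comparison group's change over time stands in for the
        counterfactual: what would have happened without the program.
        """
        treatment_change = treat_after - treat_before
        counterfactual_change = comp_after - comp_before
        return treatment_change - counterfactual_change

    # Hypothetical outcome: average household income (USD per month).
    effect = diff_in_diff(treat_before=80.0, treat_after=110.0,
                          comp_before=82.0, comp_after=95.0)
    print(f"Estimated program effect: {effect:+.1f} USD/month")  # prints +17.0

The same arithmetic underlies far more elaborate designs; rigorous impact evaluations add random or matched assignment so that the comparison group is a credible stand-in for the counterfactual.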


= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

14. A Meta Evaluation of Foreign Assistance Evaluations, Krishna Kumar and John Eriksson, Office of the Director of U.S. Foreign Assistance, Washington, DC, June 1, 2011, http://pdf.usaid.gov/pdf_docs/PCAAC273.pdf


15. Baseline Studies, AusGuideline: Activity Implementation, Australian Aid (AusAID), October 2005, http://www.ausaid.gov.au/ausguide/pdf/ausguideline4.2.pdf

16. “Beyond Success Stories: Monitoring and Evaluation for Foreign Assistance Results”, Richard Blue, Cynthia Clapp-Wincek, and Holly Benner, May 2009, http://mande.co.uk/blog/wp-content/uploads/2009/06/beyond-success-stories-full-report.pdf

17. DAC Criteria for Evaluating Development Assistance, Development Assistance Committee, Organization for Economic Cooperation and Development (DAC/OECD), http://www.oecd.org/document/22/0,2340,en_2649_34435_2086550_1_1_1_1,00.html

18. Data Analysis, author unknown, http://www.nationalserviceresources.org/filemanager/download/Evaluation/users_guide/dataanal.pdf

19. Data Collection Methods (Module 8), International Program for Development Evaluation Training (IPDET), World Bank, PowerPoint presentation, http://www.worldbank.org/oed/ipdet/presentation/M_08-Pr.pdf

20. Data Systems Analysis Worksheet Template, http://www.nigeriamems.com/resources/m_tools/DSA-Worksheet-Template.doc

21. Descriptive, Normative, and Impact Evaluation Designs, World Bank and International Program for Development Evaluation Training, http://www.worldbank.org/oed/ipdet/presentation/M_06-Pr.pdf

22. Developing Data Collection Instruments, World Bank, http://siteresources.worldbank.org/NUTRITION/Resources/Tool8-chap8.pdf

23. Evaluating Development Co-Operation: Summary of Key Norms and Standards (Second Edition), Organization for Economic Cooperation and Development (OECD), Development Assistance Committee (DAC), Network on Development Evaluation, http://www.oecd.org/dataoecd/12/56/41612905.pdf

24. Evaluation Glossary, Office of the Director of U.S. Foreign Assistance, March 25, 2009, https://communities.usaidallnet.gov/fa/system/files/FA+Evaluation+Glossary_March+25_09.pdf

25. Evaluation Guidelines for Foreign Assistance, Office of the Director of U.S. Foreign Assistance, March 25, 2009, https://communities.usaidallnet.gov/fa/system/files/FA+Evaluation+Guidelines_March+25_09.pdf

26. Evaluation of Recent USAID Evaluation Experience, Cynthia Clapp-Wincek and Richard Blue, USAID, June 2001, https://communities.usaidallnet.gov/fa/system/files/SUMMAR~1.PDF

27. Evaluation Plan Workbook, Innovation Network, http://www.innonet.org/client_docs/File/evaluation_plan_workbook.pdf

28. Evaluation Standards, Office of the Director of U.S. Foreign Assistance, March 25, 2009, https://communities.usaidallnet.gov/fa/system/files/FA+Evaluation+Standards_March+25_09.pdf

29. Glossary of Key Terms in Evaluation and Results Based Management, Organization for Economic Cooperation and Development (OECD), Paris, 2002, http://www.oecd.org/dataoecd/29/21/2754804.pdf

30. Guidelines for Financial Analysis of Activities, USAID Financial Analysis ADS Supplemental Reference 2026s5 (includes guidelines for cost-effectiveness analysis), http://www.usaid.gov/policy/ads/200/2026s5.pdf

31. How to Perform Evaluations, Canadian International Development Agency (CIDA), http://www.acdi-cida.gc.ca/acdi-cida/acdi-cida.nsf/eng/EMA-218131657-PG4 (10 short guides)

32. Impact Evaluations and Development: NONIE Guidance on Impact Evaluation (Network of Networks for Impact Evaluation), Frans Leeuw and Jos Vaessen, World Bank, http://siteresources.worldbank.org/EXTOED/Resources/nonie_guidance.pdf

33. Impact Evaluation in the Absence of Baseline Surveys, Fabrizio Felloni, Office of Evaluation, International Fund for Agricultural Development (IFAD), PowerPoint presentation, November 2006, http://www.google.com/url?q=http://www.oecd.org/dataoecd/41/20/37690023.ppt&sa=U&ei=vQlGT4PqFuPSiALz1KnaDQ&ved=0CAYQFjAB&client=internal-uds-cse&usg=AFQjCNERC1U0a8HmCMzW9_ySErUfVqYiRg


34. Introduction to Sampling, Mari Jack, PowerPoint presentation (searchable on Google)
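
For readers approaching the sampling concepts in item 34 for the first time, the Python sketch below applies the standard textbook sample-size formula for estimating a proportion, n0 = z² · p · (1 − p) / e², with a finite population correction. This is generic statistics, not material taken from the cited presentation, and the population size and margin of error in the example are made up.

    import math

    def sample_size(margin_of_error, confidence_z=1.96, p=0.5, population=None):
        """Sample size needed to estimate a proportion.

        Uses n0 = z^2 * p * (1 - p) / e^2, with p = 0.5 as the most
        conservative choice, then applies the finite population
        correction n = n0 / (1 + (n0 - 1) / N) when N is supplied.
        """
        n0 = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
        if population is not None:
            n0 = n0 / (1 + (n0 - 1) / population)
        return math.ceil(n0)

    # Hypothetical survey: 95% confidence, +/-5 points, 10,000 households.
    print(sample_size(0.05, population=10_000))  # 370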



35. Introduction to Lot Quality Assurance Sampling: Basic Principles, Lecture #1, Monitoring & Evaluation Working Group, The CORE Group, February 14, 2006, http://gametlibrary.worldbank.org/FILES/1339_LQAS_Lecture_1.pdf
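
To illustrate the basic LQAS principle covered in item 35, the sketch below applies the small-sample decision rule: judge each supervision area by counting how many of the sampled respondents show the desired outcome and comparing that count to a threshold. The pairing of a lot size of 19 with a decision rule of 13 for an 80% coverage target follows commonly cited LQAS tables, but the areas and counts here are invented for illustration and do not come from the lecture itself.

    # Lot Quality Assurance Sampling (LQAS): minimal decision-rule sketch.
    # A supervision area "passes" a coverage benchmark when at least
    # `decision_rule` of the sampled respondents show the desired outcome.

    def lqas_pass(successes: int, decision_rule: int) -> bool:
        """True if the sampled lot meets or exceeds the decision rule."""
        return successes >= decision_rule

    # Hypothetical data: of 19 caregivers sampled per area, how many had
    # a fully vaccinated child. A rule of 13 corresponds, in standard
    # LQAS tables, to testing against an 80% coverage target.
    areas = {"Area A": 15, "Area B": 11, "Area C": 13}
    for area, vaccinated in areas.items():
        verdict = "meets target" if lqas_pass(vaccinated, 13) else "below target"
        print(f"{area}: {vaccinated}/19 -> {verdict}")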


36. Logical Framework: A Manager's Guide to a Scientific Approach to Design and Evaluation, Practical Concepts Incorporated (PCI), November 1979, http://pdf.usaid.gov/pdf_docs/PNABN963.pdf

37. Logical Framework Project Example: Refocusing My Career, Haines Centre for Strategic Management, undated, http://www.managementpro.com/styles/pdfs/LFX-Refocusing-My-Career.pdf

38. Methods of Data Analysis in Qualitative Research, compiled by Donald Ratcliff, http://qualitativeresearch.ratcliffs.net/15methods.pdf

39. Participatory Program Evaluation Manual: Involving Program Stakeholders in the Evaluation Process, Judi Aubel, Child Survival Technical Support Project and Catholic Relief Services, http://www.coregroup.org/storage/Monitoring__Evaluation/PartEvalManualEnglish.pdf

40. Policy Brief: Monitoring & Evaluation for Results: The Role of M&E in U.S. Foreign Assistance Reform, Richard Blue, Cynthia Clapp-Wincek, and Holly Benner, May 2009, http://mande.co.uk/blog/wp-content/uploads/2009/06/policy_brief_-_me_for_foreign_assistance_results-1.pdf

41. Preparing for the Evaluation: Guidelines and Tools for Pre-Evaluation Planning, Della E. McMillan and Alice Willard, February 2006, http://www.redcross.org/www-files/Documents/International%20Services/file_cont5975_lang0_2295.pdf. See also http://www.crsprogramquality.org/storage/pubs/me/MEshortcut_preevaluation.pdf (Short Cuts)

42. Program Evaluation, What Is It?, author unknown, PowerPoint presentation, http://www.powershow.com/view/24cc79-OWQ4O/Program_Evaluation_What_is_it_flash_ppt_presentation

43. Ten Steps to a Results-Based Monitoring and Evaluation System: A Handbook for Development Practitioners, Jody Zall Kusek and Ray C. Rist, World Bank, 2004, http://www.oecd.org/dataoecd/23/27/35281194.pdf

44. The 2010 User-Friendly Handbook for Project Evaluation, National Science Foundation, December 2010, http://www.westat.com/pdf/projects/2010ufhb.pdf

45. The Road to Results: Designing and Conducting Effective Development Evaluations, Linda Morra Imas and Ray Rist, The World Bank, 2009, http://issuu.com/world.bank.publications/docs/9780821378915; and video clip, http://www.youtube.com/watch?v=V84hHZn12qE

46. Book review of The Road to Results: Designing and Conducting Effective Development Evaluations, International Development Evaluation Association (IDEAS), www.ideas-int.org/documents/document.cfm?docID=392

47. The Programme Manager's Planning, Monitoring and Evaluation Toolkit, United Nations Population Fund (UNFPA), http://www.unfpa.org/monitoring/toolkit.htm

48. User-Friendly Handbook for Mixed Method Evaluations, National Science Foundation, August 1997, http://www.nsf.gov/pubs/1997/nsf97153/start.htm

49. Who Are the Question Makers? A Participatory Evaluation Handbook, Jennie Campos and Francoise Coupal, UNDP, 1997, http://www.undp.org/evaluation/documents/who.htm