[INSERT THE NAME OF THE ACTIVITY]
ACTIVITY MONITORING,
EVALUATION, AND LEARNING PLAN
(AMELP)
Cover photo by: John Doe, Rural Institutions Activity (Perú, 2014). [The cover photo and this credit are for demonstration purposes only. Please replace or leave blank.]
All photos are the property of USAID and Partner unless otherwise specified.
GUIDANCE ON THE AMELP
This document gives USAID/Colombia Implementing Partners (IPs) guidance on the content expected for each section of the Activity Monitoring, Evaluation and Learning Plan (AMELP) and indicates relevant references to be consulted during its construction. USAID will attach to this document the AMELP template to be completed according to this guidance. The reference sheets and illustrative tables included in this document will also be provided as attachments.
When using this document as a template, the final content of the AMELP should not include the guidance narratives provided.
While there is no contractually required format for partners to use in constructing the
AMELP, this template can help USAID and its IPs ensure this plan meets the standards of
USAID monitoring, evaluation and learning policies and directives.
[INSERT THE NAME OF
ACTIVITY]
ACTIVITY MONITORING,
EVALUATION, AND LEARNING
PLAN (AMELP)
Draft or Final Version
Date: [January 2019]
This publication was produced and reviewed jointly by the United States Agency for International Development (USAID) and Ipartner Inc., through the XXXXXX USAID Activity, Contract No. AID-xxx-C-yyy-000000.
Prepared by: John Doe (Imp. Partner MEL Specialist), Adriana Adrian (COP) & Milly Moe
(USAID/Col M&E Specialist)
Principal Contacts: Mark Marcus COP / Program Manager
Elizabeth Elli Chief Operations Officer
John Doe Ipartner M&E Specialist
ACRONYMS
ADS Automated Directives System
AMELP Activity Monitoring, Evaluation, and Learning Plan
AOR Agreement Officer's Representative
CDCS Country Development Cooperation Strategy
COP Chief of Party
COR Contracting Officer's Representative
DO Development Objective
DQA Data Quality Assessment
GIS Geographic Information System
GOC Government of Colombia
IM Implementing Mechanism
IPs Implementing Partners
IR Intermediate Result
LF Logical Framework
M&E Monitoring & Evaluation
MIS Management Information System
PAD Project Appraisal Document
PIRS Performance Indicator Reference Sheet
PMP Performance Management Plan
RF Results Framework
SOW Statement of Work
USAID United States Agency for International Development
USG United States Government
TABLE OF CONTENTS
1. OVERVIEW ................................................................................................................ 3
1.1. Introduction ........................................................................................... 3
1.2. Activity Theory of Change and Results Framework ............................... 3
1.3. Relationship to mission CDCS .............................................................. 5
2. MONITORING SECTION.............................................................................................. 5
2.1. Monitoring Approach ............................................................................. 5
2.2. Performance Monitoring ........................................................................ 5
2.3. Context monitoring .............................................................................. 12
2.4. Complementary Monitoring Approaches ............................................. 13
3. EVALUATION PLAN .................................................................................................. 13
4. LEARNING PLAN ...................................................................................................... 14
5. MANAGING DATA SECTION ..................................................................................... 16
6. ROLES AND RESPONSIBILITIES SECTION ....................................................... 18
7. ACTIVITY PIRS AND CIRS .......................................................................................... 19
8. ANNEXES ................................................................................................................ 19
1. OVERVIEW
Guidance: This section is composed of an introduction; the presentation of the theory of
change; the results framework of the Activity; and the relationship to the USAID/Colombia
Country Development Cooperation Strategy (CDCS).
Note A: In USAID, the term "Activity" refers to the entire intervention of an awarded implementing mechanism. Hence, throughout the AMELP, avoid using the terms "project" or "program" when referring to the whole implementing mechanism.
1.1. Introduction
Guidance: In a maximum of two pages, this section should provide clear and precise descriptions of the following topics:
Intended purposes and uses of this plan: The foremost purpose of the AMELP is to serve as a tool for both USAID and the IP to know whether the Activity is making progress toward stated results. This topic also covers how the data and information proposed in the plan will be managed, maintained, and made available for use.
Succinct description of the Activity: This topic refers to the main objectives of the Activity; how it is relevant and contributes to USAID objectives (this may be a general reference to the content presented in sub-section 1.3); the starting and ending dates; the timeline and intended geographic scope (a map is suggested); and a brief description of the intended relationship with the official entities of the Government of Colombia (national, regional, and/or local), as well as with civil society organizations and the private sector.
Succinct description of the context in which the Activity starts operating: This topic refers to the context in which the Activity takes place, composed of factors and conditions outside the direct control of USAID or the IP but directly affecting implementation.
Note B: The AMELP should be revised as needed in response to changes in the Activity or its context during the life of the Activity. Changes to the AMELP must be agreed upon with the AOR/COR of the Activity, who will approve them.
1.2. Activity Theory of Change and Results Framework
Guidance: In a maximum of four pages, this section should provide clear and precise descriptions of the following elements of the Activity:
The Theory of Change (TOC): The TOC is a narrative description of how and why the purpose of the Activity will be achieved in its particular context. It is based on a development hypothesis grounded in development theory, practice, literature, and experience. It explains why and how the proposed investments from USAID and others collectively contribute or lead to the achievement of the Activity's purpose.
The Results Framework (RF): The RF is the predominant logic model used to represent the TOC of the Activity. It shows the results the Activity expects to contribute to, or achieve, during its implementation. It diagrams the development hypotheses, outlining the logic for achieving the objectives over time, and shows the cause-and-effect relationships between the different levels of objectives or intended results of the Activity (purpose, components or intermediate results, etc.). Chart 1 illustrates an example of a simple RF.
Landscape of key assumptions: This landscape sets out, explains, and qualifies relevant critical assumptions and risks linked to the achievement of the Activity's objectives within the frame of the RF. Assumptions and risks lie beyond the IP's or USAID's control. Key assumptions must be logically linked to the different levels of expected objectives. Qualitative PESTLE analysis (Political, Economic, Social, Technological, Legal, and Environmental variables likely to affect the execution of a strategy, with each effect scored by risk or likelihood as high, medium, or low) and "game changing" scenarios are recommended as starting points when constructing the landscape of key assumptions.
The process of developing the theory of change should be participatory, involving broad engagement with local stakeholders and, if possible, with potential beneficiary communities. It also implies a series of dynamic critical-thinking exercises to examine the body of evidence, draw out different viewpoints, and reach consensus on the best possible approach given the available information.
Logical consistency across each level of the RF should be checked from bottom to top. The results at each lower level, together with the key assumptions, should be necessary and sufficient to cause the expected progress at the next higher level. The results included at each level of the Activity should be written in appropriate "results" language (simple assertive past tense). Each results statement should express an outcome (behaviors and conditions of people, systems, and institutions, or high-level achievements of performance such as efficiency) as opposed to describing the actions or processes to achieve it. Results should be clear, specific, and precise, each describing a single, discrete, measurable result.
The RF is not a complete representation of the full Activity. It is a snapshot to be used for purposes of planning, implementation, and communication, supported by the accompanying narratives of the TOC and the development hypotheses on which the Activity is built. In agreement with the AOR/COR of the Activity (and within the framework of the contract/agreement/grant), the Results Framework may be updated during implementation to respond to new evidence or changes in context.
Chart 1: Illustrative Diagram of a Results Framework (rendered here as text)
Purpose: At-Risk Youth Stay in School
- Component A: At-Risk Youth Benefit from Tutoring
  - Student-Friendly Tutoring Curriculum Developed
  - Tutors Trained to Reach At-Risk Youth
  - At-Risk Youth Recruited
- Component B: Attitudes Towards Dropping Out Changed
  - Student-Designed Anti-Drop-Out Campaigns Implemented
  - Intensive Outreach to Leaders of High-Risk Social Groups Completed
- Component C: Drug Use Declines
  - Increased Police Surveillance of Drug Hotspots
  - Anti-Drug-Use Campaign Implemented
1.3. Relationship to mission CDCS
Guidance: In no more than two pages, this section should include:
Succinct narrative: A brief description of how the results of the RF contribute to the USAID/Colombia CDCS. The Activity and its intended results should be easily understood within the context of the strategy and its intended results. Thus, the Activity should be analyzed in the context of the relevant Mission Development Objective (DO), Intermediate Results (IRs), and Sub-IRs (where applicable) of the USAID/Colombia CDCS. The emphasis should be on the linkage between the Activity's objectives and the relevant strategy Intermediate Result(s) into which the Activity is framed.
Graphic representation of the alignment or linkage between the Activity's objectives at all levels and the relevant Intermediate Result(s).
Example of narratives to start off this section:
"The <Activity name> is expected to contribute to USAID's Development Objective #<X>: Inclusive Economic Development Accelerated." (Explain here which CDCS Intermediate Results (IRs) correspond to the objectives of the Activity.) "In particular, it will contribute to IR 1.1, 'Private sector competitiveness increased.' In general, this Activity contributes to the overall USAID/Colombia Strategy (CDCS) by [...] helping to achieve more inclusive development for a sustainable peace."
2. MONITORING SECTION
Monitoring is the ongoing and systematic tracking of data or information relevant to the
Activity. This section should describe and explain how the IP will monitor the performance
and context of the Activity.
2.1. Monitoring Approach
Guidance: This introductory section should explain succinctly what type of data and information will be collected and used to monitor the Activity, and for what purpose. The following is an illustrative text for this introduction:
"Quantitative performance indicators and context indicators will be collected on a quarterly and annual basis. Economic and demographic performance indicators, as well as context indicators, will have a baseline collected during the first two quarters of implementation. A combination of qualitative and quantitative information will also be collected and used regarding beneficiaries' perceived satisfaction. These data will be collected through panel data in annual surveys. Results of the surveys will be presented annually but outside of the Annual Report."
2.2. Performance Monitoring
Performance monitoring is the ongoing and systematic collection of performance indicator
data and other quantitative or qualitative information to reveal whether implementation is on
track and whether expected results are being achieved.
Guidance: This section must include the indicators used for performance monitoring. The data provided by these performance indicators is the cornerstone of the plan: it supports adaptive management and decision-making, and it is the backbone of USAID's accountability structure. It gives the public and external stakeholders information on the progress USAID is making, and it provides Washington with the information needed to inform high-level decision-making.
Performance indicators are metrics that track progress toward the intended objectives of the
Activity. The following are the definitions of the two types of performance indicators used by
USAID:
Output indicators: These measure changes in outputs that are produced as a direct result of inputs. Outputs are the tangible, immediate, and intended products or consequences of an Activity within USAID's control or influence. Examples are the number of people trained, the number of new hectares of crops supported, and the number of pieces of equipment installed as a result of the assistance provided.
Outcome indicators: These measure changes in the behavior and conditions of people, systems, and institutions, or high-level achievements of performance such as efficiency. Outcomes are any result higher than an output to which a given output contributes but for which it is not solely responsible. Outcomes may be intermediate or end outcomes, short-term or long-term, positive or negative, direct or indirect. Outcome indicators show progress, or lack of progress, toward the achievement of high-level objectives. Examples are trust in institutions, the value of sales, and reductions in travel time or costs.
Changes in the output indicators included in the AMELP should be within the control of the IP, while changes in outcome indicators are expected to result from the combination of these outputs and other factors. IPs should be accountable for changes in both the output and outcome performance indicators proposed.
The performance indicators included in the AMELP should be linked to the RF described in the Overview chapter. The main objective (or goal) and the higher-level objectives/results (or components) of the RF should each be aligned with at least one outcome performance indicator used to measure it. Usually, the objectives/results at the bottom of the RF have output indicators associated with them; however, the objectives/results at all levels of the RF can include output indicators to track their progress. Every result in the RF should have at least one performance indicator associated with it (note that a single performance indicator may account for more than one result).
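The following is a purely illustrative sketch (in Python; the result and indicator names are hypothetical, not from an actual Activity) of how a MEL team might check that every result in the RF has at least one associated performance indicator:

# Illustrative consistency check: every RF result should map to at least
# one performance indicator; one indicator may cover several results.
rf_results = [
    "Purpose: At-Risk Youth Stay in School",
    "Component A: At-Risk Youth Benefit from Tutoring",
    "Component B: Attitudes Towards Dropping Out Changed",
]

indicator_to_results = {
    "ACT-P-01": ["Purpose: At-Risk Youth Stay in School"],
    "ACT-A-01": ["Component A: At-Risk Youth Benefit from Tutoring",
                 "Purpose: At-Risk Youth Stay in School"],
}

covered = {r for results in indicator_to_results.values() for r in results}
for result in rf_results:
    if result not in covered:
        print(f"WARNING: no performance indicator associated with: {result}")

Run as written, this would flag Component B, which has no indicator in the hypothetical mapping.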
Besides output and outcome performance indicators, impact indicators can also be included. These indicators describe the ultimate goal or benefit to which the Activity is contributing. Examples are income, wellness, awareness, and economic growth. Changes in them are not directly attributable to the Activity: they can be affected by the Activity's interventions only within a much wider convergence of other agents' interventions, phenomena, and assumptions. USAID and its Implementing Partners should not be held accountable for changes in these indicators.
Note C: The criteria of utility, validity, reliability, timeliness, precision, integrity, cost, and disaggregation must be used when selecting indicators. Please refer to the USAID Monitoring Toolkit for selecting performance indicators and to the USAID performance monitoring indicator criteria checklist.
The entire life of the Activity should be considered when selecting indicators. Some outcome- and impact-level indicators, such as those monitoring higher-level results, do not become relevant until near the end of implementation.
All the performance indicators selected must have baselines. The indicator baseline is the value of an indicator before the major implementation actions of the Activity. Baselines must be collected before the implementation of an intervention; if baseline data cannot be collected until later in the course of a strategy, project, or activity, the indicator's PIRS should document when and how the baseline data will be collected. For some indicators, the baseline value may be zero, when the Activity by itself entirely produces the effects linked to the indicator (trainings, meetings, leveraged funds, etc.). For more detail on this topic, please refer to the USAID guidance on performance indicator baselines.
All the performance indicators selected must have targets. The indicator target is the specific planned level of result to be achieved within a specific timeframe with a given level of resources. Targets should be ambitious but achievable given USAID and IP inputs. For more detail on this topic, please refer to the USAID guidance on performance indicator targets. Each performance indicator is expected to have defined yearly targets that add up to a Life-of-Activity (LOA) target.
Disaggregates are useful sub-categories of performance indicators. Every indicator measuring person-level data must be disaggregated by sex (sex here refers to the biological categories male/female, as distinct from gender). At a minimum, all indicators that could have gender aspects should be sex-disaggregated in their data collection and reporting; in the first instance, this covers anything having to do with the people involved in the activities as beneficiaries, farmers, owners, workers, students, trainees, heads of household, etc. When an indicator is linked to interventions affecting ethnic communities, disaggregation must also be done so that results can be identified for these population groups. Each disaggregate of a performance indicator must have a calculated baseline. All the performance indicators included in the AMELP and reported to USAID/Colombia should be disaggregated by geography (department and municipality). Other disaggregations, such as vereda and geographic coordinates, are encouraged when possible. For more detail on this topic, please refer to the USAID guidance Additional Help: Disaggregating Monitoring Data.
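As a purely illustrative sketch of these rules (Python; the field names and the female/male split are hypothetical assumptions, not a USAID schema), an IP might validate an indicator record as follows:

# Hypothetical indicator record reusing the demo values from Table 2.
indicator = {
    "code": "ACT-P-01",
    "person_level": True,
    "baseline": 321,
    "yearly_targets": {"FY2017": 7_500, "FY2018": 18_750, "FY2019": 26_250,
                       "FY2020": 11_250, "FY2021": 11_250, "FY2022": 0},
    "loa_target": 75_000,
    # Each disaggregate carries its own baseline value.
    "disaggregates": {"sex": {"Female": 180, "Male": 141},
                      "geography": {"Department X / Municipality Y": 321}},
}

# Yearly targets should add up to the Life-of-Activity (LOA) target.
assert sum(indicator["yearly_targets"].values()) == indicator["loa_target"]

# Person-level indicators must be disaggregated by sex.
if indicator["person_level"]:
    assert "sex" in indicator["disaggregates"], "missing sex disaggregation"

# Disaggregate baselines should add up to the indicator baseline.
assert sum(indicator["disaggregates"]["sex"].values()) == indicator["baseline"]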
Foreign Assistance performance indicators (F indicators) are standardized performance indicators used by USAID and the Department of State to collect and analyze data from the countries where the U.S. Government provides development assistance. F indicators should be included in the AMELP and reported as long as the Activity implements interventions that can report on them. USAID/Colombia MEL staff will provide the updated list of these indicators so the IP can select the F indicators that are applicable. An F indicator should be selected over a custom indicator when the two are very similar in nature. If F indicators are used, their assigned ID codes should be displayed in this plan alongside the ID codes assigned by the Activity. Be aware that there are specific F indicators on topics such as gender; in addition, other F indicators that do not appear to be gender-related incorporate sex disaggregates, such as EG.3.2-18, "Number of hectares under improved technologies or management practices with USG assistance," which asks for the number of hectares managed by each respective category.
The AMELP should also include existing Mission-wide USAID/Colombia indicators linked to the current CDCS, as well as other basic cross-cutting and strategic Mission indicators, as long as the Activity implements interventions that can report on them. USAID/Colombia MEL staff will also provide the updated list of these indicators.
Note D: The following are examples of these types of indicators:
Value of mobilized funds (Ym)
Value of leveraged funds (Yl)
Number of Public-Private Partnerships created
Number of organizations (public, private, or CSO) supported or strengthened as a result of the USG assistance
Number of individual direct beneficiaries (or direct households benefited) supported by the intervention as a result of the
USG assistance
Note E: Codes should be used as part of the full name of each indicator to help proper identification in all documents, charts, tracking systems, and databases. The names of two or more indicators may look the same at a glance, so including the program's code and the standardized code as part of the full name is a useful practice to reduce mistakes. Beyond its numbering function, the code should include a short acronym of the Activity as a prefix: for example, PMA for the "Producers to Markets Activity". Next comes a shortened name for the main objective (component) to which the indicator belongs: for example, Ob1 for Objective 1, MI for Market Integration, or CC for cross-cutting. Finally, a sequence number is added: for example, 001 or 012. The following is an illustrative example:
PMA-MI-004 Number of firms receiving USG assistance that have obtained certification with an international quality control institution meeting minimum product standards
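A minimal sketch (Python; the pattern below simply mirrors the convention described in this note and is not an official USAID format) for composing and checking such codes:

import re

def make_code(activity: str, component: str, number: int) -> str:
    """Compose a code such as 'PMA-MI-004': Activity acronym,
    component 'nomen', zero-padded sequence number."""
    return f"{activity}-{component}-{number:03d}"

# Accepts codes like PMA-MI-004, ACT-Ob1-012, or KFV-CC-001.
CODE_PATTERN = re.compile(r"^[A-Z]{2,5}-[A-Za-z0-9]{1,4}-\d{3}$")

code = make_code("PMA", "MI", 4)
assert code == "PMA-MI-004" and CODE_PATTERN.match(code)
print(code)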
The indicators selected should be presented in a table describing their relationship to each of the objectives and sub-objectives proposed in the RF, along with the main characteristics of the performance indicators selected (codes, types, sources, frequencies, etc.). An example is illustrated in Table 1. The disaggregates, baselines, and targets of the performance indicators should also be presented in a table; an example is illustrated in Table 2.
Table 1. Summary Descriptive Table of Performance Indicators
# | Code | Indicator name | Class | Type | Units of Measurement | Disaggregates | Source of Data | Reporting Frequency
Main Goal: Improved competitiveness of rural agricultural producers
1 | ACT-P-01 | Number of households benefiting directly from USG assistance (EG.3-1) | F Standard | Output | Households | Duration: New, Continuing; Location: Rural, Urban/peri-urban | ACT smallholder beneficiaries | Quarterly
2 | ACT-P-02 | Value of incremental net income (Custom) | Custom | Outcome | US$ Million | Commodity | ACT smallholder beneficiaries | Annually
3 | ACT-P-03 | Value of targeted agricultural commodities exported with USG assistance (EG.3.2-23) | F Standard | Outcome | US$ Million | Commodity | ACT supported exporters | Annually
Component 1: Increased Market Integration
4 | ACT-SP1-01 | Value of smallholder incremental sales generated with USG assistance (EG.3.2-19; CDCS) | F Standard | Outcome | US$ Million | Commodity | ACT beneficiaries | Annually
1.1: Increased Profile of Regional Products in Marketplaces
5 | ACT-SP1-02 | Number of firms receiving USG-funded technical assistance to export (EG.2.2-1) | F Standard | Output | Firms | Duration: New, Continuing | ACT supported firms | Annually
1.2: Increased Compliance with Market Standards and Regulations
6 | ACT-SP1-03 | Number of firms receiving USG assistance that have obtained certification with (an) international quality control institution(s) in meeting minimum product standards (EG.2.2-2) | F Standard | Outcome | Firms | None | ACT supported firms | Quarterly
1.3: Expanded Market Integration
7 | ACT-SP1-04 | Number of firms receiving USG-funded technical assistance for improving business performance (EG.5.2-1) | F Standard | Output | Firms | Type of Firm: Formal, Informal; Duration: New, Continuing | ACT supported firms | Quarterly
Table 2: Baseline and Targets Table of Performance Indicators
# | Code | Indicator name | Units | Baseline date | Baseline value | FY2017 | FY2018 | FY2019 | FY2020 | FY2021 | FY2022 | LOA
Main Goal: Improved competitiveness of rural agricultural producers
1 | ACT-P-01 | Number of households benefiting directly from USG assistance (EG.3-1) | Households | 09/2016 | 321 | 7,500 | 18,750 | 26,250 | 11,250 | 11,250 | 0 | 75,000
2 | ACT-P-02 | Value of incremental net income (Custom) | US$ Million | 05/2016 | 0 | TBD | TBD | TBD | TBD | TBD | TBD | TBD
3 | ACT-P-03 | Value of targeted agricultural commodities exported with USG assistance (EG.3.2-23), by commodity:
  - Acai | US$ Million | N/a | 0 | 0.00 | 2.6 | 2.2 | 7.7 | 5.2 | 5.5 | 0.8
  - Tea | US$ Million | N/a | 0 | 0.00 | 5.3 | 6.8 | 0.5 | 7.1 | 7.8 | 2.8
  - Cocoa | US$ Million | N/a | 0 | 0.00 | 0.3 | 6.2 | 2.8 | 4.7 | 3.3 | 4.1
  - Palm heart | US$ Million | N/a | 0 | 0.00 | 2.5 | 0.9 | 8.1 | 0.5 | 9.0 | 9.6
  - Banana | US$ Million | N/a | 0 | 0.00 | 5.0 | 7.8 | 1.2 | 5.7 | 9.1 | 7.6
Component 1: Increased Market Integration
4 | ACT-SP1-01 | Value of smallholder incremental sales generated with USG assistance (EG.3.2-19; CDCS), by commodity:
  - Acai | US$ Million | N/a | N/a | 0.00 | 0.47 | 1.71 | 5.31 | 9.49 | 0.00 | 9.49
  - Tea | US$ Million | N/a | N/a | 0.00 | 0.88 | 3.17 | 9.87 | 17.62 | 0.00 | 17.62
  - Cocoa | US$ Million | N/a | N/a | 0.00 | 0.48 | 1.74 | 5.42 | 9.68 | 0.00 | 9.68
  - Palm heart | US$ Million | N/a | N/a | 0.00 | 0.08 | 0.29 | 0.91 | 1.63 | 0.00 | 1.63
  - Banana | US$ Million | N/a | N/a | 0.00 | TBD | TBD | TBD | TBD | TBD | TBD
1.1: Increased Profile of Regional Products in Marketplaces
5 | ACT-SP1-02 | Number of firms receiving USG-funded technical assistance to export (EG.2.2-1), cumulative | Firms | 08/2016 | 2 | 22 | 85 | 123 | 123 | 160 | 166 | 165
For each of the performance indicators selected in the AMELP, a Performance Indicator Reference Sheet (PIRS) must be constructed. These are charts describing in detail the most relevant structural and methodological features used to collect, validate, and use the data of performance indicators. Because the charts together may occupy considerable space in the plan, all PIRS should be placed in a separate section at the end of the AMELP (see the last section, PIRS and CIRS).
The template and guidance for the PIRS will be provided by USAID as an attachment to this document. Table 3 shows an illustrative example of the PIRS for a particular performance indicator.
Table 3: Example PIRS
PERFORMANCE INDICATOR REFERENCE SHEET (PIRS)
Name of Indicator: CNG-003 Value of smallholder incremental sales generated with USG assistance (EG.3.2-19)
Name of Development Objective: DO3 Improved conditions for Inclusive Rural Economic Growth
Name of Intermediate Result: IR3.2 Increased public and private investment in the rural sector
Name of Sub-Intermediate Result: Sub IR 3.2.2 Increased private sector investment in target rural communities
Indicator Class/Type: Standard (SPSD) / Outcome
DESCRIPTION
Precise Definition(s):
This indicator collects both the volume (in metric tons) and the value (in US dollars) of sales of targeted commodities from smallholder direct beneficiaries for its calculation. This includes all sales by the smallholder direct beneficiaries of the targeted commodities, not just farm-gate sales. Only count sales in the reporting year that are attributable to the Activity investment, i.e. where the program assisted the individual farmer directly. Examples of this assistance include facilitating access to improved seeds, other inputs, extension services, and markets, and other activities that benefited smallholders.
The value of incremental sales measures the value (in USD) of the total amount of targeted agricultural products sold by smallholder direct beneficiaries relative to a base year, and is calculated as the total value of sales of a product (crop, animal, or fish) during the reporting year minus the total value of sales in the base year.
The number of direct beneficiaries of the Activity often increases over time as the activity rolls out. Unless an activity has identified all prospective direct beneficiaries at the time the baseline is established, the baseline sales value will only include sales made by beneficiaries identified when the baseline is established during the first year of implementation. The baseline sales value will not include the "baseline" sales made, prior to their involvement in the Activity, by beneficiaries added in subsequent years. Thus the baseline sales value will underestimate total baseline sales of all beneficiaries, and consequently overestimate incremental sales in reporting years when the beneficiary base has increased. To address this issue, the Activity requires reporting the number of direct beneficiaries for each value chain commodity along with baseline and reporting year sales. The Activity may use the baseline sales and the baseline number of beneficiaries to establish the average sales per beneficiary at baseline. The average sales per beneficiary is then multiplied by the number of beneficiaries in each reporting year to create an adjusted baseline sales value. To accurately estimate out-year targets for incremental sales, targets for the number of beneficiaries are also required.
It is absolutely essential that a Baseline Year Sales data point be entered: the Value of Incremental Sales indicator cannot be calculated without a value for Baseline Year Sales. If data on the total value of sales of the value chain commodity by direct beneficiaries prior to the start of Activity implementation is not available, do not leave the baseline blank or enter '0'. Use the earliest Reporting Year Sales actual as the Baseline Year Sales. This will cause some underestimation of the total value of incremental sales achieved by the Feed the Future activity, but this is preferable to being unable to calculate incremental sales at all.
Unit of Measure: Dollars of the United States of America (USD)
Data Type: Currency (no decimal places)
Disaggregated by:
Geography (custom): department; municipality; hamlet (vereda).
Market (custom): local sales; exports.
Commodity (Standard): Animal products; other.
Justification/Utility for Indicator:
PLAN FOR DATA COLLECTION
Data Source: Implementing partners report this indicator with data collected via direct methods (for example employee
records) or cartographic methods. Methodology for data collection must be explicit in the project objectives.
Method of Data Collection and Construction:
Reporting Frequency: Annually
Individual(s) Responsible at USAID: Activity’s A/COR
Individual(s) Responsible at the Activity:
TARGETS AND BASELINE
Baseline value:
Date of Baseline:
Life of Activity Target (LOA):
Fiscal Year Targets:
FY2019:
FY2020:
FY2021:
FY2022:
Notes on baseline and targets: (Example: LOA target will be determined based on the assessment of the baseline)
DATA QUALITY
Known Data Limitations: sales recorded at the farm gate and those at the marketplace may differ; the method of collection should be as consistent as possible.
Date of Previous Data Quality Assessment (DQA): N/A
Date of Future Data Quality Assessment (DQA): TBD
CHANGES TO INDICATOR
Changes to Indicator:
Other notes:
This PIRS was last updated on October 1, 2017 by Jane Doe (KFV MEL Specialist)
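The adjusted-baseline logic described in this PIRS reduces to: the average sales per beneficiary at baseline, multiplied by the number of beneficiaries in the reporting year, gives the adjusted baseline sales; incremental sales are reporting-year sales minus that adjusted baseline. A purely illustrative sketch (Python; the figures are invented for illustration):

def incremental_sales(baseline_sales: float, baseline_beneficiaries: int,
                      reporting_sales: float, reporting_beneficiaries: int) -> float:
    """Value of incremental sales against a baseline adjusted for growth
    in the beneficiary base, following the PIRS description above."""
    avg_sales_at_baseline = baseline_sales / baseline_beneficiaries
    adjusted_baseline = avg_sales_at_baseline * reporting_beneficiaries
    return reporting_sales - adjusted_baseline

# Hypothetical example: 200 beneficiaries sold USD 100,000 in the baseline
# year; in the reporting year 500 beneficiaries sold USD 400,000.
print(incremental_sales(100_000, 200, 400_000, 500))  # 150000.0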
Note F: The names of F standard indicators, as well as those used or created by USAID/Colombia, must not be changed. The classification under which the F indicators are named is the Standardized Program Structure and Definitions (SPSD). Changes and modifications to standard or official definitions stated in a PIRS must be notified to and agreed upon with USAID; such changes or modifications are possible as long as they do not undermine the quality standards, which would make an official indicator unsuitable for aggregation and reporting beyond the boundaries of the Activity.
2.3. Context monitoring
Context monitoring is the systematic collection of information about conditions and external
factors relevant to the implementation and performance of the Activity. This includes
information about local conditions that may directly affect implementation and performance
or external factors that may indirectly affect implementation and performance.
Guidance: This sub-section should describe the efforts the Activity will undertake for context monitoring. Based on the external factors affecting the Activity's Theory of Change and RF described in the Overview section, context indicators should be selected to track the most relevant assumptions and/or risks identified. The metrics selected may be analyzed in order to form judgments about how actors and conditions outside the control of USAID may affect the Activity; some of them may also reveal unintended consequences of the Activity. At least one of the indicators selected should be climate-related, in order to comply with monitoring for "Climate Risk Management" (see the mandatory reference 201mal). When selecting context indicators, ensure that the collection effort does not surpass the utility of the information.
Context indicators do not have a baseline but rather a reference value recorded on a specific date. Likewise, they do not have a target but an (optional) threshold: a level at which an alert for managerial action is triggered.
The indicators selected should be presented in a table describing their main features (units of measure, sources, frequency, etc.). An example is illustrated in Table 4.
Table 4: Summary of Context Indicators
Code | Indicator name | Units of Measurement | Source of Data | Reporting Frequency | Reference Value | Reference Timeframe | Threshold Value
ACT-Cx-01 | Average monthly potential evaporation | mm/month | IDEAM / Inirida | Quarterly | 120 | Feb/2019 | 244
ACT-Cx-02 | Number of homicides per 100,000 inhabitants | Number | CNP / DIJIN | Annually | 11 | Dec/2018 | 14
ACT-Cx-03 | Value of public investment per capita in social infrastructure at rural municipalities | USD | HRT sub-contractor | Annually | 8,000 | Sept/2017 | 5,000
For each of the context indicators selected in the AMELP, a Context Indicator Reference Sheet (CIRS) must be constructed. Because the charts together may occupy considerable space in the plan, all CIRS should be placed in a separate section at the end of the AMELP (see the last section, PIRS and CIRS). The template and guidance for the CIRS will be provided by USAID as an attachment to this document.
2.4. Complementary Monitoring Approaches
If needed, this plan may look beyond indicators and incorporate other monitoring approaches to complement the quantitative information provided by the metrics. This may especially entail the systematic collection, analysis, and reporting of complementary qualitative data.
In addition, if some of the Activity's interventions unfold in situations where results are difficult to predict, due to dynamic contexts, unclear cause-and-effect relationships, or a complex environment, additional monitoring approaches such as complexity-aware approaches can be used. This also applies when there is substantial uncertainty about expected results or when the causal model cannot be defined. For more on this method promoted by USAID, please refer to ADS 201sad, the Complexity-Aware Monitoring Discussion Note.
3. EVALUATION PLAN
In general, while performance monitoring answers the question of "what" results are achieved by an Activity, evaluation methods seek to provide information about "why or how" those results are achieved. According to USAID guidance and policies, evaluation is the systematic collection and analysis of information about the characteristics of the activities conducted, as a basis for judgments to improve effectiveness, and timed to inform decisions about current and future programming. The purpose of evaluation is twofold: to ensure accountability to stakeholders and to learn in order to improve development outcomes.
USAID categorizes evaluations as impact or performance evaluations depending on the purpose, evaluation questions, and corresponding design. These two main classes of evaluation encompass most of the existing quantitative and semi-quantitative types of evaluation. Impact evaluations measure the change in a development outcome that is attributable to a defined intervention. This type of evaluation is based on models of cause and effect and requires a credible and rigorously defined counterfactual to control for factors other than the intervention that might account for the observed change. Performance evaluations encompass a broad range of evaluation methods; they often incorporate before-after comparisons but generally lack a rigorously defined counterfactual. Performance evaluations may address descriptive, normative, and/or cause-and-effect questions. For more information regarding evaluations at USAID, please refer to the USAID Evaluation Policy.
Note G: The following are the two types of evaluations that can be developed for USAID Activities:
USAID-led evaluations: These are external evaluations commissioned directly by USAID (funded separately from the Activity) at any time during implementation, either to comply with USAID evaluation requirements or when new information arises indicating that an evaluation is appropriate for accountability or learning purposes. USAID may elect to conduct and/or commission a baseline, midterm, final, and/or ex-post evaluation of an Activity, either as an Activity-specific evaluation or as part of a larger portfolio evaluation.
Internal evaluations: These are evaluations conducted or commissioned by an IP concerning its own activity. Funding may be dedicated within an Activity design for implementing partners to engage in an internal evaluation for institutional learning or accountability purposes.
Guidance: This section may include any plans for internal evaluations. It should include
the type of evaluation (performance or impact) to be developed, possible evaluation
questions, estimated budget, planned start date, and estimated completion date.
Although internal evaluations do not count toward USAID evaluation requirements, to the
extent possible, internal evaluations should follow the same USAID processes and
procedures as described in sections ADS 201.3.5.15 through ADS 201.3.5.18. Internal
evaluations are subject to the same evaluation report requirements and quality criteria as
external evaluations as described in ADS 201.3.5.17. For internal performance evaluations
USAID guidance on constructing evaluation questions will be helpful. Please refer to TIPS
for developing good evaluation questions (for performance evaluations).
USAID may initiate an external performance evaluation at any time during implementation. In this case, the Activity will collaborate with USAID and/or the external evaluation team as required. Hence, the Activity MEL Plan should also describe how any planned external USAID-led evaluations will have access to appropriate data collected by the implementer, such as performance monitoring data. It is envisioned that the major collaboration tasks would include, but are not limited to: providing valuable insights on the Activity concept and its achievements and lessons learned; providing inputs/comments on the evaluation scope of work and/or the draft report; providing support to validate the data/information collected by the evaluation team; and sharing ideas on recommendations for modifications of the Activity as appropriate. USAID, through the AOR/COR, will actively engage with implementing partners of activities being evaluated to ensure partners and external evaluators are collaborating productively. AORs/CORs should inform implementing partners as early as possible if their activity will be evaluated and must share the draft and final evaluation design completed by the evaluation team. AORs/CORs will also ensure implementing partners facilitate access to data, documentation, personnel, and key stakeholders as appropriate for the completion of an external evaluation.
Note H: Do not include in this section any planned assessment, informal review, or stocktaking, whether internal (conducted by the Implementing Partner) or external (conducted by a third party contracted for the purpose). If these types of studies are planned, they may be included in the Learning Plan section.
4. LEARNING PLAN
Learning consists of increasing the intentionality and awareness with which development interventions are carried out. The experience and lessons learned by the Activity should be constantly used to adjust and improve the USAID implementation cycle (adaptive management). Although this process might already be incorporated in the management of the Activity, the learning plan should document it and make it more systematic.
Sources for learning include monitoring data, portfolio reviews, research findings, evaluations, analyses conducted by USAID or third parties, knowledge gained from experience, and other sources. These sources should be used to develop plans, implement projects, manage adaptively, and contribute to USAID's knowledge base in order to improve development outcomes.
Guidance: This section must include a learning plan based on the performance and context monitoring and evaluation data collected and analyzed, as well as on ad hoc observations and experience gained in developing the intervention. It should also include any planned survey, study, assessment, informal review, workshop, or stocktaking done for learning purposes. The following are learning actions to consider when developing this section (as suggested by ADS 201):
Meeting with partner(s), sector experts, donors, and the host government to share
monitoring information as well as reviewing activity progress.
Analyzing performance and context monitoring information to inform management and
adapt interventions.
Analyzing performance monitoring indicator data to identify any gender gaps.
Identifying knowledge gaps in the theory of change and RF.
Identifying gaps on technical knowledge and identifying and implementing ways to fill
these gaps.
Planning for and engaging in regular opportunities for partners to reflect on progress,
such as partner meetings, portfolio reviews, and after-action reviews. These
opportunities may focus on challenges and successes in implementation to date,
changes in the operating environment or context that could affect programming,
opportunities to better collaborate or influence other actors, and/or other relevant topics.
Encouraging or requiring partners under a project to collaborate, where relevant. Collaboration activities may include joint work planning, regular partner meetings that facilitate knowledge sharing, and/or working groups organized along geographic or technical lines. These activities require time and resources, and implementing partners should set aside appropriate resources for these initiatives.
Involving sub-implementing partners in the learning activities, such as portfolio reviews
or stocktaking efforts, as appropriate.
Using the knowledge and learning gained from implementation, opportunities to reflect
on performance, monitoring data, evaluations, knowledge about the context, and other
sources to adjust interventions and approaches as needed.
For more USAID guidance on learning, collaboration and adapting please review the USAID
guidance Incorporating CLA in Activity Management.
Note I: This section should take into account the Activity's "Communications Strategy". It may consider how the qualitative information and success stories not covered by the reported indicators contribute to building experience and lessons learned throughout implementation. Such information complements the data from performance and context monitoring. The learning plan may identify mechanisms so that this body of knowledge can be systematically documented, shared, and reflected upon.
5. MANAGING DATA SECTION
Guidance: This section should explain how the Activity intends to manage data at all stages, from collection to reporting. It should fully describe how the data and information will be collected, analyzed, and used, based on data quality standards. This section should include:
1) Data collection methods
2) Data quality procedures
3) Data storage procedures
4) The formats in which data will be held and shared
5) Data security protocols
6) Data protocols for analysis and use
If several organizations are jointly managing the Activity, this section of the AMELP should
touch on how data will be consistently handled across the partners to ensure the quality of
the aggregated data.
USAID/Colombia uses MONITOR, an online management information system, to track Mission-funded Activities. USAID IPs should provide, report, and update performance information in MONITOR (including, but not limited to, indicator data and performance results). This section should describe all the procedures the IP will use to prepare the data reported into MONITOR.
The AMELP is not itself a reporting document. However, the plan may establish tools and methods to report data in the Activity's Quarterly Performance Reports and Annual Performance Reports. The Performance Tracking Table (PTT) below (Table 5) may be included in those periodic reports to present updated performance data.
Note J: The data contained in the PTT in quarterly reports should be the same as that displayed in the USAID/Colombia web-based tracking platform "MONITOR".
This section must also include a preliminary identification of the open datasets to be collected and submitted to USAID's open data portal (www.usaid.gov/data) according to ADS 579 (USAID will provide a suggested template to document the metadata, a data dictionary, to be attached to each dataset). If not all the datasets can be identified by the time of AMELP approval, the section must include provisions on future collection and submission in compliance with the references of ADS 579, such as the incorporation of metadata and standards for geographic data (consult mandatory references 579mab and 579saa to ADS 579 regarding geo-location and how to report it). Datasets submitted to the DDL (the USAID Development Data Library, a repository of open datasets from development interventions) must also be accompanied by supporting documentation defining the fields within the dataset and any categories or labels within the dataset that may require explanation for an individual not familiar with the data. This may be accomplished through the submission of a codebook or data dictionary. When available, scopes and methodologies, such as the survey protocols and instruments used to collect and analyze the data, must also be submitted to the DDL, along with annotations informing the general public of any known data quality issues. Datasets must be accompanied by metadata as required by Project Open Data and other metadata assigned by USAID. This may be accomplished by following the instructions at www.usaid.gov/data.
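The following is a purely illustrative sketch (Python; the dataset name, fields, and format are hypothetical assumptions, not a prescribed USAID or DDL schema) of the kind of codebook/data dictionary that might accompany a submitted dataset:

import json

# Hypothetical data dictionary accompanying a dataset submitted to the DDL.
data_dictionary = {
    "dataset": "smallholder_sales_fy2019.csv",
    "fields": [
        {"name": "beneficiary_id", "type": "string",
         "description": "Anonymized unique ID of the smallholder"},
        {"name": "department", "type": "string",
         "description": "Department where the sale was recorded"},
        {"name": "commodity", "type": "string",
         "description": "Targeted value-chain commodity (e.g., cocoa)"},
        {"name": "sales_usd", "type": "number",
         "description": "Value of sales in USD during the reporting year"},
    ],
    "known_quality_issues": "Farm-gate and marketplace sales may differ.",
}

print(json.dumps(data_dictionary, indent=2))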
This section should also mention that training-related information should be reported in
TraiNet.
Table 5: Tracking Table for Reporting Performance Indicators (PTT)
Code - Indicator Name (Abbr.) | FY18 Actual | FY18 Target | FY19 Q1 | FY19 Q2 | FY19 Q3 | FY19 Q4 | FY19 Actual | FY19 Target | FY19 Prog. % | Total Prog. | LOA Target | Total Prog. % | Observations
Act-003 # people benefitted by social programs (EG.3.5-11) | 7,350 | 7,000 | 405 | 68 | 682 | - | 1,115 | 1,000 | 111.5% | 120,966 | 45,000 | 268.80% | Spectacular achievements through the 3rd quarter were due to the unexpected willingness of communities.
Act-009 # CSO members supported (Custom) | 1,464 | 900 | 280 | 193 | - | - | 473 | 800 | 59.10% | 9,198 | 3,000 | 306.60% | A boom of CSOs has erupted out of the renewed Bajo Cauca region.
Note K: Do not enter report data in this table within the AMELP. The table is to be used in performance reports and is included in the AMELP for reference. Any change to the PTT template here in the plan should be reflected in the performance reports.
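For reference, the percentage columns of the PTT are simple ratios: FY Prog. % is FY Actual divided by FY Target, and Total Prog. % is Total Progress divided by the LOA Target. A purely illustrative sketch (Python, using the demo values from Table 5):

def progress_pct(actual: float, target: float) -> str:
    """Progress as a percentage of target, formatted for the PTT."""
    return f"{actual / target:.2%}"

print(progress_pct(1_115, 1_000))   # FY19 progress for Act-003 -> 111.50%
print(progress_pct(9_198, 3_000))   # total progress for Act-009 -> 306.60%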
6. ROLES AND RESPONSIBILITIES SECTION
Guidance: This section should provide clear and precise descriptions of the Activity's MEL structures, functions, and capabilities. It should mention which staff/position(s) will have responsibility for MEL at the different implementation levels, and their competencies. It should incorporate a plan for strengthening staff MEL capacity and for collaboration/partnership with MEL organization(s) to strengthen the MEL system or fill gaps, etc.
Example of the Activity’s Monitoring, Evaluation and Learning (MEL) structures, functions,
and capabilities narratives:
Implementing the AMELP
At the start of the project, the MEL Specialist will be the person responsible for implementing the AMELP. He/she will also be responsible for building the capacity of all program staff and for reporting on and supervising general monitoring and evaluation approaches, practices, and tools. The MEL Specialist also cooperates with the project at the technical level, ensuring that indicators and information are reported as needed by USAID and the COR/AOR.
Information Management
With reference to ADS 203.3.3.1 g), the XX Activity supports the Mission's efforts to maintain a performance monitoring information system that holds performance indicator data, including the data collected under this AMELP.
Strong suggestion: Add here a diagram showing the flow of information/data from collection in the field or from third-party sources through to the performance reports, including other management tasks such as internal data quality routines, storage, etc.
Reporting
The MEL Specialist is in charge of producing the MEL reports on time, and in a technically valid,
high quality, and policy-relevant manner, with the purpose of providing firm grounds for
management decisions. He/she is responsible for developing the protocols and standard
procedures to ensure that data is gathered in a technically sound manner, is consistent and can
be compared throughout the years. He/she must make judgments with respect to whether or not
data meets quality standards.
MEL Oversight
The Chief of Party (COP) will have responsibility for overseeing MEL, assuring that the work of
the MEL Specialist(s) meets overall project needs and responds to Mission requests for
information. Missions in high-visibility locations such as Colombia have frequent “data calls” and
information requests, so assuring that the Activity’s responses are policy and decision relevant is
an important role.
Home Office Support
The XX Activity receives technical support from the home office (HO) MEL staff. [Name of IM awardee] has assigned a long-term M&E expert to the project. His/her most essential responsibilities will be to ensure that high standards are maintained and that activities are consistent with best practices in the field. From project start-up, the HO provides specialized assistance in finalizing the AMELP and offers specialized training to the MEL team when needed.
This section should also provide a Performance Reporting Schedule (a Gantt chart is recommended) indicating the planned tasks, frequency, timeline, responsible persons, etc., for performance monitoring. Discuss the relevant database systems for MEL, including internal databases, USAID's TraiNet, etc. The section may also mention that the Activity will cooperate with non-scheduled requests for specific MEL data updates.
Sample Performance Reporting Schedule narrative:
The Activity will produce monthly and annual reports. Every third month, the Activity will collate and update the performance data and review it with the COR/AOR. This will provide significant input to the quarterly update of the work plan schedule, as activities are planned for the following quarter.
The Activity will provide quarterly Performance Summaries for the Mission. At the end of the fiscal year, the Activity will submit an annual performance report including a compilation of the year's actual indicator results versus their corresponding targets, as well as explanatory narratives. All reports are presented in draft to the COR/AOR before final submission.
Note L: IPs should allocate sufficient financial and human resources to implement everything proposed in the AMELP.
7. ACTIVITY PIRS AND CIRS
Guidance: This section should include the PIRS and CIRS of all the indicators selected
for this plan.
8. ANNEXES
Guidance: Annexes may include detailed descriptions of topics and elements of the AMELP, such as the data collection instruments for surveys, key informant interviews, or focus groups used during the implementation of the AMELP. They may also include methodological approaches and descriptions of data sources.