• Bangkok Christian Guest House
  • Sala Daeng 2 Alley
• Bangkok 10500
  • Thailand

This three-day training event provided a practical introduction to monitoring and evaluation (M&E), with a particular focus on language and education programmes. As an introduction, it was aimed primarily at those with little or no M&E experience and outlined basic tools and techniques that participants could then test further in their own contexts.


EVENT SCHEDULE

Wednesday 25th March:
08:45-09:00 - Sign in and introductions.
09:00-09:30 - Mapping out the training. Where are we going?
09:30-10:30 - Why do M&E? What is it for?
10:30-11:00 - Break
11:00-12:30 - How does Monitoring and Evaluation fit with my program?
12:30-14:00 - Lunch
14:00-15:30 - Principles of good M&E frameworks.
15:30-16:00 - Break
16:00-17:30 - Working with complexity.
17:30-17:45 - Daily debrief

20:00-21:30 - Meet the LEAD Asia team (TBC)

Thursday 26th March:
09:00-10:30 - Introduction to tools and quantitative indicators.
10:30-11:00 - Break
11:00-12:30 - Outcome mapping.
12:30-14:00 - Lunch
14:00-15:30 - Using participatory methods in M&E.
15:30-16:00 - Break
16:00-17:30 - Most significant change.
17:30-17:45 - Daily debrief

Friday 27th March:
09:00-10:30 - Introduction to evaluation.
10:30-11:00 - Break
11:00-12:30 - Practical considerations - designing an M&E system.
12:30-14:00 - Lunch
14:00-15:00 - Review your own current M&E system.
15:00-15:45 - Next steps, key resources.
15:45-16:15 - Closing tea


EVENT NARRATIVE

Day 1: Where are we going?

Follow us in Bangkok as we learn about core principles in Monitoring & Evaluation.

  1. Phil kicks us off on our learning around monitoring and evaluation. Where are we going? #LEADCoP http://t.co/Rq0DmTylPY
  2. Phil Smith, LEAD Asia's Associate Director for Programs, laid out a 'road map' of the event (3/25-3/27):
  3. Starting from the basics: What is M&E? > Indicators & data collection > Participatory M&E > Outcome Mapping for behavioral change > Most Significant Change > How can we commission and use evaluation in our projects?
  4. Kicking off the LEAD CoP event on M&E. To quote Bryant Myers, "How do we learn towards transformation?" http://t.co/2hQNl7apub
  5. Access a full schedule of the event here.
  6. Sharing Our Expectations of the Event
  7. One group from the Philippines shared that they were hoping to learn about "M&E in the context of MLE in the Philippines, where classrooms are highly multilingual...and where languages are viewed according to their market value."
  8. Defining M&E
  9. #LEADCoP discussing What is M&E and what reactions that invokes--what a mix of emotions -- ahhgggh!!!!
  10. Some strong emotions shared about M&E: from "pure agony..." to "exciting and empowering"! #LEADCoP http://t.co/lCFyLm54f7
  11. Participants shared their emotions about M&E, both negative (it's 'extra work', 'boring', a stressful 'test', 'too quantitative and formal', even 'pure agony'!) and positive ('exciting', 'empowering', and an opportunity for 'learning, adapting, adjusting, and celebrating').
  12. M&E asks 4 simple questions: What happened? > Why? > So what? > Now what?
  13. Phil: Monitoring is like a snake, slithering around your programme; Evaluation is like a frog, jumping in and out #LEADCoP
  14. The snake and frog analogy for monitoring and evaluation is a winner! #LEADCoP http://t.co/hE49jaeSJ7
  15. Monitoring: "Are we getting any juice?" vs. Evaluation: "Was the juice worth the squeeze?"
  16. Evaluation typically looks for the 'Big 5' (+ 1): Relevance, Efficiency, Effectiveness, Sustainability, Impact (and Equity).
  17. If you don't know where you're going, then any road will 'get you there'...
  18. Matt Wisbey, LEAD Asia Resourcing & Communications Coordinator: "In complex, multi-actor situations, we can look for areas where we're contributing to a change, not just where we're responsible for a change."
  19. Matt: Thinking about the monitoring opportunities in our program plans #LEADCoP http://t.co/TmpBfPTfxH
  20. Can we ever take credit for change? What does attribution mean for your work? #LEADCoP http://t.co/NAt7Qskyhb
  21. Teams from the Philippines, India, Indonesia, and Malaysia practice putting together M&E frameworks http://t.co/IH560niayz
  22. Great to have Sharon leading a session on River of Life today, based on her experiences in Cambodia. #LEADCoP http://t.co/bct2yiOH1f
  23. The River of Life tool is a quick, easy tool that allows teams to reflect on their project's past, present, and future.
  24. How it can be used in M&E: After writing down the project's key activities, teams can compare these activities to the activities in their log frames or RBM (results-based management) plans.
  25. See step-by-step instructions for the River of Life (and other great tools!) here.
  26. Principles of Good M&E Frameworks
  27. Foundational principle #1: Does your M&E system cover all levels?
  28. What are the different levels? 1) Inputs: Our resources, like HR/finance systems, 2) Outputs: What we are producing, 3) Outcomes: The direct effect of our work, 4) Impact: Are communities changing for the better? 5) Context, 6) Organizational health: How are we working with each other?
  29. Expected & Unexpected Results
  30. The Monkey Business Illusion
  31. Remember..."when you're looking for a gorilla, you often miss other unexpected events!"
  32. Phil shared the story of Pu Tru, a village in northeast Cambodia. By all of his organization's measures, the project was failing, but when villagers were interviewed, they surprisingly attributed many of the positive changes in the community to the project.
  33. In another case, attendance at the literacy class was declining, and this was seen as negative. But in reality, the literacy class had given students enough skills and confidence to re-enroll in the formal government school, which was positive!
  34. Think: Are we dealing with systems that are simple, complicated, complex, or chaotic?
  35. According to researcher Dave Snowden, in simple systems (e.g. measuring our outputs), we should respond with best practice. In complicated systems (e.g. measuring test scores), we respond with good practice or expert analysis. In complex systems (e.g. measuring changing social attitudes in communities), we respond with emergent practice. And in chaotic systems, we respond with novel practice. The difficult part is figuring out which system we are in! (A small code sketch of this mapping follows this day's notes.)
  36. We in development often work in complicated, complex, or chaotic systems...
  37. ...so: How complex are the things we are measuring? Are we using the appropriate tools?
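
As promised in item 35, here is that mapping in miniature. This is a purely illustrative sketch in Python (our choice of language; nothing in the workshop used code): the domain names, examples, and suggested practices come straight from the notes above, while the dictionary and helper function are our own framing.

```python
# Illustrative mapping of Dave Snowden's system types (item 35) to the
# measurement examples and response practices named in the notes above.
CYNEFIN_DOMAINS = {
    "simple":      {"example": "measuring our outputs",
                    "respond_with": "best practice"},
    "complicated": {"example": "measuring test scores",
                    "respond_with": "good practice / expert analysis"},
    "complex":     {"example": "measuring changing social attitudes",
                    "respond_with": "emergent practice"},
    "chaotic":     {"example": None,  # the situation keeps shifting
                    "respond_with": "novel practice"},
}

def suggest_practice(domain: str) -> str:
    """Return the suggested response for a given system type."""
    entry = CYNEFIN_DOMAINS.get(domain)
    if entry is None:
        # As the notes say: the hard part is knowing which system you are in.
        raise ValueError(f"unknown domain: {domain!r}")
    return entry["respond_with"]

print(suggest_practice("complex"))  # -> emergent practice
```
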
Day 2: M&E Tools

An introduction to monitoring tools we can use at different levels in our programs.

  1. Note: For all presentations, please join the LEAD Community of Practice on Ning. Access resources here.
  2. Preparing for day 2 of the #LEADCoP on monitoring and evaluation. Starting today with the intricate issue of indicators!
  3. Yesterday we looked at the 'Why?' of M&E: principles and theories.
  4. Today we will look at tools that we can apply more directly in our programs.
  5. 1. Indicators
  6. Definition: A specific thing we can measure or observe which tells us about whether our goal is being achieved. These can be direct or proxy (indirect), and should be Specific, Measurable, Attainable, Relevant, and Time-bound (SMART).
  7. What we need: 1) A data-collection method & plan, 2) A process to analyze the data, and 3) A baseline and a schedule for when to follow up. (These pieces are sketched in code at the end of this day's notes.)
  8. Example: "Proportion of students who, by the end of two grades of primary schooling, demonstrate that they can read and understand the meaning of grade-level text." (USAID)
  9. What to remember: These are much easier to develop at the lower levels (Activities, Outputs). Once you get to the Outcome level, it's difficult to get specific measures that are meaningful. Indicators deal with simple or complicated systems; after all, we have to already know what we want to measure!
  10. Elaine sharing about the need for reliable, valid, and timely data #LEADCoP http://t.co/l9M6CwTdrE
  11. As Elaine Vitikainen, SIL MSEAG OrgDev Specialist, shared, asking the right questions is just as important as asking questions at all.
  12. When collecting data that is subjective (e.g. stories), we can try to triangulate a more objective perspective by asking at least three different sources about the same subject or using at least three different methodologies.
  13. 2. Using Participatory Methods in M&E
  14. Definition: Tracking change together with communities. It's a means to help communities reflect on and analyze their challenges.
  15. Example: Participants practiced using a force-field analysis to monitor progress toward the goal of the CoP event: "M&E Event participants have increased awareness of monitoring and evaluation approaches AND are growing in confidence to learn more so they can use the approaches...."
  16. This also helped LEAD Asia recognize what was going well and what could be improved: monitoring the CoP event itself!
  17. 3. Outcome Mapping
  18. Definition: The Outcome Mapping approach is an alternative to the Logical Framework Approach or to Results-Based Management. It focuses on behavioral changes in those we are trying to influence in our programs. It can help us with 'intermediate outcomes' and behavioral changes, and deals with complex or complicated systems.
  19. Example: In seeking to improve education for children, we need to influence teachers, parents, local authorities, and school management. In Outcome Mapping, we would track the changes in the behavior of these groups. For instance, we would track whether teachers are using child-friendly methods or whether parents are involved in monitoring the schools.
  20. Using Outcome Mapping to track behavioral change: Phil shares an example from Cambodia #LEADCoP pic.twitter.com/XbQFRvNDzv
  21. See this useful guide to Outcome Mapping for more information.
  22. 4. Story-Based Approaches (Most Significant Change)
  23. Wrapping up the second day of the CoP by exploring story-based approaches to M&E #LEADCoP http://t.co/av5M37xtST
  24. Story-based approaches help us track the wider Context, our Outcomes, and a bit of our Impact. These deal with complex or complicated systems.
  25. Definition: In Most Significant Change, we collect stories of change from the community and select the story that represents the 'most significant change' in the community. This selection process both provides a means to reduce large amounts of qualitative data and provides an important forum for dialogue about project values.
  26. Process: 1) Collect stories, 2) Select the story with the most significant change, 3) Document and share the reasons for its significance. (A small code sketch of these steps follows this day's notes.)
  27. Example: "Looking back over the last 6 months, what do you think was the most significant change in [ ] as a result of your involvement with [ ] project?"
  28. See how Xinia Skoropinski, SIL Philippines Associate Director for LEAD, used the Most Significant Change tool to evaluate an MTB-MLE program for Save the Children here.
  29. That's all for the second day of the LEAD CoP on M&E! Storifying the Storify experience...
  30. Sharon working hard to share all our learning with the rest of the world! #LEADCoP http://t.co/UrWC4STAoZ
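
To pull together items 6-8 on indicators, here is a minimal, hypothetical sketch (Python again; the `Indicator` class and all field names are our invention, not part of any M&E toolkit) of what a single indicator record could hold: the SMART statement itself plus the data-collection method, analysis process, baseline, and follow-up schedule the notes call for.

```python
# A hypothetical record for one SMART indicator (items 6-8).
# Requires Python 3.10+ for the "float | None" annotation.
from dataclasses import dataclass, field

@dataclass
class Indicator:
    description: str                 # specific, measurable statement
    kind: str                        # "direct" or "proxy"
    collection_method: str           # how the data will be gathered
    analysis_process: str            # how the data will be analyzed
    baseline: float | None = None    # starting value, once measured
    follow_up_schedule: list[str] = field(default_factory=list)

# The USAID reading indicator quoted in item 8, as an example record:
reading = Indicator(
    description=("Proportion of students who, by the end of two grades of "
                 "primary schooling, demonstrate that they can read and "
                 "understand the meaning of grade-level text"),
    kind="direct",
    collection_method="classroom reading assessment",
    analysis_process="compare proportions against the baseline",
    baseline=None,  # to be filled in after the first assessment
    follow_up_schedule=["end of grade 1", "end of grade 2"],
)
```

Keeping the baseline and follow-up schedule next to the indicator itself makes item 7's "when to follow up" question hard to forget.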
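
The three Most Significant Change steps in item 26 can likewise be read as a tiny pipeline. In this illustrative Python sketch, the panel 'votes' stand in for what is really a facilitated group discussion, and the stories are invented examples (one loosely echoes the literacy-class anecdote from day 1).

```python
# A minimal sketch of the Most Significant Change steps in item 26.
# Real MSC selection happens through facilitated group dialogue; the
# vote counts below are only a stand-in for that discussion.

# Step 1: collect stories of change from the community.
stories = [
    {"teller": "parent", "change": "children now read aloud at home", "votes": 2},
    {"teller": "teacher", "change": "students re-enrolled in formal school", "votes": 5},
    {"teller": "elder", "change": "village committee now meets monthly", "votes": 1},
]

# Step 2: select the story the group judged most significant.
most_significant = max(stories, key=lambda s: s["votes"])

# Step 3: document and share the reasons for its significance.
print(f"Selected: {most_significant['change']!r}, "
      f"told by a {most_significant['teller']}")
print("Record why the group chose it; the rationale surfaces project values.")
```
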
Day 3: Intro to Evaluation

The group hears an overview of evaluation and works on their M&E frameworks.

  1. Again, please find all resources from this CoP event here on Ning.
  2. Final day of the #LEADCoP training event. People seem a bit more quiet today. Have we answered all the questions or are they overwhelmed?!
  3. Phil started the group off with a thought-provoking question:
  4. "Evaluation is the subjective interpretation of data collected during the monitoring of a project." True or false?
  5. What's involved in an evaluation? Here's an interview with two participants who have been involved with evaluation, Anne Thomas (independent consultant) and Chhejap Lhomi (NELHOS):
  6. The Evaluator: Anne Thomas has been involved with projects in Papua New Guinea, Laos, and Cambodia, and her focus is on literacy and community development. She does one or two evaluations a year, advising literacy programs. She is familiar with the grassroots level and can speak the language in some cases.
  7. Q: What is your role in evaluation?
  8. Anne: Many people have strong negative reactions towards M&E, because the funder may have commissioned an evaluation. So instead we say, "We're going to look at your projects and see how they went and what we should improve for next year's planning." We want our evaluation reports used, so we use a capacity-building, participatory approach.
  9. We evaluate projects on an operational (efficiency/effectiveness) level, a strategic (relevance/impact/sustainability) level, and in terms of cross-cutting issues (e.g. gender parity).
  10. Q: How do you prepare for an evaluation?
  11. Anne: I go and meet the people who want the evaluation in person. It's very often requested by the funder, and the team leader might not even know the project very well; they're in the capital city and from the majority culture, while the project is in a remote area.
  12. Anne is transparent about what she's going to evaluate and discusses it with the staff in advance, to avoid the issue of "We didn't ask you to evaluate that!"
  13. Q: What are some challenges you face as an evaluator?
  14. Anne: The challenge is the people who are removed from the village, who don't speak the local language and say, "This is your subjective opinion. We have a great project." If you have good results, they like your report. As an evaluator, I cannot always give you an A+. If you find things which need to be improved, you might meet resistance. If you have a project with staff who don't speak the local language, that needs to be changed.
  15. Q: Do you have any recommendations for organizations that are planning for an evaluation?
  16. Anne: If you don't want to have an evaluation report sitting dusty on the shelf, agree to spend the time and effort to have a participatory evaluation, which is capacity building for the staff, and where it's just a logical next step for the staff to implement the recommendations. It's a mixture of internal and external evaluation--you may need to sell the funder on this.
  17. The Evaluated: Chhejap Lhomi works for the Nepal Lhomi Society (NELHOS), based in Kathmandu. It was founded by the Lhomi people for indigenous communities in Nepal. Their last evaluation was in 2012.
  18. Q: What preparation was needed? Was the evaluation commissioned by a funder?
  19. Chhejap: The donor gathered all the stakeholders and conducted the evaluation. We need to know our project plan well. We need to have documentation of data quarterly and yearly, and we need to take care of the logistical arrangements for the evaluators.
  20. Q: What were some of the challenges you faced?
  21. Chhejap: It was difficult to find the data, because the data was in different folders. The board and the staff need to know why we are doing the evaluation and the focus of it. And what our next plan is.
  22. Q: Was the evaluation helpful?
  23. Chhejap: It helped show how we should train our staff, and what should be done on time. Some points were encouraging, other points showed our weaknesses, and this gave us a chance to improve. A challenge was that we had done good work in the field, but we could not show what actually happened.
  24. Elaine: Remember, the evaluation can be like a small project in itself: there are outcomes that need to be achieved and activities that need to be done and organizational systems that need to be in place. It's not about being judged, it's about working together.
  25. Impact Evaluations
  26. Phil: There are four times to do an evaluation: baseline, mid-term, end, and post-evaluation (evaluating sustainability).
  27. For more information on conducting impact evaluations, especially evaluations relevant for small-scale programs, see 3ieimpact.org and povertyactionlab.org. One helpful paper can be found here. Other sites related to evaluation include the American Evaluation Association and the Canadian Evaluation Society.
  28. so loving Google drive! Great world cafe sessions today. #LEADCoP
  29. More resources about Terms of Reference (ToRs) are available on the Better Evaluation website. Also take a look at this 'how-to guide' on Writing ToRs for an Evaluation.
  30. What next? Developing plans for action before going our separate ways #LEADCoP http://t.co/YJnfMSqGPh
  31. Onto bigger and better things! Thanks to everyone who attended the #LEADCoP on M&E http://t.co/l5IiRodR05