Systems thinking for evaluation once more!

Yesterday I attended the live webinar on ‘Systems Thinking for Equity-focused Evaluation’ organised by UNICEF, UN Women, the Rockefeller Foundation and a bunch of other organisations and institutes that together form My M&E (a really awesome and informative website with all you ever wanted to know about monitoring and evaluation).

Yesterday’s webinar is just one of the very many webinars they organise on evaluation (there are series on equity-focused evaluations, on emerging practices in development evaluation, on developing capacities for country-led M&E systems, etc.). Every other week or so you can attend, for free, lectures delivered by top-notch evaluators and methodologists like Michael Quinn Patton, Patricia Rogers, Bob Williams and Martin Reynolds, and debate with them! Yes, we live in a world of wonders!

Yesterday Bob Williams and Martin Reynolds, both renowned systems thinkers/evaluators, gave short introductions on ‘Systems Thinking for Equity-focused Evaluations’ for a global interactive classroom of nearly 100 participants. Bob Williams briefly explained the key principles of ‘thinking systemically’: inter-relationships, perspectives and boundaries. These are the three principles that many methods from the ‘systems field’ have in common (Williams claimed there were about 1,200-1,300 methods in the systems field!). Martin Reynolds dove into the crossroads between equity-focused evaluation and one of the systems methods: Critical Systems Heuristics (CSH).

Although Martin Reynolds’s presentation looked rather impressive, the complexity of his story, combined with some technical disturbances, made his lecture hard to follow and understand. There was one topic, though, that really made me prick up my ears! He was talking about the steps in CSH, starting with an ‘ideal mapping’ of the ‘ought’ (sounds like a fairytale), followed by a descriptive mapping comparing the ‘is’ with the ‘ought’, and other steps. This ideal mapping of the ‘ought’ is placed at the beginning of the whole exercise to provoke ‘blue sky thinking’ and to let people realise that reality is constructed, and can be re-constructed if we really want.

Why did this remark raise my interest? Well, if you have followed my earlier blogs, my quest is very much ‘How can evaluation contribute to re-construction?’, or in other words, how could evaluation contribute to ‘system change’? Bob Williams commented that Reynolds’s remark reminded him very much of organisational development and ‘vision’ building, and that is certainly true as well.

And all that brings me again to my eternal question: ‘How does systems thinking contribute to evaluation practice?’ Is it the emperor’s new clothes, or can it really contribute something solid? Again, I come to the conclusion that it is not so much about the tools and instruments from the systems field itself, but about the way of thinking. Think big, act small, see our world as one big construction site, take nothing for granted, and challenge the existing rules of the game. Let evaluation (with or without systems thinking) help us contribute to the transformation of this world!

Next week, on 22 November, Patricia Rogers will give a lecture, and on 6 December 2011 it is Michael Quinn Patton’s turn! You are strongly advised to join!

Posted in System thinking

Maria Joao Pires & Evaluation Practice

What has Maria Joao Pires, the renowned pianist, to do with evaluation practice? Well, at first glance, for most people likely nothing, but in my reality, or better, in my mind, Maria makes a great connection with evaluation.

Quite some years ago I came across a documentary on Maria Joao Pires. In it you see her students struggle with some of the most complex piano pieces, intertwined with shots of the gorgeous surroundings of her farm in Portugal. Although I am not a connoisseur, I guess the students played, technically, superbly and showed great virtuosity!

Despite their virtuosity, Maria was, most of the time, not impressed. I do not recall exactly what she said, but it was very much along the lines of ‘Yes, technically you played the piece very well, but tell me, why should you play this piece? What did you add to it? How did you interpret it? I want you to put your soul into this piece! Otherwise the piece could be played by anybody else. What makes your piece different from the (same) piece student X is playing?’ (After writing this, I found some clips on YouTube; aughh, memory is a feeble thing. Anyway, for the point of this blog it does not make much difference ;-).)

I am not saying that evaluators are piano players, but Maria has a point here for evaluators too. As evaluators we need expertise (knowledge: technical, procedural and intellectual) as a ground rule. Without this expertise we are nowhere and not worth hiring anyway. The question is: is that enough? If we are virtuosos in our expertise, does that suffice to make us ‘good’ evaluators?

During a dinner gathering with other evaluators (organised by Evaluation 5.0), we discussed part of this topic. A first additional qualification that good evaluators (in our view) should have, we concluded, is proper behaviour. The outcomes of an evaluation are influenced by many different factors, but one we have a certain control over is our own behaviour. When we are directive, those being evaluated will very likely be defensive or timid. When we are open and truly listening, they may be open too and share their minds.

But still, does this qualify us as good evaluators? Not necessarily. So do we need to put our soul into our work, just like the piano players should according to Maria? I am not quite sure about that. But what we do need to do is be aware of our vision and motivation. What is it we are actually doing? Are we mainly earning money? Or do we want to contribute to a more just and sustainable world through our practice? Shouldn’t we first clarify our vision, and then use our expertise and behaviour to contribute to that vision?

Not that I have my vision ready, but I could start trying by asking myself ‘Why should I do this evaluation and not somebody else?’

Posted in Methods, Roles

Reading Michael Quinn Patton

Since I am writing an article on development cooperation and its M&E approaches, and naturally to keep myself updated, I read Michael Quinn Patton’s latest book (2011), ‘Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use’, published by The Guilford Press, New York.

To avoid any confusion: developmental evaluation has nothing in particular to do with development cooperation. ‘Developmental’ refers to the approach that Patton follows. He writes (building on a quote from Pagels): “Evaluation has explored merit and worth, processes and outcomes, formative and summative evaluation; we have a good sense of the lay of the land. The great unexplored frontier is evaluation under conditions of complexity. Developmental evaluation explores that frontier.” (p. 1)

So developmental evaluation is an evaluation approach for dealing with complexity. Nevertheless, various practitioners and evaluation professionals have started using developmental evaluation within development cooperation.

The book is a very good read, with illustrative examples and hilarious anecdotes. Patton describes, for example, how he came up with developmental evaluation. He was working for a programme, using formative and summative evaluations as his repertoire, while the team did not want to settle on a fixed model that could then be tested in a summative evaluation. “We want to keep developing and changing”, they stated… “Formative evaluation! Summative evaluation! Is that all you evaluators have to offer?”, one of the team members exclaimed. ‘Frustration, even hostility, was palpable in his tone.’… “Well,” I said, seeking inspiration in my coffee cup, “I suppose we could do, umm, we could, umm, well, we might do, you know… we could try developmental evaluation!” (p. 3)

Developmental evaluation supports innovation development to guide adaptation to emergent and dynamic realities in complex environments, so it is quite different from regular evaluation approaches, which focus more on control and on finding order in the chaos. Patton mentions five complex situations for which developmental evaluation is particularly appropriate:

  1. Ongoing development in adapting a program, policy, or innovation to new conditions in complex dynamic systems;
  2. Adapting effective principles to a local context as ideas and innovations are taken from elsewhere and developed in a new setting;
  3. Developing a rapid response in the face of a sudden major change, exploring real time solutions;
  4. Preformative development of potentially broad impact scalable innovation;
  5. Major system change and cross-scale development evaluation.

A key feature of developmental evaluation is that it aims to contribute to social change and to ‘nurture developmental, emergent, innovative, and transformative processes’. It is not so much about testing and refining a model (formative) or about passing judgement (summative). It has a strong action-research component. With this, Patton is embarking on a rather new purpose of evaluation. Of course, other types of evaluation aim to contribute to social change as well, but usually in an indirect way, exploring what works and what doesn’t. Developmental evaluation goes a step further and aims to be part of the action, facilitating interventions that may work (or not).

Another, closely related key feature is the ‘closeness’ of the evaluator to a programme. Rather than someone who only visits at mid-term or at the end of a programme, a developmental evaluator is ‘continuously’ present: asking questions, probing, exploring with the programme, providing feedback in ‘real time’ through rather short feedback loops.

These two features are, in my opinion, exactly what may be needed when dealing with complex situations. Such situations are complex, unpredictable, multi-causal, non-linear and emergent, and may need constant attention. Programme or project leaders (in my experience) are often too absorbed in their management activities to also remain reflective and ask critical questions themselves. A developmental evaluator could provide help.

Overall, it is an inspiring and thought-provoking book that offers good guidance without falling into the pitfall of blueprints or steps! Of course it also raises questions, especially when Patton talks about system change, the fifth complex situation. Here he refers to the work of Bob Williams, who uses a quite broad understanding of ‘system’: anything involving boundaries, perspectives and interrelationships. In the end this means that almost all situations are ‘systems’, and that is what I see happening in debates.

I think (and correct me if I am wrong) the ‘system’ concept needs unravelling and ‘demystification’. What is really necessary is to challenge the institutional settings and their related norms, values, cultures etc. that reproduce current unsustainable practices. What could help this unravelling is to borrow from concepts and theory used in innovation science.

In my article on development cooperation and M&E approaches, this will be one of the topics I will further explore and discuss. It is going to be an inspiring and hot summer! I hope to write more about this topic in my next blog.

Posted in Methods, Roles, System thinking

Sector learning at IRC: Harvesting lessons and facilitating sector learning

From 5 to 7 April 2011, Barbara (Wageningen University/Communication and Innovation Studies) and I conducted a workshop on sector learning for the IRC International Water and Sanitation Centre.

During the workshop we tried, with the help of five coordinators of resource centre networks in Nepal, Honduras, Ghana, Uganda and Burkina Faso and their IRC counterparts, to unravel the concept of sector learning using ‘visioning’ and ‘theory of change’ exercises.

Sector learning is a concept that has been used by IRC for some time now. Nevertheless, it seemed to mean something different to almost all actors involved. One of the participants compared the unravelling of the concept to peeling the layers off an onion (and wrote an inspiring blog on it). While peeling off the layers you slowly discover that the layers are not the onion, but just part of it (and yes, it makes you cry ;-)). Still, we managed to take a few steps in the unravelling process.

We then continued with developing a framework for harvesting and sharing lessons on sector learning, and introduced some of the principles and tools we use for Reflexive Monitoring in Action (RMA).

Based on the concepts, the theories of change, the framework and the principles and tools of RMA, the countries made their plans for joint reflection and learning. They included diaries, dynamic learning agendas, timelines, eye-openers etc. in their plans; no lack of ambition there!

An interesting set of blogs was produced during the workshop. There is no need for me to reproduce them here. Nothing better than blogs straight from the participants themselves. See

Although we developed RMA for system innovation projects, we feel this approach could also be relevant to well-defined policies, initiatives and programmes that seek to bring about fundamental changes in systems but do not frame their efforts in these terms (including development cooperation and natural resource management). With IRC we are taking first steps to explore whether this approach could work there too: do the principles and tools of RMA contribute to sector learning as well?

Within one year we will know more! I can’t wait to see what happens.




Posted in Methods

Systemic Approaches in Evaluation revisited

Somewhere I got lost in Germany, it seems. The GIZ Conference ‘Systemic Approaches in Evaluation’ (25-26 January 2011) took me by surprise, and I still do not fully know what to think of it. For instance, I did not get many satisfying answers on what the added value of systemic approaches in evaluation is. One of the reasons, I think, is that there was quite some confusion about what systems thinking and systemic approaches are.

Another reason could be that systems thinking was approached in a rather instrumental way, as a means to deal with complicated and complex situations, not necessarily to enhance learning.

Another factor that came through rather sharply is the evaluation system itself. GIZ was organising the conference, thereby acknowledging the complexity of development. At the same time, the conference was filled with (sometimes frustrated) evaluators working for GIZ who struggled to get learning and ownership by local NGOs included in the rather accountability-focused evaluations.

On the other hand: I met loads of interesting people, and attended a very inspiring presentation on Developmental Evaluation, given by Richard Wilson-Grau, among others. Michael Quinn Patton developed this approach over the last decades; it is meant to enhance learning and innovation. Wilson-Grau explained Developmental Evaluation using the metaphor of a chef preparing food: “When the chef, while preparing the soup, tastes and adjusts it, you could see this as formative evaluation; when the guests are eating the soup, you could see this as summative evaluation. In developmental evaluation, though, the chef and the guests prepare the soup together in a co-creative way.” Here I absolutely see the added value of using systems and complexity thinking: you use it for learning and for further enhancing social change. Currently I am reading Patton’s new book ‘Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use’ (2011), published by The Guilford Press. Highly recommended!

Oh yes, and our presentation on Reflexive Monitoring in Action was well attended and provoked lots of good questions. Many focused on ‘conflicts’: the audience had expected many more conflicts within the networks. However, in our networks we highlighted ‘What issue do we have in common?’ and worked from that shared ground…

Posted in Methods, System thinking

On Complexity and System Thinking in Evaluation

Next week, on 25-26 January 2011, the GTZ Conference on Systemic Approaches in Evaluation will take place. Since we (Barbara/WUR and I) will present our Reflexive Monitoring in Action approach there, Rosien, one of my Evaluation 5.0 colleagues, asked: ‘Is this a counter-movement against the complexity gurus?’ ‘I think’, said Bob, also an Evaluation 5.0 colleague, ‘you could see the complexity discussion as a branch of the systemic approaches.’ ‘Well’, I said, ‘I actually think the systems thinkers are a branch of the complexity thinkers.’ ‘Mmm’, Bob replied, ‘this is an issue we could probe a bit further into.’ And so I do.

I started reading Bob Williams, a renowned thinker on ‘Using Systems Concepts in Evaluation’. In the FASID document ‘Issues and Prospects of Evaluations for International Development’ (2010) he says something interesting: “The history of the systems field is (…) rooted in addressing complicated and complex problems with limited time and with restricted resources.” (p. 37) In my own words, it is a way of dealing, within M&E approaches, with complicated and complex problems: looking glasses that help you make sense of a situation. Mmm, can I think of any other M&E approaches dealing with complicated and complex problems? Oh yes, many constructivist methods do. So then, what does systems thinking add that other approaches do not?

Williams also states in his new book ‘Systems Concepts in Action: A Practitioner’s Toolkit’ (2010, Stanford Business Books) that the principles of systems thinking can be expressed by three concepts: inter-relationships, perspectives and boundaries. I admit systems thinking can give you a broader outlook on situations, because you see developments in (multi-causal, non-linear) relation to each other, and you see multiple perspectives. But that is something a good theory of change exercise could do as well… What, then, does systems thinking really add? I cannot really find an answer in Williams’s writings.

In my own practice I definitely see the added value of certain kinds of systems thinking like Reflexive Monitoring in Action: facilitating system learning, questioning certain systemic barriers (e.g. laws and regulations, market structures, norms and values) or reframing them into opportunities, questioning each other’s practices and, maybe even more importantly, questioning your own practices.

I am curious what other practitioners see as the added value of systems thinking! After the conference I will get back to this fascinating issue!

Posted in Methods, System thinking

Most significant events of the year

Every year between Christmas and New Year I reflect upon the past year and start planning for the year to come: what went well, what needs to change and what new ideas need to be developed? Actually, I do some kind of ‘Most Significant Change’ exercise for my company, or in this case a most significant event exercise, since I have experienced only one major change in 2010: my new branch in city farming! Of course I have refined some of my products, but 2010 mainly consisted of events, good events: finishing up projects, starting new ones, etc. Some of the highlights:

• Release of the Reflexive Monitoring in Action guide, in both English and Dutch, with Wageningen University and the Free University Amsterdam.

• Publication of our article “Enhancing the Reflexivity of System Innovation Projects with System Analyses” in the American Journal of Evaluation, with Barbara van Mierlo and Cees Leeuwis. It took some time, but we did it!

• Coaching the evaluation of the Kulturhus concept (in Dutch only) (100811_Beleidsnotitie_Most_Significant_Change), using the Most Significant Change method. It was a joy to work with such enthusiastic and energetic young professionals.

• Participating in the conference ‘Evaluation Revisited’, organised by the CDI and others. It was there that I fell in love with my profession again.

• Developing the e-Valuation package (an evaluation package that uses both qualitative and quantitative methods, and both visual and written data and communication) with Dominique Darmon, a professional producer. Brochure: e-Valuation

• Developing Evaluation 5.0 with Rosien Herweijer and Bob van der Winden: a collective of evaluators that strives for new types of evaluation that are genuinely collective, involve all actors (including donors), question all power issues, and make use of new media and images.

• And then I started on a new path, something completely different: exploring city farming and organic, low-CO2 food production. It has nothing to do with monitoring and evaluation, but everything with my higher goal: contributing to a more sustainable world. After all the head work, I longed for something practical with very tangible results: good and fresh vegetables. These are only first steps, but in five years’ time I want to have established a self-sufficient vegetable collective with regional products.

And naturally there are the issues that need attention: I had promised myself to work more with video and images over 2010, but that did not materialise. It means I need to work more on explaining the use of images and showing organisations their benefits. That is why I am so happy to collaborate now with Dominique Darmon.

With Evaluation 5.0 we developed a training on new ways of evaluation for donors, evaluators and organisations. Only two participants (evaluators) registered, so more work on PR is needed here too.

A new year is about to start. The path is good; it is ready to be beaten! I wish you all a wonderful 2011 with loads of inspiration and steps towards genuine sustainability.

Posted in Methods, Power of images, Roles