Vanesa Weyrauch

From a pyramidal to a networked approach to evaluation

Why do we use only certain types of knowledge when doing evaluation? Whose voice counts when assessing change, and how do we capture this knowledge? I have recently been pondering how entering the dynamic, rich and chaotic field where changes happen (or not) with a specific lens (e.g., a theory of change, results-based frameworks, pre-defined criteria and questions) can limit our view and our ability to identify, assess and understand what we see more completely. In a way, I feel I lose so much, yet at the same time I am not sure how to expand my practices within the current paradigm.

 

“Perhaps the secret of living well is not in having all the answers but in pursuing unanswerable questions in good company.” I love this quote by Rachel Naomi Remen. It is an invitation for us to stop and allow ourselves to not know, and to explore together questions that are rich, complex, and full of potential.

 

At P&I we are convinced of the value of activating collective intelligence as a way to co-create new possibilities. In this spirit, I recently participated in an amazing panel organized by Southern Hemisphere at the GLocal Evaluation week: “The vital role of evaluation in shifting power for transformational change - a perspective from the Global South”. With bright and provocative colleagues such as Dena Lomofsky, Mark Abrahams and Zimingonaphakade Sigenu, we were able to open the floor to a very engaged group of participants who are also asking themselves many relevant questions about the current status and potential evolution of the evaluation field.

 

My reflection focused on the types of knowledge (or evidence, in more rigorous approaches) that are accepted today to demonstrate impact, and on how knowledge is conceptualized, generated and used. I invited everyone to consider whether we could move from what I call a pyramidal approach to evaluation to a networked one (see figure below).

[Figure: the pyramidal and networked approaches to evaluation]
 

The first illustrates an evaluation process in which a small group of individuals (leaders of the commissioning organization and consultants or internal evaluators) defines how to gather evidence around expected outcomes.

 

One of the major challenges of pre-defining outcomes is accurately described by Marcus Jenal in issue #91 of Gaining Systemic Insight: “Change initiatives aimed at improving lives often encounter the problem of reducing 'better life' into measurable outcomes. These improvements vary greatly depending on context, timing, and personal preference. Simplifying these aspects into single or even multiple measures inevitably loses context and nuance. Worse, an initiative might achieve real improvements, yet be deemed a failure if it doesn’t meet predefined measures. Conversely, positive reports might mask a lack of fundamental change in people’s lives.”

 

After the design of the evaluation, the small group of initiators shares the effort with a few more members of the organization, then expands it to partners, funders and peers or experts, and finally enlarges it to engage those participants who contribute to and/or benefit from the initiative. Usually, the ones at the bottom provide a wealth of data, information and knowledge, which then moves up the pyramid as it is selected and interpreted by fewer and fewer people. Sometimes there is a final report that is only for the eyes of the individuals at the top, and a public edited version that goes back to those who contributed, sometimes to all, sometimes to only some. In reality, this model has many nuances, of course, such as sensemaking sessions where more participants are invited to help select and interpret preliminary findings.

 

The networked approach that I would like to further explore in good company views the evaluation as a participatory effort where, depending on factors we need to be aware of (interest, language proficiency, time available, etc.), all those who take part have opportunities to voice their assessments and to interact with others, weaving together different types of questions, criteria, knowledge, etc. Some may play a larger role (wider circles) thanks to a horizontal recognition of what they can bring to the table. Different types of knowledge (for example, experimental knowledge, indigenous wisdom, body knowledge, etc.) are welcome as sources for better understanding which changes have happened or not, and why, and are integrated through dialogue and co-sensing. The collaborative effort is documented by evaluators acting as facilitators of the process, and the final output is shared with, and further refined by, all those interested in doing so.


A networked evaluation approach can enhance organizational agility, foster collaboration, and leverage collective intelligence. By breaking down silos, it allows for more comprehensive and diverse insights into performance and impact, leading to more informed decision-making and innovation.


Obviously, several pre-conditions need to be in place for this approach to work well (for example, addressing imbalances in who can participate and how, due to language, technological, economic and other barriers), but that does not mean we cannot explore it. Human history is full of stories where such conditions have been examined, worked on and changed.

 

“We need to be advocates for new ways of evaluating”, one of the panel participants wisely and enthusiastically exclaimed during our group work. There was a clear recognition of the need to become active protagonists in the MEL field, individually. There was also a promising awareness of the energy we can harness if we do this more collectively: to be inspired by each other, to better articulate and argue for our proposals, and to learn from experimentation, even if with baby steps.

 

I am still ruminating on what to do next; this blog post series is one of the steps I felt I needed to take. Any comments, contributions or critiques of this potential change of model will be more than welcome! Objecting to or questioning some of the points would be good company too!

 



