Moving out of the projects - reflections from a learning associate role
Part 1: How to start well
These reflections, drawn in part from my practice experience with the Joseph Rowntree Foundation (JRF) Emerging Futures team, are only from my perspective.
What am I writing about?
This is the first in a series of reflective pieces, inspired by my time as a learning associate with the JRF Emerging Futures team. They are not organised as a linear description of what we did together, but rather as general groupings of thoughts that I find myself returning to as I try to develop my practice to be better suited to complex, systems- and transitions-focused work. In this first piece I reflect on ‘how to start well’.
I am sharing these reflections because I find it really helpful to hear from other practitioners, for example through the UNDP M&E Sandbox (or some more recent recordings), the Western Michigan University Evaluation Café (scroll to the dates for recordings of past events) and the UK-based Charity Evaluation Working Group (CHEW).
I also hope they are helpful for people commissioning, funding and managing learning and evaluation support. It can be challenging to negotiate new ways of doing things within an organisation, to shift expectations around what the role, process, timeframes and outputs of learning and evaluation activity will be.
Where am I coming from?
I have worked in evaluation and learning roles for more than twenty years, mainly in the UK and a little in the US, and find myself in the interesting position of being both experienced and a complete novice. As an evaluator I was raised in ‘the projects’, applying more traditional approaches to evaluating the impact, process and value for money of discrete interventions. My clients were often UK Government Departments; since then I have worked more with UK-based Trusts and Foundations, and some smaller front-line service delivery organisations.
In more recent years, along with many other evaluation and learning practitioners, I have been working out how to relearn and adapt my practice to better accommodate programmes of work which are, in different combinations, embracing complexity, taking a systems approach, centring equity and focusing on transitioning to futures that are fairer and more sustainable for the planet.
There are many rich sources of new thinking and practice available to reflect on and learn from (some of which I will link to in later pieces, particularly when reflecting on ‘methods and practices’, to build on these resources from 2022). The extent to which practitioners like me can apply what we are learning to our practice is enabled or constrained by the work we are commissioned to do, or by the parameters and permissions of the role we have within an organisation.
My role with Emerging Futures
Between April 2023 and December 2023 I had the opportunity to try to apply some of my own learning to practice, through my associate role with the JRF Emerging Futures team, supporting them with how to approach their initial learning cycle.
At the time of my involvement, Emerging Futures was structured around supporting four parallel tracks of activity, all with a focus on transitions to fairer futures that are more sustainable for the planet. These tracks were: imagination infrastructure; reshaping philanthropy and investment; support for pathfinder organisations, which are actively building alternative futures in their communities; and support for visionaries, individuals developing ideas that draw us away from the status quo and orient us towards something new.
Emerging Futures is a big programme of work: big in terms of range, depth, ambition and the people involved, and big as a new way of working for JRF. It was complex in itself, and it embraced the complexity and the emergent model of systems change involved in work aimed at supporting the transition to a fairer future.
My initial role was defined as much by what it wasn’t as by what it was. I was not there to develop a ‘test and prove’ approach, comparing grantees and assessing which should be taken to ‘scale’. I was not commissioned to conduct an independent evaluation of the impact of the programme and inform the next stage of design. Nor was I to help implement a monitoring framework, to judge the progress of the team and/or the work against a set of pre-determined KPIs.
My role was to help design an approach to an initial learning cycle, drawing on new thinking and practice better suited to complex, systems-focused work than more traditional approaches to monitoring, evaluation and learning.
From early on it was clear that my role would need to evolve in response to the realities of a very newly formed team and rapidly evolving work. It was impossible, for example, to define my contract in terms of milestones describing exactly what activities would be delivered by when. And my role did evolve significantly: for example, from designing an approach with the team to supporting team members with specific practices for learning and reflection.
3 reflections on how to start well
1. Understand the purpose
Going into the Emerging Futures role, I took a familiar question with me:
What is the purpose of this work we are going to be doing together?
Framed specifically for the Emerging Futures context (to help me support the design of your approach to learning) as:
· What is it you want to learn about & why?
· Who are the intended users of this learning, and how will they use it?
· Who is ‘you’ here? Whose perspectives am I hearing, and whose are missing?
Whether a team does this work independently or is supported by someone in a role like mine, answering these questions takes time. The time goes on:
· Understanding who gets to decide the purpose, including the (often multiple) agendas at play and the power each has to set direction and to influence the others; and considering who these purposes might enable or disempower, and how.
· Giving space to hear different perspectives; for minds to change; for understanding to grow; and for consensus to build, and/or for acceptance to develop where there isn’t complete consensus on purpose.
· Everyone being able to explain the purpose of the learning.
· Getting sign off on the starting point.
· Clarifying the process for reviewing and adapting the purpose.
At this stage we won’t be focused on how we are doing the learning (learning questions or frameworks, methods and practices); who is doing the learning (roles, responsibilities and commissions); or what the learning outputs or artefacts will be. All of these are where people are usually most keen to start! To start well, we need to spend time understanding why we are doing what we are doing.
So far, so familiar: I would be thinking about questions similar to these at the start of any MEL activity, as part of getting to know the people involved and building relationships.
2. Don’t skip the briefing
I compete in long distance, open water swimming. At an event you don’t just line up at the start, jump into the river and go. Before you even get to the starting line, and regardless of your previous experience or expertise, everyone goes through the same race briefing. Someone who really knows the landscape gives us lots of useful, up-to-date insight: what to watch out for, what could help us, what is different from last year, what the tide is doing, where to pick up sustenance on the way, and so on. After the race briefing, participants stay penned up together, and we add to what we’ve learned from the official briefing by sharing our own experiences, good and bad, and generally encouraging each other that what we are about to do is not only possible, but fun!
The experience with Emerging Futures really clarified for me the need for a similar briefing stage as part of evaluation and learning planning, before everyone charges over the start line and into all of the ‘doing’.
For work like Emerging Futures, traditional programme evaluation, measuring predetermined outcomes against a linear theory of change, or reporting against KPIs are not a good fit, and potentially undermine the work itself. Evaluation and learning practice is evolving in response. New thinking, methods and practices are proliferating; definitions of rigour and hierarchies of evidence are being questioned; and new blends of disciplines, a focus on how to centre equity and new ways to work with data all mean there is a new landscape of options to navigate.
While there are these options to draw from, and some very general areas of consensus about how to proceed — the importance of embedding evaluative thinking, building learning cultures, expanded or different roles for evaluation practitioners, involving more perspectives, thinking in terms of systems, new ways of thinking about causality — there are no established ‘best practice guidelines’ or ‘gold standard’ approaches to point to as ‘the’ way forward. We are all on a journey of learning about learning for complex, systems-focused transitions work.
When there aren’t clear directions for a new way, it is tempting to follow a well-trodden path.
People’s previous experiences or preconceptions often inform what they understand ‘evaluation’ or ‘learning’ to be, and what they expect it to look and feel like, regardless of the nature or context of the work being explored. Understandably, keeping up with the latest evaluation and learning theory and practice is not something many people have the time or inclination to do if it’s not part of their day-to-day role.
A briefing stage can make time, at the beginning of new work, to do two things. First, to surface, share, and expand on preconceptions about evaluation, for example, asking ‘where do our starting points come from, and who do they serve?’, as a way to process previous experiences, build curiosity, hear different perspectives, and question the status quo.
Second, having created some space to start thinking in new ways about what’s possible, a briefing stage is an opportunity to introduce some of the options in more detail, to increase awareness of the professional, credible fields of practice that are developing alternatives to traditional ways of approaching evaluation and learning.
I think it might be better to access these insights and options, not through a page or so in a tender document, but through discussion and questioning, and at a point when everyone involved has a deeper understanding of what we are learning about and why.
Is this an important new role for practitioners like me? Helping people navigate a new landscape by facilitating a briefing stage, so that organisations can take advantage of the full range of evaluation and learning practices on offer, confident that these are just as credible as the ‘traditional’ ways of doing things.
How do we access the latest thinking and examples, and judge what to bring to the groups we are working with? How do we calibrate the amount of detail needed for different audiences? And how do we support organisations to tell a confident story about how their approach to evaluation and learning might not meet expectations, and be better for it?
What does it mean for project planning? For people to give attention to thinking about evaluation and learning in new ways, how much time do we need, and at what stage in the process (before writing the job description or signing off the commissioning contract)? Who needs to be in the room? How do we persuade people to give up precious resources (time, attention, budget) to a briefing stage?
3. What do the words mean?
Emerging Futures is also a big programme of work in terms of the concepts and mental models it is concerned with.
For lots of evaluation and learning activity ‘starting well’ involves clarifying what people mean by the words they use to describe their work, and what the work is trying to change. Unless we are designing survey measures that need to be valid and reliable across time and populations, we might not be seeking absolute precision in definitions, or single definitions, or definitions that can’t change. But getting into the habit of being clear about how core concepts are being used is necessary if we are to be clear about what we are learning about at a particular point in time.
Discussions around meaning can also be a useful way into surfacing and exploring the implicit assumptions built into work. As an example of how meaning matters, Jewlya Lynn and Julia Coffman discuss the need to interrogate and align mental models of systems change with the strategies, tools and approaches to learning that match, so change isn’t approached in counterproductive ways.
My ‘how to start well’ reflections end with a plea to be realistic about what is involved in terms of investing time up front, and thinking about the timing of learning and evaluation input.
It takes time to understand the purpose for learning and evaluation activity; to brief those involved on the options available; and to get clear on the meaning of commonly used words. This is the work of learning and evaluation, just as much as, for example, identifying questions to explore and deciding on methods.
In the following pieces I am planning to reflect on methods and practices, and what we need to understand about the contexts we are working in, if we are to support learning and evaluation activity that is useful.