If R:2025 were being externally assessed there would be meetings about Performance or Metrics or Indicators of Progress against Plan. In the previous post I described how Value and Impact of R:2025 could be thought of as operating at Personal, Public and Social levels. Progress can also be thought of at these same three levels. Progress in my personal wellbeing; progress in establishing, maintaining and being useful to a network of intermediaries; and progress in connecting to social issues in ways that open up opportunities for change.
This post tries a different line of thinking: Progress against planned developments in implementing the arrangements that will sustain R:2025 as a thriving, worthwhile set of activities.
For R:2025, as a whole, to move forward it needs to be making progress on six development strands:
- Having some ‘presence’ – an independent, visible existence
- Being known by others – recognised as interesting; promoted by them
- Being content rich – having enough ‘stuff’, of sufficient interest
- Being referenced by others within their own work – links and collaborations
- Being managed in ways that are realistic – sustainable, still fun to do
- Having sufficient resources beyond core – allowing creative activity to happen
If Progress is tied to Purpose, it can be hard to pin down the progress being made when the ‘project’ is less precisely defined and more exploratory – where the sense of purpose develops as the activities go along. Any assessment will be less precise than the kind of performance measures usually demanded. It will be less linear and less reliant on boxes setting out what was to be done, by when. It becomes less about tracking progress against fixed milestones and deliverables, and more about managing understandings: creating time and space for understandings to emerge and be constructed as narrative – while remaining purposeful, deliberate, systematic and considered. The intention can still be that the process maintains a degree of rigour.
This needs a regular mechanism for reviewing things that is somewhat flexible; one that allows for variance from the preset route; that involves deciding on the ‘plausibility’ of the account; and that pulls together such evidence as is available and tries to make sense of it. This is close to the evaluations of development programmes that were part of my work between 1990 and 2010 – evaluations that themselves developed as the programme unfolded.
The mechanism is better if supported by some framework that enables assessments to be done in a structured way that is developmental rather than rigid. It is probably going to be strong on self-assessment, possibly with the option of some moderation together with others who understand what is being attempted. It is also probably going to be less time-bound than usual, able to take a medium-term view, e.g. over a 3 or 4 year timescale, with ongoing assessment made annually, every few months, or whenever seems most appropriate.
One possible Thinking Tool that can allow this to happen is a set of potential futures linked to the bullet points above. This would allow intended progress to be sketched out as ‘Here is where things will get to on a good day…’ and ‘Here, however, is where things could end up if they don’t progress as hoped…’.
Between these two extremes lies a whole variety of potential lines of progress. This set of futures can act as measures for tracking where actual progress is taking things. Following the trajectory of R:2025 against the potential futures will also help identify what to emphasise, so that decisions can be made about how R:2025 should move forward.
The various stages and positions as things progress are more Waymarkers reached than Milestones passed. The whole thinking through of what progress has been made, how it has been done, where things are adrift from original intentions, and the extent to which that matters – all of this is more akin to stocktaking, documenting, explaining, archiving and storytelling than to assessing, measuring and passing/failing.
A diagrammatic representation of this follows: