Policy process advocacy – is the desirable feasible?

A few weeks ago, the co-occurrence of a ‘twitter discussion’ (initiated by @LukeCraven) and some reading I was doing on policy analysis tools prompted me to start thinking about what Hill (2013) refers to as process advocacy.  Process advocacy is concerned with improving the nature of policy making.  It differs from policy advocacy in that it involves advocating generally for a ‘better’ policy process rather than for the substance or content of a particular policy.

I haven’t spent ages tracing every nuance of the history of process advocacy, but it strikes me that it has been going on for a very long time.  The ‘policy analysis’ movement which started in post-war USA drew on the belief that social science (then positivist) and economic analysis techniques could make policy better.  Turnpenny et al (2015) highlight that researchers and policy practitioners have developed a range of tools to help with the formulation of policy, and a lot has been written about individual tools and how they should be used (I’ve thought hard about how I can subtly introduce my favourite piece of jargon from this reading – but I am just going to say it – this focus on tools and analysis is referred to as ‘analycentric’).  For the most part, tools support their users to understand the situation (problem) and to consider the cost, effectiveness and impact of possible interventions.

Turnpenny et al (2015, p.21) introduce a typology of policy formulation tools which gives examples of tools that are helpful in different policy formulation tasks – problem characterisation/evaluation, specification of objectives, and options assessment/policy design.  The interesting thing about Turnpenny et al’s typology is that none of the examples are what I would characterise as ‘systems tools’.  This is odd, because ‘systems analysis’ and ‘policy analysis’ were born in the same post-war fever for rational analysis, and every so often I have spotted that systems thinkers and policy process advocates have ‘rubbed shoulders’ and drawn on similar intellectual traditions – but maybe that should be the topic of another blog.

I have used policy analysis tools as an example here, but there are other ‘live’ forms of process advocacy.  For example, the interest in evidence-based policy leads to the idea that policy practitioners should consult, or even conduct, systematic reviews of literature to inform the way they proceed.  Similarly, the interest in participatory democracy has led people to advocate for the use of participatory techniques, such as citizens’ juries and participatory budgeting.

Systems practitioners, including myself, advocate for ‘systems approaches’ or ‘complexity-friendly’ approaches to be used in the policy process.  We do that because our experience (which may include doing or reading formal research) leads us to claim that they are helpful for understanding, and considering how to act in, situations which are characterised by interdependent variables, multiple perspectives and ethical/political conflict.  A lot has also been written about these tools and how they could be used – either in books focusing on a single approach or in books compiling a number of approaches.  The tools aren’t only written about from the perspective of their use for policy (they can also be used for strategy development in organisations), but many of the case studies describe the analysis of situations where government/public attention and action is expected.

In advocating for the use of ‘systems approaches’ in the policy process, we ‘compete’ with those who advocate for the use of econometric analysis techniques, forecasting, risk assessment, cost-benefit analysis and so on.  We also ‘compete’ with those advocating for tools supporting evidence-based policy making, and with those advocating for participatory techniques.  In other words, we add to the ‘noise’ of people claiming they have a ‘better’ way of doing policy.

Turnpenny et al (2015) comment that little is known about how policy analysis tools are actually used in practice.  This is what resonated when I saw Luke’s tweets – we also know very little about how systems tools are actually used within the policy process.


Political scientists, such as Paul Cairney (blog: https://paulcairney.wordpress.com/), have argued that if we would like ‘science’ to be considered in policy (as is the case with the evidence-based policy movement) then it is important to pay attention to what political science understands about how policy is made.  I think the same is true when we are advocating for approaches/tools to be used as part of a ‘better’ policy process – we have to consider the real-life work and context of those involved, and the other normative expectations that shape their work.

As I have been recently reviewing literature on the work of policy practitioners, I have seen some interesting insights into the use (or non-use) of approaches and tools…

In Canada, there have been a series of surveys investigating policy capacity at different levels of the multi-level governance system.  The surveys include a question asking respondents how frequently they use a range of policy analysis tools, ranging from ‘soft’ techniques such as brainstorming to ‘hard’ techniques such as quantitative modelling.  The studies consistently find that practitioners use informal and simple techniques, such as brainstorming or checklists, much more frequently than formal, complex ones (Howlett 2009a, Howlett 2009b, Howlett and Newman 2010, Bernier and Howlett 2012, Craft and Daku 2016).  This informality is also identified in a qualitative study carried out in Australia – Gleeson (2009) notes that practitioners describe undertaking analyses but that there “was little evidence of the formal application of policy analytic techniques described in the literature” (p.142).

A different study, of public servants in Quebec, Canada, identified that 58% of respondents had never heard of systematic reviews and just 19% had consulted one in the previous twelve months (Bédard 2015).

The same sort of picture is found in relation to participatory techniques.  Cooper and Smith (2012) interviewed participation practitioners who work in a consultancy role in Britain and Germany.  They identify a wide range of participatory techniques that the practitioners draw on.  However, they comment that the textbook forms of these techniques aren’t really used; principles and tools are often adapted and blended with others during a piece of work.

So, policy practitioners do not do what the approaches/tools-oriented literature advocates that they should do.  None of the authors seem to go on to question the tools/approaches themselves, and whether what they have said is ‘desirable’ is actually ‘feasible’.  The question does arise, though: why is it that policy practitioners don’t use these desirable tools that can help them contribute more effectively to a better policy process?  There are some comments in the literature about issues of education and training – in short, whether, in the eyes of the researcher, policy practitioners are ‘qualified’ to do what they do.

But I think it is important to take the context into account.  Policy practitioners work in an environment where they need to respond to the changing needs and expectations of political or managerial leadership (which are often in turn influenced by public/media opinion), so their workload is constantly being re-prioritised (Baehler and Bryson 2008; 2009).  Policy practitioners spend a considerable amount of their time firefighting (Wellstead et al 2009; Howlett and Newman 2010), meaning they don’t necessarily have the time to do work that requires in-depth technical focus and coordination.

Furthermore, even when policy practitioners do manage to create the space to draw on their systems thinking knowledge, they have to account for their work using different narratives.  A study of a natural resource management project in Australia identified that the public servants’ aspirations for a way of working informed by constructivist, soft systems principles were ‘subverted’ by the project management and evaluation practices associated with dominant public administration practices (Boxelaar et al 2006).

It’s only a small set of insights, but these research studies do resonate with my own experience as a policy practitioner.  As someone who has had formal training in systems approaches, I rarely identified a context where I could design whole pieces of work explicitly introducing them to others and using them.  They require a context where people have space to develop a new language, to think differently and to challenge the context within which they work – and these sorts of spaces are a luxury I was never able to create.  In the volatile world of policy work, the ‘tortoise’ approaches unfortunately get little space.

However, it is crucial to point out that I did ‘use’ systems thinking every single working day – my knowledge of systems concepts, ideas and approaches gave me a language and a wide range of heuristics that I used as part of my sense-making and in informal interaction with others.  For me, systems thinking was an integral part of what Maybin (2013) identifies as the ‘understanding and thinking’ practices engaged in by civil servants.  It wasn’t, however, part of what Maybin (2013) refers to as ‘legitimating and justifying’ practices – the more public-facing part of policy.

So, now I have got to the point when I am pondering these questions…

…why is it that systems tools/approaches get so little airing in mainstream texts on policy analysis tools?  Do policy studies researchers and systems researchers ‘rub shoulders’ or engage in debate often enough? If not, why not?

…what do systems practitioners mean when they advocate for systems thinking to be ‘used’ in the policy process?  Is it about the use of approaches/tools? Or is it about the cognitive processes and attitudes of those involved in policy?  Or is it about the interaction of both?

…if systems practitioners do think that the approaches/tools should be used, how can they be adapted for the ‘real-world’ of policy work?  How can they not just be desirable but feasible in a fast-paced, volatile context?

…how do those engaged in growing the systems competence of individuals do so in a way that recognises the ‘real-world’ of policy work?  Is it right to advocate the use of a single approach, and to train people in its use, when novice systems practitioners are in a context which is not conducive to using that approach?  Doing only this sets people up to either (a) feel a failure, (b) get disappointed with systems approaches, or (c) both.  It seems to me that it is more important to support novice systems practitioners to draw on and blend different systems ideas and approaches, and to use them ‘internally’ as part of sense-making, than to advocate the ‘pure’ use of any single approach.

…I am sure there will be more…

This post by Helen Wilding is re-posted and you can find the original here.

References

Baehler, K. and Bryson, J. (2008), Stress, Minister: government policy advisors and work stress. International Journal of Public Sector Management, 21(3), pp.257–270.

Baehler, K. and Bryson, J. (2009), Behind the Beehive buzz: Sources of occupational stress for New Zealand policy officials. Kōtuitui: New Zealand Journal of Social Sciences Online, 4(1), pp.5–23.

Bédard, P.-O. (2015), The Mobilization of Scientific Evidence by Public Policy Analysts. SAGE Open, 5(3). Available at: http://sgo.sagepub.com/content/5/3/2158244015604193.abstract.

Bernier, L. and Howlett, M. (2012), The Policy Analytical Capacity of the Government of Quebec: Results from survey of officials. Canadian Political Science Review, 6(2–3), pp.281–285.

Boxelaar, L., Paine, M. and Beilin, R. (2006), Community engagement and public administration: Of silos, overlays and technologies of government. Australian Journal of Public Administration, 65(1), pp.113–126.

Cooper, E. and Smith, G. (2012), Organizing Deliberation: The perspectives of professional practitioners in Britain and Germany. Journal of Public Deliberation, 8(1). Available at: http://www.publicdeliberation.net/jpd/vol8/iss1/art3 [Accessed November 5, 2016].

Craft, J. and Daku, M. (2016), A Comparative Assessment of Elite Policy Recruits in Canada. Journal of Comparative Policy Analysis: Research and Practice, pp.1–20.

Gleeson, D. (2009), Developing policy leadership: a strategic approach to strengthening policy capacity in the health bureaucracy. PhD Thesis. Australia: La Trobe University.

Hill, M. (2013), The Public Policy Process, Sixth Edition. Harlow: Pearson Education Limited.

Howlett, M. (2009a), A profile of B.C. Provincial Policy Analysts: Troubleshooters or Planners. Canadian Political Science Review, 3(3), pp.50–68.

Howlett, M. (2009b), Policy Advice in Multi-Level Governance Systems: Sub-National Policy Analysts and Analysis. International Review of Public Administration, 13(3), pp.1–16.

Howlett, M. and Newman, J. (2010), Policy analysis and policy work in federal systems: Policy advice and its contribution to evidence-based policy-making in multi-level governance systems. Policy and Society, 29(2), pp.123–136.

Maybin, J. (2013), Knowledge and Knowing in Policy Work: a case study of civil servants in England’s Department of Health. Edinburgh: University of Edinburgh. Available at: http://kingsfundlibrary.co.uk/publications/maybin_phd_thesis_2013.pdf [Accessed February 12, 2016].

Turnpenny, J., Jordan, A.J., Benson, D. and Rayner, T. (2015), Chapter 1: The tools of policy formulation: an introduction. In A. J. Jordan and J. R. Turnpenny, eds. The Tools of Policy Formulation: Actors, Capacities, Venues and Effects. Cheltenham, UK: Edward Elgar Publishing, pp. 3–30.

Wellstead, A.M., Stedman, R.C. and Lindquist, E.A. (2009), The nature of regional policy work in Canada’s federal public service. Canadian Political Science Review, 3(1), pp.34–56.
