Emergency material and financial aid project

Over the last few months my colleagues at the Public Service Research Group and I have been working with the ACT Government on its Emergency Material and Financial Aid (EMFA) program.  The ACT Government is considering the nature and effectiveness of its EMFA program, and we undertook an evidence review of EMFA programs, highlighting key tensions and issues in relation to Emergency Relief.  In this review we set out an overview of the types of services these programs comprise, the evidence base around their effectiveness, and the types of challenges and issues that EMFA services encounter.

Earlier this year we used this evidence review to facilitate a discussion with a range of stakeholders in the ACT about the degree to which local patterns in EMFA services match those in the literature, and what might be done to further develop existing EMFA services.  The ACT Government is now embarking on a co-design approach that will seek to draw on key stakeholder perspectives and on best practice from the evidence base.

You can find the full evidence review here.

Evaluating outcomes in health and social care

Back in 2008 I wrote a book called Evaluating outcomes in health and social care, part of a five-book series called ‘Better Partnership Working’ aimed at students and practitioners working in and around health and social care environments.  This particular book focused on reviewing the evidence for collaborative working and provided an overview of the different approaches used in evaluating joint working arrangements.

Last year Janine O’Flynn and I revised and updated the original book and published a second edition of the text.  We updated the evidence base and expanded it to take a more international perspective on the issues.

The reviews have started filtering through for this text (and for the broader series) and I will share them here as they emerge.  The first is from Emma Miller at the University of Strathclyde, who has worked extensively on the evaluation of collaboration and of health and social care outcomes.  Her review of the book can be found here, and in it she provides a very helpful overview of the content for those who might be interested in this text.

Evaluating collaboration with POETQ

I have written on this blog a number of times about the very good fortune I have to work with some great people, and one of my most recent publications was co-written with a fantastic former colleague, Stephen Jeffares (Jeff to most who know him).  I can’t say enough about how much I enjoy working with Jeff, who is a fount of amazing and wonderful ideas and also a cracking musician.

Jeff and I did our PhDs at the University of Birmingham at around the same time, in broadly similar areas although in different departments, and didn’t really know one another until we both took up academic roles at Birmingham.  When we got chatting about our work we discovered that both of us had created approaches to evaluating partnerships, albeit from slightly different perspectives.

We had both identified limitations in our work and wanted to keep certain aspects and improve on others.  We agreed that some sort of online tool able to explore certain process aspects of collaborative working was helpful (a feature of my POET tool), and that a process facilitating exploration of the range of different perspectives on the aspirations of collaborative working was also needed.  In relation to the latter, Jeff had used a Q methodology approach in his PhD and found it helpful in exploring different perspectives on collaborative working.  In my own PhD I had struggled to get professionals to articulate their aspirations for collaborative working satisfactorily, and I could see a lot of value in the Q methodology process for overcoming some of the limitations of my approach.

We agreed to combine the insights gained in our respective PhD projects and apply these to a new tool for evaluating collaborative working.  We had the good fortune of being successful in a funding application to evaluate the outcomes of joint commissioning approaches in England which afforded us a small amount of money to do some development work on the new tool.

I learned an important lesson at this point: getting some money to support the development of an online tool is the easy bit!  Yes, it may have taken three months to craft a 30-page research funding submission, but this was nothing compared to the challenge of trying to give some money to someone in return for what we wanted.

We met with a whole range of different people who do development work but were not really happy with any of the ideas.  Most of the conversations seemed to revolve around why it was too difficult to do what we wanted within our budgetary constraints.  To be clear, we weren’t really asking for that much – just not an automated form, which is what most people wanted to develop for us and which we could have built ourselves.

Jeff had the great idea of advertising in the computer science building for students, and this is how we met Greg.  Greg said yes to all of our stupid ideas, was bright and really knew his stuff (not that we did, but he was headhunted by Apple straight out of his undergraduate degree and moved off to California, which proved it for me).  Greg did some great work, and between us we eventually got to the POETQ tool that exists today.

We used POETQ in the joint commissioning work and also in a number of other evaluations of collaborative working arrangements.  Jeff does a lot of work with graduate students and academics interested in Q methodology, and we make this online application (which is considerably quicker than a manual sorting process) available to pretty much anyone who will talk to us (nicely) and will use it for good.  So although originally designed to evaluate collaborative working, POETQ can be customised, through the statements it contains and the questions it asks, for use on a whole host of different topic areas.  I don’t think a fully automated online Q sorting system existed in such an accessible form before this, so the tool has garnered attention (both positive and negative) from the Q methodology community.
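
For readers unfamiliar with the mechanics: in a Q sort, participants place a fixed set of statements onto a quasi-normal grid running from ‘most disagree’ to ‘most agree’, and the completed sorts are then correlated and factor-analysed to surface distinct viewpoints.  The sketch below shows the two core operations any automated Q sorting tool has to support; the grid shape, names and numbers are illustrative assumptions of mine, not POETQ’s actual implementation.

```python
from collections import Counter

# Hypothetical forced quasi-normal grid for a 34-statement Q sort:
# column value (most disagree -4 ... most agree +4) -> required statement count.
GRID = {-4: 2, -3: 3, -2: 4, -1: 5, 0: 6, 1: 5, 2: 4, 3: 3, 4: 2}

def is_complete_sort(sort: dict) -> bool:
    """A sort maps statement id -> column; it is valid once it fills the grid exactly."""
    return Counter(sort.values()) == Counter(GRID)

def correlate(sort_a: dict, sort_b: dict) -> float:
    """Pearson correlation between two completed sorts, the starting point
    for the by-person factor analysis used in Q methodology."""
    ids = sorted(sort_a)
    xs = [sort_a[i] for i in ids]
    ys = [sort_b[i] for i in ids]
    n = len(ids)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)
```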

Anyway, the paper that Jeff and I just published in Evaluation is, I guess, an academic version of the story of how we got from our separate PhD projects in broadly similar areas to developing POETQ, the different ideas that underpin the tool, a little about its application in the joint commissioning project, and some of the challenges of evidence and complex policy initiatives.  You can read the full version here, and we’d be keen to hear from anyone who has used this type of approach in evaluating collaborative working or who might want to use the tool in their research.

The rise of experimental government

I just came across this blog from David Halpern (National Adviser on What Works and CEO of the Behavioural Insights Team, UK), which is concerned with the lack of empirical evidence used to guide decision making in many areas of public services.  In the UK a series of ‘What Works’ centres has been established to generate ‘good empirical studies’ in a range of different welfare service areas.

Now, I am all for using more evidence to inform policy and practice (working in a university, how could I be against this?).  However, I suspect that in some of the areas the What Works teams are talking about, the problem is not necessarily a lack of evidence but the weighing up of a series of complex value judgements.  These are typically not easily resolved by more evidence about effectiveness.

Anyway, I have written about this blog not to get into a debate about evidence and decision making; if you have been in one of my policy design and implementation classes you will have heard all about my stance on this.  I raise it instead because of the last section of Halpern’s piece, which discusses the idea of radical incrementalism.  Halpern explains:

‘Radical incrementalism is the idea that dramatic improvements can be achieved, and are more likely to be achieved, by systematically testing small variations in everything we do, rather than through dramatic leaps into the dark. For example, the dramatic wins of the British cycling team at the last Olympics are widely attributed to the systematic testing by the team of many variations of the bike design and training schedules. Many of these led to small improvements, but when combined created a winning team. Similarly, many of the dramatic advances in survival rates for cancer over the last 30 years are due more to constant refinements in treatment dosage and combination than to new ‘breakthrough’ drugs. Applying similar ‘radical incrementalism’ to public sector policy and practice, from how we design our websites, to the endless details in jobcentres to business support schemes, we can be pretty confident that each of these incremental improvements can lead to an overall performance that is utterly transformed in its cost-effectiveness and overall impact.’
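
To make the ‘systematically testing small variations’ idea concrete, the arithmetic behind a single trial in such a programme is simple: compare one small variation against the status quo and ask whether the observed difference is bigger than chance.  A minimal sketch using a standard two-proportion z-test is below; the jobcentre letter scenario and all of the numbers are entirely hypothetical.

```python
from math import erf, sqrt

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test: is variant B's success rate different from variant A's?"""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / std_err
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Hypothetical trial: a reworded jobcentre letter lifts appointments kept
# from 520/5000 (control) to 585/5000 (variant).
lift, p = two_proportion_z_test(520, 5000, 585, 5000)
print(f"lift: {lift:+.2%}, p-value: {p:.3f}")  # lift: +1.30%, p-value: about 0.04
```

No single trial like this transforms a service, of course; the ‘radical’ part comes from running many of them and keeping every small win.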

Helen Sullivan talked about this very same idea in the Imagining the 21st century public servant workforce report, which we published last year.  What we argued in that report was that if the approach of radical incrementalism is to be effective, the first task must be to ensure we all have a sense of what we are working towards.  We both use the example of British cycling, where I guess the aim is to go faster for longer; the aims in areas such as local economic development or early education might be rather more complex.  Without a sense of strategic aim, evidence cannot play the role in the process that it might.  In our research into the Australian public service, one of the things we heard frequently was about the lack of strategic oversight and horizon scanning – and I don’t think Australia is alone in this.  Having established where we want to go, having good evidence to back up this process and to help us track progress is, of course, absolutely crucial.

I would like to hear from any individuals or teams who have experimented with this notion of radical incrementalism in recent practice, and about your experiences.  Did it turn out to be as radical as you had hoped?  Did you reach the end point?  What helped and hindered the process?

POETQ

For my PhD research I invented an online evaluation tool to use within health and social care partnerships.  This was called the Partnership Online Evaluation Toolkit – or POET, as it became known.  Whilst POET served its purpose for the PhD, there was quite a bit I wanted to do to improve it before using it again in broader studies.  My fantastic colleague Stephen Jeffares had also been working on the evaluation of collaborative working and had used a Q methodology based approach in his PhD research.  POETQ (the addition of a Q methodology approach to the POET tool) came about as an integration of the work that Stephen and I had done.

Stephen and I were part of a team that received funding from the UK National Institute for Health Research to investigate the topic of joint commissioning, and we proposed to use POETQ as one of the major tools of data collection.  We hired a web designer, Greg Hughes (at that time a University of Birmingham undergraduate, now based in San Francisco working for Apple), to help us develop a better application through which to deliver POETQ.  You can find out more about POETQ and see screenshots and a video of it in action here.  If you are interested in using this as a research method then please get in touch – we can make it freely available to people with certain affiliations.  You can read more about how we used POETQ in the joint commissioning research report, which you can access here.  It is also mentioned in Stephen’s great book Interpreting hashtag politics, which is pictured below and which I thoroughly recommend to anyone interested in the rise and fall of policy ideas.

[Image: cover of Interpreting hashtag politics]