As the grantmaking world comes to grips with the rapid rise of outcomes measurement, there's no shortage of experts, but who knows what really works?
We draw your attention to the keynote speaker at the "Prepare for Impact" Grantmaking in Australia conference, Rory Gallagher. He heads the Australian branch of the Behavioural Insights Team (BIT), a social-purpose company part-owned by the UK government.
BIT began as the world's first government institution dedicated to the application of behavioural sciences, but its lessons have been increasingly applied here in Australia, including by the NSW government and more recently the Vincent Fairfax Foundation.
Dr Gallagher explains that BIT aims to improve outcomes by developing policy based on a realistic understanding of human behaviour. That approach stresses an outcomes focus, and requires a high standard of evidence.
Dr Gallagher spelt out an assessment practice he summarised as "test, learn and adapt".
He said good empirical data generates insights into human behaviour that can better direct funding.
Millions of dollars in funding wasted
Dr Gallagher debunked programs built on false assumptions, such as the US-based "Scared Straight" program, which sent ex-convicts to speak with juvenile offenders to set them on the right path, only to increase offending.
And he highlighted the vast number of government-funded programs - and the wasted millions, or even billions of dollars - that couldn't be shown to work.
He cited a 2014 UK study of programs that showed a "weak or no positive effect" in the following proportions:
Dr Gallagher said those serious about wanting to know whether a program was effective should apply randomised controlled trials (RCTs), in which a test group is compared with a control group unaffected by the proposed intervention.
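The logic of an RCT can be sketched with a short simulation. This is a minimal illustration with entirely hypothetical numbers (the participant count, base success rate and effect size are assumptions for demonstration, not figures from the conference): participants are randomly assigned to treatment or control, and the difference in outcome rates estimates the intervention's effect.

```python
import random

random.seed(42)  # fixed seed so the simulation is repeatable

def run_trial(n_participants, base_rate=0.30, intervention_effect=0.10):
    """Randomly assign participants to treatment or control groups,
    simulate a binary outcome (e.g. 'found employment'), and compare
    the success rates of the two groups."""
    treatment, control = [], []
    for person in range(n_participants):
        # Random assignment is what makes the comparison fair:
        # both groups should be alike apart from the intervention.
        (treatment if random.random() < 0.5 else control).append(person)

    # Simulated outcomes: the treatment group's chance of success
    # is lifted by the (assumed) intervention effect.
    t_successes = sum(random.random() < base_rate + intervention_effect
                      for _ in treatment)
    c_successes = sum(random.random() < base_rate for _ in control)

    t_rate = t_successes / len(treatment)
    c_rate = c_successes / len(control)
    return t_rate, c_rate, t_rate - c_rate

t_rate, c_rate, estimated_effect = run_trial(10_000)
print(f"treatment: {t_rate:.1%}, control: {c_rate:.1%}, "
      f"estimated effect: {estimated_effect:.1%}")
```

With a large enough sample, the estimated effect lands close to the true effect built into the simulation; without the control group, a funder would have no way to tell how much of the treatment group's success would have happened anyway.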
Medical research has led the way in the use of RCTs since adopting the method in the 1950s.
Nowadays, we wouldn't expect to be offered a drug that hadn't been rigorously checked. Yet the same cannot be said for many of the social sector programs that win grants. That trend is changing.
Where are you on the hierarchy of evidence?
Dr Gallagher described anecdote as the lowest standard of proof or evidence, yet it's not unusual to find funders and government agencies that "cherry pick" positive results "alongside a photo op" to suggest a program is effective, he said.
Yet how would anyone know that a particular intervention caused a particular outcome, such as a jump in employment rates, without a control group showing what happened to those who hadn't received the intervention?
While expert opinion was somewhat more compelling than anecdote, "before and after" studies and comparison studies without randomisation were higher still in the hierarchy of evidence, Dr Gallagher said.
BIT's hierarchy of evidence
Randomised controlled trials and systematic reviews sit at the top of the "hierarchy of evidence", and these are the only ways to be sure your program has had the effect you're claiming, or that the grants you have distributed have really had the desired effect, he said.
He told conference delegates that evidence trumped "common sense" and expert analysis, and could help reduce the influence of politics, media and personality in decision-making.
If you're not an AIGM member, join here for as little as $280 a year to receive Grants Management Intelligence, access powerful online grantmaking tools, enjoy a 10% discount to our annual conference and other offers, and connect with a great network.
Are you a SmartyGrants user? Your organisation is eligible for 10 free memberships. More info here.
Leading Australian philanthropist Alan Schwartz is tackling one of the hardest challenges the planet faces: to put a true value on the social and natural capital of the world, including health, literacy, trust, clean water and biodiversity.
An abridged version of Gary Banks' address for the Alf Rattigan Lecture for the Australia and New Zealand School of Government (ANZSOG) points to what's worked in the past, and what can be done to avoid policy on the run.
Leading social impact thinker Ross Wyatt says many funders and grantseekers are trapped by evaluations aiming to prove what they did was right. Here's how to do better.
Our Community's Chaos Controller and executive director Kathy Richardson examines how we might create a sector where there are incentives for using evidence.