The phrase "evidence-informed practice" is used widely in the community sector. It is often invoked as a marker of credibility — a signal that an organisation takes its work seriously. But what does it actually mean to be evidence-informed? And how does that commitment translate into everyday decisions about program design, delivery, and evaluation?

These are questions we have been grappling with at Asset Community, and we want to share our thinking — not because we have all the answers, but because we believe transparency about how we approach our work is part of being accountable to the communities we serve.

Evidence-informed, not evidence-bound

The first distinction we want to draw is between being evidence-informed and being evidence-bound. Evidence-bound practice treats research findings as fixed prescriptions — if a study says X works, you do X, regardless of context.

Evidence-informed practice is more nuanced. It means using research and data as one important input into decision-making, alongside practitioner expertise, participant feedback, and contextual knowledge. It means asking: what does the evidence suggest? What do we know from our own experience? What are we hearing from the people we work with? And how do we make the best decision with all of that in mind?

What evidence we draw on

At Asset Community, our evidence base draws from several sources:

  • Published research on what works in employment services, adult education, and community development — including Australian and international literature
  • Program evaluation data from our own activities and from comparable programs in the sector
  • Participant feedback gathered through structured and informal channels throughout our programs
  • Employer and partner input on what is and is not working from their perspective
  • Sector intelligence from our networks and from the broader not-for-profit and government landscape

Building our own evidence base

We are also committed to generating evidence through our own work. This means designing our programs with evaluation built in from the start — not added as an afterthought — and measuring outcomes that actually matter rather than outcomes that are simply easy to count.

Good evaluation is not about proving that you are doing a good job. It is about honestly understanding what is working, what is not, and what you need to change. That requires courage as much as methodology.

Sharing what we learn

We believe that not-for-profits have a responsibility to share what they learn — with the sector, with funders, and with the communities they serve. Knowledge hoarded is knowledge wasted.

As our programs develop and our evidence base grows, we intend to publish our findings openly — the successes and the failures alike. The sector gets stronger when organisations are honest about what works and what does not.

An invitation to collaborate

If you are a researcher, evaluator, or academic with an interest in employment, education, or community development, we would welcome the opportunity to explore collaboration. Our research and resources program is actively seeking partnerships that can strengthen the evidence base for our work and for the sector more broadly.

Get in touch at [email protected].
