Learning in Complex Contexts with Design-Driven Evaluation

If you’re stuck trying to connect your ambitions, your programs, and your evidence for use in complex contexts, traditional evaluation models won’t get you there.

Design-driven evaluation might.

Complex contexts involve multiple sources of input, changing conditions, and different timescales. They combine uncertainty with resistance to rigid, linear plans and to straightforward use of evidence. Connections between cause and effect may be limited or opaque, which constrains what you can do to effect change with the evidence you collect if you rely on a traditional approach.

Design-driven evaluation brings together systems thinking, utilization-focused evaluation (U-FE), and design thinking. It is inherently connected to strategic design.

Where U-FE asks “what do users want?”, design-driven evaluation asks “what do users actually do, and what could be possible?” It draws on design research — observing behaviour, testing assumptions, and looking beyond stated preferences — to create evaluations that are genuinely fit for use, not just intended for use.

What Design-Driven Evaluation Looks Like 

Design-driven evaluation has six core characteristics:

  • Holistic — design thinking is applied to the entire evaluation, not just the final report. It’s an approach that views evidence and its use as a symbiotic set of activities that inform, guide, and apply the evidence generated to support meaningful program decisions and innovation.
  • Systems-oriented — considers a broader ecosystem of potential users and stakeholders beyond the immediate ones. A DDE approach is grounded in systems thinking and seeks to place evidence and its use within the contexts we define, creating fit-for-purpose approaches aligned with the systems we’re seeking to learn in.
  • Process and outcome focused — committed to outcomes but open to adapting the route. The aim is to generate evidence while recognizing that the context in which that evidence is generated, made sense of, and used is itself changing. That’s why DDE is well suited to complex, dynamic situations.
  • Aesthetics matter — not about making things pretty, but about making them attention-worthy and suited to how people actually engage with evidence and strategic recommendations (e.g., a podcast or blog series instead of a report). It means designing evidence products that fit their intended audiences and users up front. Knowledge translation and evidence use aren’t an afterthought; they are part of the process and the outcomes by design.
  • Informed by research on the evaluation itself — design-driven evaluation considers the environment and context in which the evaluation is taking place, continually revisiting the use cases and contexts. This means DDE must be anchored in a plan, held with an adaptive mindset.
  • Future-focused — uses strategic foresight to design evaluations that will remain useful as circumstances change, with the anticipation of future situations and use-cases informing the design.

Connection to Principles-Focused Evaluation 

Design-driven evaluation is also connected to principles-focused practice, particularly around accessibility — designing not just for cognitive and physical access, but for inclusion in the real rhythms of how people work and make decisions. As a practice, DDE aims to make evidence, and the means of using it to inform innovation, strategy, and adaptation, more accessible to everyone involved in the decisions and outcomes associated with an evaluation.

DDE is not a common approach to evaluation, but it is a powerful one. If you’re looking to innovate and make sustainable use of your evaluation data to improve your impact, reach out.

Reference: Norman, C. D. (2021). Supporting systems transformation through design-driven evaluation. New Directions for Evaluation, 2021(170), 149–158. https://doi.org/10.1002/ev.20464

And the direct link to the article on Wiley Online Library: https://onlinelibrary.wiley.com/doi/abs/10.1002/ev.20464
