Creating Design Pathways for Learning

Capturing learning requires a focus on the journey, not just the end. Thinking like a designer can shape what we learn and how.

Learning is both a journey and a destination, and recognizing this allows us to better facilitate intentional, deliberate learning that supports innovation and development. By approaching this journey through the lens of service design, as a design-driven evaluation, we can better design the data and insights that come from it to support learning.

What is learning?

Learning comes from perception, experience, feedback, and reflection. You first encounter something and perceive it with your senses (e.g., reading, observing, hearing, feeling), then experience it (e.g., movement, action, tension, emotion), which gives you feedback about the perception and experience that is synthesized through reflection (e.g., memory, comparison with related things, contemplation).

Evaluation is principally a tool for learning because it focuses our perception, monitors the experience, provides the feedback, and supports reflection by offering a systematic, structured means to make sense of what’s happened.

Evaluation is simply the means of answering the question “what happened?” in a systematic manner.

For those developing an innovation, looking to create change, or seeking to improve the sustainability of their systems, answering ‘what happened?’ is the difference between demonstrating real impact and having nothing to show.

Mapping the journey with data

A journey map is a tool used in service design to help understand how service users (e.g., clients, customers, patients, students) might encounter a service and system to achieve a particular goal. Journey maps can be displayed visually with great artistry (see here for a beautiful example of the Indigenous cancer patient journey in BC) or simply with boxes and arrows.

It is one of many types of maps that can be created to illustrate the ways in which a user might navigate or approach a service, decision, or pathway to learning.

For innovators and evaluators, these tools present an opportunity to create touchpoints for data collection and to build a deeper understanding of the service throughout the journey. Too often, evaluation focuses on the endpoint or on an overall assessment of the process, without embedding opportunities to learn and to support learning along the way.

We feel this is a lost opportunity.

Without the opportunity to perceive, gain feedback, and reflect on what happens, we are left with experience alone, which isn’t a great teacher on its own and is filled with biases that can shift focus away from some of the causes and consequences of what’s happening. This is not to say that there isn’t bias in evaluation; what makes evaluation different is that it is systematic and accounts for bias in its design.

Service design meets evaluation

Design-driven evaluation means integrating evaluation into the design of a program to generate systematic, structured feedback that supports learning along a service journey. One of the simplest ways to do this is to build a layer of evaluation onto the service journey map.

Consider a detailed service journey map like the patient journey map cited above. Along this winding, lengthy journey from pre-diagnosis to the end, there are many points where we can learn from the patient, providers, system administrators, and others associated with the health-seeking person, and that learning can inform our understanding of the program or system they are in.

By embedding structured (not rigid) data collection into the system, we can better learn what’s happening, in both process and effects. Taking this approach:

  • Identifies activities and behaviours that take place throughout the journey.
  • Provides a lens on the service from the perspective of a user; the same service could be modelled from a different perspective (e.g., caregiver, healthcare professional, health administrator).
  • Identifies the systems, processes, people, and relationships that a person encounters on the way through, by, or in spite of a service.
  • Allows us to identify how data can fit into a larger narrative of a program or service and be used to support the delivery of that service.
  • Anchors potential data collection points to service transitions and activities to help identify areas for improvement, development, or unnecessary features.
  • Provides a visual means of mapping the structural, behavioural, and social processes that underpin the program to test the theory of change or logic model (does it hold up?).
  • Offers opportunities to explore alternative futures without changing the program (what happens if we did X instead of Y, and how would that change the pathway?).

These are some of the ways in which taking a design-driven approach and using common methods from service design can enhance our understanding of a program. Not a bad list, right? And that’s just a start.
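To make the evaluation layer more concrete, here is a minimal sketch (in Python, purely for illustration) of how a journey map’s stages and data-collection touchpoints might be represented. The stage names, questions, and measures are hypothetical, not drawn from any real program.

```python
# Illustrative sketch only: one hypothetical way to represent a service journey
# with an evaluation "layer" attached to each stage. Stage names, questions,
# and measures are invented for demonstration.
from dataclasses import dataclass, field


@dataclass
class Touchpoint:
    """A point along the journey where structured data can be collected."""
    name: str
    questions: list = field(default_factory=list)  # what we want to learn here
    measures: list = field(default_factory=list)   # how we might learn it


@dataclass
class JourneyStage:
    """One stage of the service journey and its evaluation touchpoints."""
    name: str
    touchpoints: list = field(default_factory=list)


# A simplified patient journey with an evaluation layer built onto it
journey = [
    JourneyStage("Pre-diagnosis", [
        Touchpoint("First contact with clinic",
                   ["What prompted the visit?", "How long was the wait?"],
                   ["intake survey", "appointment records"]),
    ]),
    JourneyStage("Diagnosis", [
        Touchpoint("Results delivered",
                   ["Was the information understandable?"],
                   ["brief interview", "follow-up call notes"]),
    ]),
    JourneyStage("Treatment", [
        Touchpoint("Care plan review",
                   ["What is working? What is getting in the way?"],
                   ["patient diary", "provider check-in"]),
    ]),
]

# Walking the map shows where learning opportunities sit along the journey,
# not just at the end.
for stage in journey:
    for tp in stage.touchpoints:
        print(f"{stage.name} -> {tp.name}: {'; '.join(tp.questions)}")
```

The point isn’t the code itself: it’s that each stage carries its own questions and measures, so feedback is designed into the journey rather than bolted on at the end.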

Try this out. Service design tools and thinking models, coupled with evaluation, can provide access to the enormous wealth of learning opportunities that exist within your programs. They help you uncover the real impact of your programs and the innovation value hidden in plain sight.

To learn more about this approach to evaluation, innovation, and service design, contact us. We’d love to help you improve what you do and get more value from all your hard work.

Photos by Lili Popper, Billy Pasco, and Startaê Team on Unsplash. Thank you to these artists for making their work available for use.
