Cense Ltd.

Inspiration, Innovation and Impact


Chief Learning Officer

2019-09-13 by cense

C-suite leadership roles focus on an organization’s most important functions. Time to introduce the role of the Chief Learning Officer.

Learning — easier said than done. Yet, learning is vital to the success of an organization that seeks to innovate to gain advantage or merely survive – which is most human service organizations these days.

Learning opportunities abound, yet these require energy and attention in order to take advantage of them. Organizationally, this requires leadership and resources to support people across the organization to learn within their areas of focus and across the institution and networks.

Everyone is responsible for learning and there are some great resources to support that effort, but without someone taking explicit leadership on making sure learning happens within the institution, it’s less likely to happen — at least happen in a way that is designed for innovation.

Introducing: the Chief Learning Officer.

Leading Learning

With the alphabet soup of C-suite titles we see among organizations’ leadership teams, adding another might not seem helpful. What we propose is less about formalizing the title of CLO and more about creating the function of what a CLO can do within an organization.

We envision a CLO role as one that does the following:

  • Establishes a learning plan for the organization and the data structure to support that learning. This means instilling and building a culture of evaluation across the organization, one that provides data and feedback on what is happening and allows staff at all levels to learn from what is being done. It involves showing what evaluation can do and co-creating ways to do it across the enterprise to add learning value.
  • Ensures that staff roles and functions include the ability to study and reflect on the work being done and its impact. This means establishing practices and procedures that link evaluation data to program activities. It also involves creating the means to bring in insights from outside sources (e.g., published research and reports, networks, professional communities, customers and clients). Structuring what is done, and how and when it is done, is part of this function, ensuring that roles and learning needs are fit-for-purpose.
  • Organizes the evaluation of program activities and ensures that staff close to each program, at all relevant levels of the organization, have access to information about those programs and can make decisions about them without going through cumbersome layers of bureaucracy.
  • Creates sensemaking channels and opportunities throughout the organization. This enables collaboration within and across departments and units to understand the bigger picture of what is going on in the organization and its industry.
  • Supports the development of self-sustaining communities of practice or learning groups on topics relevant to the organization but not tied to specific roles or functions. Topics might include emerging technologies, leadership, creative thinking, or professional development.

Creating your CLO Office

A CLO would link the organization’s activities to the monitoring and evaluation data about those activities, connect both to literature and trend data from outside the organization, and weave it all into a culture of learning within the organization.

This could be a full- or part-time position, or a fractional CLO role like the one we can play at Cense.

Whether you create a CLO role within your organization or choose to recruit learning support from outside, having a dedicated person shepherding your organization toward a learning culture will substantially increase your innovation capacity.

For more information about how you can build this learning culture within your organization or the fractional CLO role, contact us. We’d love to help you out.

Filed Under: Learning, Research + Evaluation Tagged With: culture of evaluation, developmental evaluation, evaluation, learning, organizational change, organizational learning

Resourcing Developmental Evaluation: The First Trap

2018-05-14 by cense

In a recent post, we introduced Developmental Evaluation (DE) and some of the traps organizations encounter when using it in their work. In this second post in our series on DE and its traps, we look at the issue of resourcing.

At a distance, viewed through a frame that constrains what you see in front of you, resourcing looks relatively simple. Most evaluations are designed from this kind of perspective. Developmental Evaluation requires a different kind of frame and the resourcing to support its deployment.

We see four keys to resourcing a DE.

Money

Let’s get the money issue out of the way first.

Evaluation budgets are commonly recommended to be set at 5 – 10% of a program’s budget, although it’s widely acknowledged that real budgets often fall far short of this in practice. Indeed, some suggest the percentage should be higher, while The Hewlett Foundation recently published a benchmarking study in which it found it was able to spend much less on its evaluation to achieve impact (largely due to its size and internal capacity).

Whatever percentage an organization seeks to devote to its evaluation budget, that budget almost certainly must increase for a DE. The reason has to do with the additional human and time resources, and the related coordination costs, associated with doing DE work. The methods and tools associated with a DE may not differ much from those of a conventional evaluation, but their deployment, and the kind of sense-making that comes from the data they generate, are what set a DE apart and require resources.

We are reluctant to recommend a firm percentage; rather, we encourage prospective evaluands (i.e., clients, partners) to consider the question:

What are you hiring a Developmental Evaluation to do for you?

This gets at the matter of purpose. DE is about supporting strategic decision-making for innovation, so the true budget for implementing a DE must include the time, personnel, and associated resources needed to integrate what emerges through a DE into the program, service, or product development stream. That requires energy and focus, and it adds to the total cost.

We suggest doubling your evaluation budget at a minimum, although keep in mind that some of that cost might come from budget lines previously earmarked for strategy development, program development, and design, so the overall value of a DE is likely far higher than that of a standard program evaluation.
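To make this concrete, here is a hypothetical illustration (the figures are ours, not a benchmark): a program with a $500,000 annual budget following the 5 – 10% guideline would set aside roughly $25,000 – $50,000 for a conventional evaluation. Doubling that for a DE puts the range at roughly $50,000 – $100,000, with part of the increase potentially covered by those existing strategy, program development, and design budget lines.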

Time

While the cost of a DE might be higher, the biggest ‘expense’ might be time. Some programs like the idea of a ‘plug-and-play’ evaluation, where an evaluator designs an evaluation and then largely disappears once it’s been approved and initiated. That doesn’t work for DE.

DE requires participation from program staff, management, and potentially other stakeholders to work. An external consultant, no matter how skilled or knowledgeable, cannot provide all the answers to the questions that a DE will generate, nor make the decisions necessary to develop a program or process. Questions will come from the data collected — usually in some mixed-methods form — which illuminate emergent, dynamic properties in a program that, because of their very complexity, require diverse perspectives to understand. Those perspectives must be gathered, facilitated, and integrated into a set of decisions that can be taken forward.

The time required will depend on the type of data collected in the evaluation design, the regularity of the meetings, and the complexity of the context, over and above what is necessary to gather, analyze, and otherwise manage the data.

The biggest reason organizations fail in their DE efforts, and in their broader attempts to innovate, is that they do not invest the time required to truly learn.

Expertise

Who is best suited to conduct your evaluation? Do you bring in an external consultant or stick with an internal evaluator? We recommend both, together. External evaluators bring the emotional and perceptual distance to see patterns and activities within the organization that may be ‘hidden in plain sight’ to those inside it. However, this distance also means that the external evaluator likely misses some of the nuance and context that is absolutely critical to supporting a DE — something a well-placed internal evaluator can often access.

DE is relatively new, and the processes, tools, and outcomes associated with sustained innovation are also relatively unfamiliar to many organizations. Add to this the need for ongoing learning and organizational development through the process of DE, and it becomes easier to see why bringing in a skilled consultant matters. DE requires expertise in evaluation methods and tools, group facilitation, behavioural science, practical knowledge of complexity, design, and organizational development — whether that expertise sits with one person or a group.

However, what makes DE most effective at supporting innovation is a sustained commitment to nurturing these skills and abilities within the organization. For that reason, we recommend bringing on consultants who can not only do DE but help their clients to learn DE and build their capacity for learning through evaluation and for evidence-informed design.

If an organization can build those capacities, they are set up for innovation success (which includes learning from failure).

Focus

Time, expertise, and focus are the three non-financial resources required for DE and innovation. Focus — the application of attention to something — cannot be marshaled periodically for DE to be a success; it has to be ongoing. While DE does involve specific milestones and benchmarks, it is principally about viewing a program as a living enterprise. Treat it like a pet that requires continued engagement, not a fence that needs a coat of paint every few years.

Ongoing learning happens by design. As an organization, you need to provide the means for your staff and leadership to truly learn. We have consistently encountered ‘knowledge organizations’, whose primary product is new knowledge, where staff and senior management spend less than an hour per week on activities such as reviewing literature, training, or synthesizing educational material. That is an absence of praxis (learning while doing) and a lack of focus on what the organization is all about.

Doing DE requires a focus on learning, a commitment to using data to inform decisions, and a willingness to deal with uncertainty. These are not one-time requirements; they must be managed as an ongoing process, part of building a culture of evaluation.

Resources for resourcing DE

Are you resourced for Developmental Evaluation?

If you are, or if you need some help preparing for one, contact us and we can show you new ways to build your capacity for innovation while expanding your possibilities to learn and grow.


Filed Under: Research + Evaluation Tagged With: culture of evaluation, design, developmental evaluation, evaluation, innovation, innovation design, resources, strategy, time

Design meets evaluation: AEA 2016

2016-04-11 by cense

Design + Evaluation

The theme for this year’s annual American Evaluation Association (AEA) conference is Evaluation + Design: bringing greater understanding and dialogue between what is created and how its value and impact is assessed. It is a theme that resonates strongly with the work of our principal, Cameron Norman. Speaking on the need for this focus, AEA President John Gargani adds:

“Everything we evaluate is designed. Every evaluation we conduct is designed. Every report, graph, or figure we present is designed. In our profession, design and evaluation are woven together to support the same purpose—making the world a better place. This is the inspiration for the 2016 theme: Evaluation + Design. In 2016, we will be diving into this concept looking specifically at three areas—program design, evaluation design, and information design” – AEA President, John Gargani.

Sheila Robinson, an evaluator who also runs the AEA’s amazing AEA365 daily blog on evaluation research and practice, recently wrote on the need to create a culture of evaluation within an organization for it to be successful.

The idea of pairing design with a culture of evaluation that lets both flourish is compelling, and it is something we’ve encouraged in our client work.

It’s one thing to argue for design as a competitive advantage and evaluation as a means for building, sustaining and enhancing performance, but it’s far more difficult to build cultures that nourish and sustain both good design and evaluation. Yet, it is the integration of design and evaluation that is the key to innovation and doing it sustainably.

If you’re interested in learning more about this and the issues associated with design and evaluation, consider attending the upcoming conference in Atlanta and following the AEA365 blog (and this one) in the months building up to this event. And for more information on how to build a culture of evaluation and design in your work, contact us at: info @ cense.ca.

Photo credit: Design by Piotr Mamnaimie used under Creative Commons License via Flickr.

Filed Under: Design, Research + Evaluation Tagged With: American Evaluation Society, Atlanta, Cameron Norman, culture of evaluation, design, evaluation
