Updates to Evaluating Social Innovation Prototypes: A Guide

Photo from Evaluating Social Innovation Prototypes: A Guide. The Experimental Niche for Prototyping. 

By Mark Cabaj

“Evaluating Social Innovation Prototypes: A Guide” is a practical resource for social innovators, and their supporting facilitators and evaluators, who want to make more effective use of prototypes in order to address complex social challenges.

In this follow-up blog, Mark Cabaj sheds light on the latest updates to the Guide, reasons behind the changes and the impact they bring to prototyping effectiveness.

Why did you decide to make updates to the Guide so quickly?

We were committed to modeling one of the basic principles of innovation: develop something that is 'roughly right', get it out there, pay attention to the reaction to it, and upgrade whatever you've originally produced. It's usually a policy proposal, a new model of decision-making, or a new technology. In this case, it was an evaluation guide.

How was feedback collected, and how has it influenced the changes made? What kind of challenges did the users face to prompt these updates?

It was spontaneous and informal. Several people sent me emails directly. A few of the people who provided input to the Guide gave me a call. I also set up a Google Alert to monitor where and when people reacted to the Guide online. It was neither scientific nor exhaustive, yet still very effective.

Can you elaborate on the updates made in regard to the experimental niche for prototyping on page 8? (Diagrams)

In her online reflection on the Guide, a veteran social innovator pointed out that we made an 'unfortunate' adaptation of the now-classic NESTA innovation swirl, which identifies the important niche role that prototyping plays in the process of innovation: we neglected to describe how many (though not all) social innovation processes begin with some type of 'framing' of the challenge to be addressed and the key features that social innovators should consider in exploring, developing and testing solutions.

I agree it was a miss. It is important to acknowledge that many social innovation prototypes emerge in response to an (in)formal framing process and/or challenge statement. We should show that. Why? Because it shapes (1) what kind of solutions get developed and (2) the kind of criteria that social innovators should consider when evaluating their possible solutions. For example, if a group frames the innovation challenge as dramatically improving the equitable uptake of a certain community in accessing the Canada Learning Bond, then 'equitable access' – or something like it – should be a key criterion for assessing the effectiveness of promising prototypes. So, we upgraded the diagram and referenced the framing process several times throughout the rest of the Guide.

Were there any significant changes to the overall process used for evaluating prototypes? If so, what influenced these changes and how do they add to a more thorough evaluation process?

No, not the overall process. But we did add an important clarification on the niche application of the approach we laid out in the section, How to Use This Guide (page 5), by adding a simple paragraph:

This guide is for change-makers who operate with a very structured, design-oriented approach to social innovation prototyping (see page 9), which invites a similar approach to evaluation. It does not reflect a more organic, improvisational approach to prototyping often used in social change efforts, which requires a more open and flexible approach to learning and evaluation.

It may seem trivial. It's not. In fact, it's been on my mind ever since Frances Westley – a giant in the social innovation field – pointed out to me several years ago that so much social innovation is organic, not designed, and that the evaluation field had to account for that. I know this is true. However, I – and many of the people who provided input to the Guide – am heavily influenced by a strong planning or design orientation to social innovation, and we defaulted to that orientation when creating the Guide.

This does not mean that the Guide we offered is wrong. In fact, I am more confident than before that it helps fill an important gap in our field with a comprehensive set of useful ideas and practices. It's just important to point out that the Guide reflects a strong design – rather than organic or emergent – orientation to the work, and people should keep that in mind when they use it.

Additional resources have been added to the appendix. How do you believe they enhance the overall value and effectiveness of the Guide for its users?

The more resources, the better. People like to see how big ideas translate into practical action. I encourage people to keep their eyes open to spot more of them and create their own inventory.

What is the next chapter for evaluating social innovation prototypes?

There is so much more to do! We clearly need to work on how best to evaluate social innovation prototypes that are part of a more organic social innovation process. I – and others – am also working on resources related to 'portfolio' evaluation, social innovation labs (a bigger platform for social innovation prototypes), and practical ways to integrate equity, diversity and inclusion into the evaluation of social innovation initiatives.

Read the first blog about the Guide here.

Watch the recording of Evaluating Social Innovation Prototypes webinar. (February 20, 2024)

View the presentation slides.
