
How to Measure the ROI of Design Thinking

14th January 2017

New research out of Stanford’s d.school identifies innovative ways to measure the impact of design thinking.

Have you ever worked in an environment where the sales team becomes irritated, even hostile, toward the engineering or marketing team? Or have you worked for an incompetent manager? Have you ever worked in a place where petty politics creates friction and slows progress?


I'm assuming you've answered "yes" to at least one of these. Design thinking helps teams cooperate and collaborate again to find better solutions. It's that simple.

When clients ask how they should measure the return on investment (ROI) of design thinking, it's important to communicate both its quantitative and its qualitative value.

Through experience, I know design thinking workshops increase productivity and performance, saving time and money. They help product teams work more harmoniously and lead to better product outcomes. Measuring that impact, though, calls for a Butterfly Effect mindset: tracing how small changes in the way a team works ripple out into larger business results.

After all, how do you measure the effect on the bottom line of avoiding the build-out of technology that would have failed to meet user needs? How do you measure the impact of establishing a culture focused on customers rather than on efficiency and productivity alone?

Fortunately, new research offers clearer frameworks to measure the impact of design thinking in the context of business.

Measuring the Impact of Design Thinking from Multiple Perspectives

In 2015, a research team associated with Stanford’s famous d.school surveyed 403 design-thinking practitioners (most from larger, for-profit businesses). Their paper, titled Measuring the Impact of Design Thinking, affirmed that organizations continue to struggle in determining ROI. However, it also found that those most committed to the task recognized that design thinking can’t be measured as a single concept. (Köppen, Meinel, Rhinow, Schmiedgen, Spille, 2015)

The companies surveyed seem to acknowledge a Butterfly Effect from design thinking, and practitioners reported attempts to track it from a variety of perspectives:

* Customer Feedback: customer satisfaction, net promoter scores, response to specific campaigns, usability metrics, client feedback (an example net promoter score calculation is sketched after this list)

* Design Thinking Activities: number of projects, people trained, coaches trained

* Quick Results: concepts finished, projects launched, projects funded, projects in development

* Anecdotal Feedback: evaluation forms, qualitative feedback at each stage of the design thinking process, surveys

* Traditional KPIs: increased sales, ROI per project, and other financial measures

* Culture: team efficiency, engagement, collaboration, motivation
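Several of these customer-feedback measures come down to simple arithmetic a team can start tracking immediately. As one illustration, a net promoter score is the percentage of promoters (ratings of 9–10) minus the percentage of detractors (ratings of 0–6). The minimal Python sketch below assumes you already collect 0–10 survey ratings in a plain list; the function name and sample data are hypothetical.

```python
def net_promoter_score(ratings):
    """Net promoter score: % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    if not ratings:
        raise ValueError("no survey ratings collected yet")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical quarterly survey responses
print(net_promoter_score([10, 9, 8, 7, 6, 10, 9, 3, 8, 9]))  # -> 30.0
```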


Linking ROI to Business Drivers for Design Thinking

In 2016, Bernard Roth and Adam Royalty, two central figures at the d.school, published a paper entitled Developing Design Thinking Metrics as a Driver of Creative Innovation. In it, they suggest moving beyond "execution-oriented" metrics to ones that track "creative behaviors" instead.

First, they identified three main drivers leading companies to pursue design thinking:

1. To better understand customers or end users
2. To protect market share from disruption and startups
3. To develop more innovative methods and team dynamics

Then they devised new metrics for each driver.


Measuring Empathy

A key tenet of design thinking is cultivating empathy with customers and users to discover unmet needs. The idea is that if we understand needs better, we can design better solutions and increase revenue or save money.

Roth and Royalty suggested the following metrics for measuring a project team’s empathy with customers/users:

* Track the number of days the team goes between observing or interviewing customers or users (with the goal of reducing time between interactions).

* Track the number of customers or user interactions over the life of a project.

* Track interactions back to user personas to measure the diversity of customer or user insight.
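As a rough illustration of how a project team might track these three measures, the sketch below assumes a simple log of customer or user interactions with a date and the persona each participant maps to; the data structure and field names are my own assumption, not something Roth and Royalty prescribe.

```python
from datetime import date

# Hypothetical interaction log: one entry per customer/user observation or interview.
interactions = [
    {"date": date(2016, 11, 2), "persona": "new homeowner"},
    {"date": date(2016, 11, 18), "persona": "property manager"},
    {"date": date(2016, 12, 9), "persona": "new homeowner"},
    {"date": date(2017, 1, 4), "persona": "first-time renter"},
]

dates = sorted(entry["date"] for entry in interactions)

# Metric 1: days between consecutive interactions (the goal is to shrink these gaps).
gaps_in_days = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]

# Metric 2: total interactions over the life of the project.
total_interactions = len(interactions)

# Metric 3: diversity of insight, measured as distinct personas reached.
personas_reached = {entry["persona"] for entry in interactions}

print(f"Gaps between interactions (days): {gaps_in_days}")
print(f"Total interactions: {total_interactions}")
print(f"Distinct personas reached: {len(personas_reached)}")
```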

Measuring Business Value

Another focus of design thinking is creating innovative products and services that add value to organizations.

Roth and Royalty suggested measuring the value and novelty of project outputs on a grid whose vertical axis runs from "Valuable" to "Not Valuable" and whose horizontal axis runs from "Novel" to "Not Novel".

The goal of the measurement is to understand if design thinking projects are perceived as valuable to the company and if they take the company in new directions. (The authors recommend that team members vote anonymously and that scores are averaged to determine grid placement.)
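One lightweight way to operationalize this grid, as I read the recommendation, is to collect anonymous 1–5 scores for value and novelty from each team member, average them, and use a midpoint threshold to place the project in a quadrant. The scale, threshold, and sample votes below are assumptions for illustration rather than details from the paper.

```python
def grid_placement(votes, midpoint=3.0):
    """Place a project on the value/novelty grid from anonymous (value, novelty) votes on a 1-5 scale."""
    avg_value = sum(value for value, _ in votes) / len(votes)
    avg_novelty = sum(novelty for _, novelty in votes) / len(votes)
    value_label = "Valuable" if avg_value >= midpoint else "Not Valuable"
    novelty_label = "Novel" if avg_novelty >= midpoint else "Not Novel"
    return avg_value, avg_novelty, f"{value_label} / {novelty_label}"

# Hypothetical anonymous votes from five team members: (value, novelty)
votes = [(4, 2), (5, 3), (3, 2), (4, 4), (4, 3)]
print(grid_placement(votes))  # -> (4.0, 2.8, 'Valuable / Not Novel')
```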

Measuring Innovation

According to Roth and Royalty, an earlier study (Dow & Klemmer, 2011) showed that more iteration leads to stronger prototypes, and stronger prototypes lead to better products. So they proposed two ways to measure how well a team iterates on an idea:

* Measure the number of prototype iterations per feature. Measuring per feature is important because it allows for comparison between projects, regardless of the size of the project or feature set.

* Measure the number of concurrent prototypes. Another study (Dow et al., 2010) suggests that developing prototypes in parallel (rather than in series) results in stronger outcomes. (Both measures are sketched below.)
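A minimal sketch of both measures follows, assuming the team keeps a simple record of each prototype iteration, the parallel track it belongs to, and the features it exercised; the record format is hypothetical.

```python
from collections import defaultdict

# Hypothetical prototype log: one entry per prototype iteration.
prototype_log = [
    {"track": "A", "version": 1, "features": ["search", "checkout"]},
    {"track": "A", "version": 2, "features": ["search"]},
    {"track": "B", "version": 1, "features": ["checkout", "profile"]},
    {"track": "B", "version": 2, "features": ["checkout"]},
]

# Iterations per feature: normalizing by feature makes projects of different sizes comparable.
iterations_per_feature = defaultdict(int)
for entry in prototype_log:
    for feature in entry["features"]:
        iterations_per_feature[feature] += 1

# Concurrent prototypes: how many distinct prototype tracks were explored in parallel.
concurrent_prototypes = {entry["track"] for entry in prototype_log}

print(dict(iterations_per_feature))  # {'search': 2, 'checkout': 3, 'profile': 1}
print(len(concurrent_prototypes))    # 2
```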

Getting Beyond Cost Savings to Show Broader Benefits

One of Rapid Reasoning’s clients, a large Silicon Valley tech company, began using human-centered design thinking two years ago.

Initially, they focused on traditional measurements of the cost savings generated by the more seamless digital content, systems, and infrastructure we helped them build for their engagement initiative. These savings weren't directly attributable to the design thinking work alone; however, the user-centered processes we established clearly contributed to them.

Once they could show the cost savings, the company was able to focus on how design thinking affected other aspects of their work. Today, they focus less on reporting cost savings and more on team efficiency and satisfaction. In the long term, these metrics are probably better indicators of how well the company serves its customers. (This is especially pertinent as the company faces the prospect of a smaller workforce as a large wave of baby boomers retires.)

Evolving Metrics that Work for Your Team

Software company Intuit is another example of an organization that’s embedded design thinking deeply throughout its culture and operations.

The 2015 study mentioned above (Köppen, Meinel, Rhinow, Schmiedgen, Spille, 2015) describes how the company evolved a story-based approach to evaluating design thinking's effectiveness. Intuit began with some of the metrics mentioned above but found that, as design thinking regularly reframed challenges, the metrics also needed to be reframed. As a result, the company now crafts narratives that weave qualitative and quantitative information into a broader evaluation of the business benefit.

Measuring the impact of design thinking is multi-faceted; however, it's necessary if the methods we practitioners see working day in and day out are to gain traction. The takeaway here is to start with an array of metrics that you know you can begin tracking immediately. Then tweak your measurement systems as your projects and processes evolve, always with the goal of proving long-term organizational value.

Want to learn more about how you can start measuring the impact of human-centered design? Or want to share what’s working in your company? Contact us at rapidreasoning.com

Sources:

Köppen E, Meinel C, Rhinow H, Schmiedgen J, Spille L (2015) Measuring the impact of design thinking. In: Plattner H, Meinel C, Leifer L (eds) Design thinking research. Springer, Switzerland, pp 157–170

Roth B, Royalty A (2016) Developing design thinking metrics as a driver of creative innovation. In: Plattner H, Meinel C, Leifer L (eds) Design thinking research. Springer, Switzerland, pp 171–183

Dow SP, Klemmer SR (2011) The efficacy of prototyping under time constraints. In: Design thinking. Springer, Heidelberg, pp 111–128

Dow SP, Glassco A, Kass J, Schwarz M, Schwartz DL, Klemmer SR (2010) Parallel prototyping leads to better design results, more divergence, and increased self-efficacy. ACM Trans Comput Hum Interact 17(4):18
