I’ve been fascinated by the Jobs to Be Done approach to product design for a while now, and something that I hear quite regularly as I talk to other folks about JTBD is:

“I think I get it. But how exactly do I apply it?”

Tony Ulwick’s book, “What Customers Want,” provides some answers to that question. That book is an absolute gold mine of actionable advice and insight for product designers and entrepreneurs. If you haven’t read it, get it now; you won’t regret it.

In this post, I’ll show you step-by-step how I’ve used the approach described in that book to help my clients get a much better handle on what to build, what to improve, what to leave alone, and what to ignore.

If you’ve ever found yourself in a heated debate with your team, your boss, your co-workers, or your own head and thought, “There’s got to be a better way!”, this post is for you.

Interested in Learning Jobs to Be Done? Subscribe to my free JTBD Email Course.

Background

One of my clients brought me in to do some customer research and define the scope for v1 of a new product they’re designing. So where do you start, right? There are a hundred things the app could do … but what should it do first? Sound familiar?

I’ll go into the details for each part of the process I used to identify their most pressing customer needs, but for the skimmers, here’s the gist:

  • Run some customer interviews to gain a better understanding of the customers’ Job to Be Done.
  • Split the Job to Be Done into steps.
  • Survey the importance and satisfaction of each step of the Job to Be Done.
  • Visualize the survey results with a little Rails app I’ve built for this type of survey.

Measuring What Customers Want

The visualization of the survey data is what I’m most excited to share, but to get there, we have to begin with the Job to Be Done.

For example, let’s say that a Job to Be Done for your customer is:

“Share beautiful pictures”

and through your interviews, you identify a small piece of the Job that is:

“Crop the photo.”

Well, how do we know if “Crop the photo” is something we should even care about?

The answer is to ask users a pair of Importance/Satisfaction questions about it: how important is this to you, and how satisfied are you with the way you currently get it done?

Think for a second about what the answers to those two questions would tell you.

If something is not important and they’re satisfied with how they’re currently getting it done, there’s not a lot of growth opportunity in that particular feature. Don’t waste your time.

But what about something that is extremely important to a majority of your customers and they’re very unsatisfied with the current solution? Ding ding ding!

Think about how much easier grooming your backlog would be with this information. All those hours spent guessing about what should or shouldn’t be part of your MVP would at least be reduced.

Because:

  • Steps that are unimportant can be pushed down the road or scrapped.
  • Steps that are important and currently satisfied are must-haves that don’t need to be re-imagined.
  • Steps that are important and unsatisfied … well now we’re talking! These are opportunities to delight existing customers, reduce support requests, and gain new customers.
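
If it helps to see that triage spelled out, here’s a tiny plain-Ruby sketch. The outcome names, scores, and the cutoff of 6 are made-up illustrations, not part of the method; pick thresholds that make sense for your own data.

    # Rough triage of outcome statements by their chart coordinates (0-10 scale).
    Outcome = Struct.new(:statement, :importance, :satisfaction)

    def triage(outcome)
      # The cutoff of 6 is arbitrary -- an illustration only.
      if outcome.importance < 6
        "Low importance: push it down the road or scrap it"
      elsif outcome.satisfaction >= 6
        "Must-have: don't re-imagine it, don't mess it up"
      else
        "Opportunity: important and unsatisfied -- invest here"
      end
    end

    outcomes = [
      Outcome.new("Crop the photo",                 8.2, 3.1),
      Outcome.new("Choose a filter",                4.0, 7.5),
      Outcome.new("Select a photo from my library", 7.9, 8.8)
    ]

    outcomes.each { |o| puts "#{o.statement}: #{triage(o)}" }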

Visualizing the Responses

I used Typeform to administer the survey and built a Rails app that connects to Typeform’s API to collect the response data.

Once the survey had been sent out and customers responded, my app spit out the following chart:

The colors here represent a feature category for this particular client. You don’t have to include this layer of visualization in your implementation, but check out the yellow feature: three of its four outcomes were rated extremely important and very unsatisfying. What does that tell you?

From our “Share beautiful pictures” example at the beginning of the post, one dot would be “Crop the photo”. Others might be “Take a photo”, “Choose a filter”, “Select a photo from my library”, and so on.

What Do the Quadrants Tell Us?

Low Importance: Don’t care.

High Importance and Highly Satisfied: Must-haves. Don’t mess it up.

Important and currently unsatisfied: Opportunity!

That’s a very simplified explanation of the chart; there’s more granularity in there than just three categories. If you’re familiar with this methodology, you already know that; if you’re not, let’s proceed one step at a time and build as we go. 🙂

Prioritization Just Got Easier

Imagine that, prior to conducting this type of survey, you had those 29 ideas from the chart scribbled on your whiteboard. “What should we build first?” you wonder aloud to yourself or your team.

Now take a look at that chart and ask yourself the same question:

“What should we build first?”

A little bit easier, right?

So how do you get there? Let’s dig in.


The Process

  1. Interviews
  2. Surveys
  3. Analysis

1. Interviews

First, we conducted sixteen customer interviews that ran about 45 minutes each. We walked away with a pile of quotes, thoughts, validations, and invalidations that we would carry with us into our analysis of the survey we’d be sending during the second phase of the research.

The goal during these interviews is to understand the customer’s desired outcomes when they’re using your product, so that you can ask an Importance/Satisfaction pair of questions about each outcome, plot each outcome’s position on the chart we saw above, and greatly reduce your prioritization woes.

During your customer interviews, many of the desired outcomes and steps your customers share will be completely expected.

But you’re also very likely to be shocked along the way, and this is one of the most valuable things about doing these interviews.

If you’ve got an existing product, I can almost guarantee that your customers are using it in ways you didn’t expect or design for. When you stumble on these instances, don’t let them slip away! Figure out why they’re “misusing” your product and hacking their way to a better experience. There’s probably something valuable in there!

How to Record Your Customers’ Desired Outcomes

When your interviewee describes a step in their workflow, your goal is to understand what progress looks like when they’re done with that step.

The folks at Strategyn call this a “Desired Outcome Statement” and they’ve got an extremely robust sentence structure for framing this progress.

Your customer wants to:

Minimize or Increase,

the Time, Cost, or Likelihood,

of The Thing.

Strategyn calls “the thing” the “object of control”, but I don’t know, that feels a little buttoned up for me.

So, if a customer using your photo sharing app to “Share beautiful pictures” tells you that the next thing they’d do after taking a photo is to “crop the picture”, don’t say:

“Is cropping important to you?”

or

“If we removed cropping, how would you feel?”

While answers to those questions would give you some information, what we’re after is information that’s both new and specific.

Dig in and try to find the outcome they’re after when they’re cropping the photo. That’s how you’ll discover if there’s an opportunity to improve the workflow and build a better product!

Ask, “Why do you usually crop your photos?”, let them respond … “Well, usually I’m just trying to make the composition look better” … and then reframe their response using the structure above:

“Ah. So, you’re cropping the photo to increase the likelihood that your photo’s composition is visually appealing? Is that right?”

And then Bingo. You’ve got the makings of a paired question for your survey.

  • “How important is it to increase the likelihood that your composition is visually appealing?”
  • “How satisfied are you with the visual appeal of your compositions?”

If the responses come back as very important and very unsatisfied, now you’ve got a chance to brainstorm some interesting options with far more impact than, “Do we need a crop tool?”

Here’s a handy little PDF you can download that I like to have with me on these calls. It makes it a lot easier to keep a record of all the outcomes you’ll collect on your calls.

2. Surveys

You’re going to come out of your customer interview process with a ton of quotes, stories, and ideas; and equally important, you’ll have built up a list of outcome statements that will make up your survey.

Build Your Survey

Typeform is far and away my favorite survey tool on the market. The UX for participants is unparalleled and their API for retrieving survey responses is stellar.

Create each pair of Importance/Satisfaction questions using the settings seen below:

  • Question type: Opinion Scale
  • Start scale at 1: true
  • Steps: 5
  • Show labels: true
  • Left label: Not important
  • Right label: Very important

For the Satisfaction question in each pair, use the same settings but swap the labels to “Not satisfied” and “Very satisfied”.

Devs and Technical Founders: You can save a ton of time by using Typeform’s API instead of creating every single question by hand. Create one pair of questions like you see above, then retrieve the form, duplicate and edit the questions in your favorite code editor, and update the form with the new questions.
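
If you’d rather script that step than click through the editor, here’s a rough plain-Ruby sketch of the idea. It assumes Typeform’s Create API (GET and PUT on https://api.typeform.com/forms/{form_id}) and a personal access token in the TYPEFORM_TOKEN environment variable; the opinion-scale field schema shown here is my best reading of their docs, so double-check it before running.

    require "net/http"
    require "json"
    require "uri"

    TOKEN    = ENV.fetch("TYPEFORM_TOKEN")    # personal access token
    FORM_ID  = ENV.fetch("TYPEFORM_FORM_ID")  # the form you created by hand
    FORM_URI = URI("https://api.typeform.com/forms/#{FORM_ID}")

    def typeform(request)
      request["Authorization"] = "Bearer #{TOKEN}"
      request["Content-Type"]  = "application/json"
      Net::HTTP.start(FORM_URI.host, FORM_URI.port, use_ssl: true) { |http| http.request(request) }
    end

    # One outcome statement per Importance/Satisfaction pair.
    outcomes = [
      "increase the likelihood that your composition is visually appealing",
      "minimize the time it takes to select a photo from your library"
    ]

    # Fetch the current form definition.
    form = JSON.parse(typeform(Net::HTTP::Get.new(FORM_URI)).body)

    # Build the paired opinion-scale questions. The "properties" keys below are
    # assumptions about the field schema -- verify against the Create API docs.
    new_fields = outcomes.flat_map do |outcome|
      [
        ["How important is it to #{outcome}?",                     "Not important", "Very important"],
        ["How satisfied are you with your ability to #{outcome}?", "Not satisfied", "Very satisfied"]
      ].map do |title, left, right|
        {
          "title"      => title,
          "type"       => "opinion_scale",
          "properties" => {
            "steps"        => 5,
            "start_at_one" => true,
            "labels"       => { "left" => left, "right" => right }
          }
        }
      end
    end

    form["fields"] = (form["fields"] || []) + new_fields

    # Push the updated definition back. Typeform may reject read-only attributes
    # (e.g. "id", "_links") on update; strip them from the payload if it complains.
    update = Net::HTTP::Put.new(FORM_URI)
    update.body = JSON.generate(form)
    puts typeform(update).code

Run it once and the form fills itself in with a pair of questions per outcome, instead of you re-typing near-identical questions by hand.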

Ship that Puppy!

Have a couple friendlies fill out the survey after you’ve got all your questions loaded in. You want to be sure that the results are being saved and having some extra sets of eyes to proofread your work never hurts.

Once you’ve confirmed that it’s good to go, get it in front of your customers or prospects and start collecting results.

For my client’s project, we sent the survey to everyone we’d interviewed as well as several dozen additional customers to ensure we had a reliable set of results.

3. Analysis

After you’ve begun collecting responses to your survey, you’re ready to begin analyzing the data.

Regardless of the tool you use to do this piece of the work (Excel, web app, R), you determine the X and Y coordinates of each outcome statement using this calculation:

  • X-axis: the fraction of respondents who rated this outcome a 4 or 5 for Importance, multiplied by 10
  • Y-axis: the fraction of respondents who rated this outcome a 4 or 5 for Satisfaction, multiplied by 10

For example:

In the image above, we’ve got seven imaginary responses to the question about making the compositions more visually appealing in our “Share beautiful pictures” app.

3 out of 7 people responded with a 4 or 5 regarding its importance, which is roughly 0.43. We multiply that by 10 to get a coordinate on the chart’s 0-10 scale: 4.3.

Perform the same calculation for satisfaction and we arrive at 7.1.

This would mean that we’d plot this particular outcome statement right about here:

Repeat this calculation for all of your outcome statements and you’ll be much more equipped to start making some decisions around roadmap prioritization.
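
If you’d rather script the math than do it by hand, the whole calculation is a few lines of Ruby. The ratings below are made-up responses chosen to reproduce the 4.3 and 7.1 from the example above.

    # Turn a list of 1-5 ratings into a 0-10 chart coordinate:
    # the share of respondents who answered 4 or 5, multiplied by 10.
    def coordinate(ratings)
      (ratings.count { |r| r >= 4 }.to_f / ratings.size * 10).round(1)
    end

    importance   = [5, 4, 2, 3, 4, 1, 3]  # 3 of 7 rated it a 4 or 5
    satisfaction = [5, 4, 5, 4, 2, 5, 3]  # 5 of 7 rated it a 4 or 5

    puts coordinate(importance)    # => 4.3
    puts coordinate(satisfaction)  # => 7.1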

A Few More Thoughts on Creating the Chart

If you’re a technical founder or if you’re working with a development team, feel free to get in touch with me and I’ll share some more details about my implementation.
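
To give you a flavor of what that implementation does, here’s a stripped-down, plain-Ruby sketch of pulling responses from Typeform’s Responses API (GET https://api.typeform.com/forms/{form_id}/responses). The answer-parsing details reflect my reading of the payload shape, so check the API docs before leaning on it.

    require "net/http"
    require "json"
    require "uri"

    TOKEN   = ENV.fetch("TYPEFORM_TOKEN")
    FORM_ID = ENV.fetch("TYPEFORM_FORM_ID")

    # Pull the latest responses straight from Typeform.
    uri = URI("https://api.typeform.com/forms/#{FORM_ID}/responses?page_size=1000")
    request = Net::HTTP::Get.new(uri)
    request["Authorization"] = "Bearer #{TOKEN}"
    response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
    data = JSON.parse(response.body)

    # Group the 1-5 answers by question (keyed on the field ref), ready to be
    # fed into the coordinate calculation above.
    ratings = Hash.new { |h, k| h[k] = [] }
    data.fetch("items", []).each do |item|
      item.fetch("answers", []).each do |answer|
        next unless answer["type"] == "number"   # opinion-scale answers come back as numbers
        ratings[answer.dig("field", "ref")] << answer["number"]
      end
    end

    ratings.each { |ref, values| puts "#{ref}: #{values.inspect}" }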

Typeform also allows you to export your survey data as a CSV, so if you’re an Excel wiz or have one on your team, this could be a great route for you to pursue.

The benefit of using the API vs. exporting the CSV is that an application calling the API will always remain up to date, whereas the spreadsheet will only be updated each time you download a new CSV.

I am not great with Excel, so there may be a better way to do this, but the formula I used to create the example table above is:

=((COUNTIF(range,">3"))/ROWS(range))*10

Obviously, replace "range" with the actual range of your results (e.g. A3:A8). The formula counts the responses greater than three, divides that count by the total number of rows in the range, and multiplies by 10.

Conclusion

As you can imagine, with all of the feedback we’d collected from the customer interviews combined with the analysis from the surveys, my client was able to make much more informed decisions about where to spend their time, money, and energy.

This was quite a lot to take in, I know. If you’ve got any questions at all, please reach out and I’ll be happy to help in any way I can.

