Tuesday, November 04, 2008

Comparing two customer research approaches

I had two remarkable experiences today.

First, I interviewed a marketing manager about some software he uses. He spent thirty-five minutes describing why the company chose the software, how he used it, how he learned the features over time and thereby developed proficiency in an area of marketing he hadn't known well before, how the supplier had given him very responsive support, how the users' group had helped him... and, by the way, three or four features that, if they existed, could really help him. I recorded everything and will review this and a number of other interviews with the client using narrative sensemaking approaches. In the end, they'll get a deep, detailed picture of how they're viewed by their customers. They'll know which features customers will value. And they'll know some things that bother their clients.

Later in the day, I got a survey to fill out. It looked like this:

Rate each question on a scale of 1-5, with 1 being Poor and 5 being Excellent.

* Trainer communicated in a clear, concise, and easily understood manner.

* Demonstrated that he is knowledgeable in [...].

* Displays pride, enthusiasm, and a positive attitude in his work.

* Demonstrates a professional attitude and supports the [client].

* Practice topics are clear and correct for [skill and experience].

* Trainers were timely and approachable with problems and concerns.

It's unfair, I know, to compare the two approaches. The first is more expensive and time-consuming, and there is more at stake for the software company than for the second group, a nonprofit.

But, really, what can one possibly learn from the second approach? Isn't the interview method better in about 1,000 ways?



NWGuy said...

It's unfair to compare the two approaches because they have different objectives. The first one is trying to perform a user study, though the interview is more opinion-based than asking the person to show what they do on a regular basis. Understanding a persona brings more reality into the equation than asking people what they want (Tuned In is a good reference for this).

The second approach was not evaluating a product but a person's performance. The questions could be phrased better, but "did the trainer communicate well?" is very different from "does the product help you?"

John Caddell said...

Hi, NWGuy,

Hmmm... I didn't find the contexts so radically different. The survey could just as easily have been from the place where I get my car serviced, or the last hotel I stayed at. In those and countless other situations, companies use surveys to try to hear the voice of the customer. And my belief is that you simply don't hear the voice of the customer with surveys.

Numerical ratings are limiting, and the questions often don't address salient issues that I have an opinion on, issues the surveyor should want to hear about.

In two recent instances, a company followed up on a survey with a phone call to learn my opinions in more detail. I'd like to see more of that.