May 17th, 2013
If you have ever been called to participate in a phone survey, you probably know the routine: you hear a question and jump in with an answer. A good interviewer will remind you that she needs to read the entire question and all of the answer options, just to be sure you have chosen the best one.
We can’t do this with self-administered online surveys, but there is a way to minimize the error associated with respondents jumping to conclusions: put all clarifying instructions before the question, not after. A recent study published in Public Opinion Quarterly documented that when instructions come before the question, respondents spend more time on it because they actually read the instructions and answer more carefully. Not surprisingly, their answers are more accurate.
Here is an example of a typical question that has clarifying instructions after the question: Read the rest of this entry »
May 9th, 2013
I rarely miss the world of academia after having jumped ship twelve years ago, but one thing I always appreciated was that teaching involved distilling ideas down to their simplest and most essential form. Students often led the way for me, asking questions that forced me to clarify, deepen, and condense. So it was again in Professor Alan Malter’s market research class for MBA students last month at the University of Illinois in Chicago.
My annual task is to offer a business-side view of market research. But this year I asked students to imagine that you, dear readers—Versta Research’s valued clients—were standing in front of them instead of me. What would they want to ask you about your work and about client-side research?
Read the rest of this entry »
May 1st, 2013
It’s our job to deliver bad news as well as good news, right? To tell clients what they’re doing wrong so they can fix their problems and leap to the next level of profitability, right? Why would they spend money collecting data if they just wanted to hear how much customers love them? In fact, why would they want to hear how much customers love them, if the research says otherwise?
A recent study published in the Journal of Consumer Research suggests some answers that may surprise you. Read the rest of this entry »
April 26th, 2013
It is hard to find an appropriate use for Google Surveys, because, as we outlined in a review article last fall, its capabilities are limited. But last week we needed a quick incidence test of how many U.S. adults own a certain type of investment product, and Google Surveys seemed perfect. It was not fast, by the way: it took five days to collect data from 200 respondents. Google says this is because we asked a screening question before asking about product ownership. Even so, this survey took longer than a standard omnibus.
But what struck me most about my trial run with Google Surveys was the Creepy Factor. It made me realize in a most uncomfortable way that Google tracks everything I do. I knew this already, and I follow ongoing discussions about online privacy. I have a personal Gmail account, a G+ page, and I use Google as the starting point for almost everything I do on the Internet. But the tracking was never so creepy and apparent until I fielded a Google survey. How was it creepy? Read the rest of this entry »
April 19th, 2013
We’re all for ditching research when you have strong intuitions about your business and have no pressing questions that need to be answered. But too many business leaders are now following in the footsteps of Steve Jobs who was notorious for believing that consumers can’t tell you what they want. Aggressively ignoring market research has become a strange point of pride for some.
Did Ron Johnson, formerly of Apple and now formerly of J.C. Penney, do any research before running his new company into the ground? It’s doubtful. Even a super simple research effort like watching people enter stores from parking lots (“Look, they’re all clutching coupons!”) might have given him some insight into how his new strategy would fare. In fact, he didn’t even need to do new research. He could have asked his managers to review research published over the last twenty years to help him understand how consumers respond to various pricing strategies.
Here is an excellent resource that should have been his starting point: Read the rest of this entry »
April 11th, 2013
This is my dog Lancelot on his first day at his new home.
If you have ever done “clicker training” with a dog, you know how amazingly effective positive rewards are in training, versus the old-fashioned method of “correction” and negative feedback. Identify and reward the behavior you want, and you can teach an old dog new tricks within hours. It works for people too, which is why so many HR and business experts talk about the power of praise in teaching and motivating employees. I can relate to this. I love my work most when our clients offer generous praise and tell us that we exceeded their expectations.
But I also worry: Is the work really good? Is it as good as it can be? Is it finding its way up to other managers and decision makers, and is it helping them, too? Maybe this reflects professional insecurity, but new research published in the Journal of Consumer Research shows it is common as people gain expertise in their fields. Quoting the study published a few months ago: Read the rest of this entry »
April 4th, 2013
Some business managers and marketing executives mistakenly believe that “big data” will deliver better insight because of the sheer volume of data now at our disposal. Now we just need the statisticians, the computing power, and the analytics software to sift through it all, right? Not so. The truth is, for most purposes you don’t need a lot of data. You need a small random sample of data. Read the rest of this entry »
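A minimal simulation makes the point concrete. The snippet below is an illustration with made-up data, not anything from the study: it builds a hypothetical “big data” population of a million satisfaction scores, then shows that a simple random sample of just 400 respondents estimates the population average almost exactly.

```python
import random

random.seed(42)

# Hypothetical "big data": one million customer satisfaction scores (1-10).
population = [random.randint(1, 10) for _ in range(1_000_000)]
population_mean = sum(population) / len(population)

# A small simple random sample of 400 respondents.
sample = random.sample(population, 400)
sample_mean = sum(sample) / len(sample)

# With n = 400 and a standard deviation near 2.9 on this scale, the
# margin of error for the mean is roughly +/-0.3 at 95% confidence,
# so the small sample lands very close to the full-population answer.
print(f"population mean: {population_mean:.2f}")
print(f"sample mean (n=400): {sample_mean:.2f}")
```

The design choice that matters here is randomness, not volume: doubling the population to two million scores would not change the sample’s accuracy at all, while a biased (non-random) sample of any size could miss badly.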
March 28th, 2013
Or better yet, we might say “Why GOOD Research Takes So Long.” Our answer (before you feel inspired by the video embedded below) is that good research is creative and thoughtful and takes a great deal of intellectual energy at several crucial points:
- It puzzles over multiple ways (including data sources and methods) to get novel answers
- It thinks “behind” what managers are asking to figure out better questions that have unknown answers
- It experiments with new techniques of data collection that might not be standardized
- It toys with data and investigates multiple approaches for analysis
- It turns data into stories that get revised and refined so that managers see the answers in ways that matter
Read the rest of this entry »
March 20th, 2013
This week we published Versta Research’s quarterly newsletter with a feature article entitled “How to Design an Excellent Chart.” It addresses one critical piece of turning research data into compelling stories by focusing on the process of data visualization. For us, that process involves five steps:
Read the rest of this entry »