June 13th, 2013
Even if you don’t care about political polling, or the fact that Gallup consistently overestimates support for Republican candidates, it is worth paying attention to how Gallup is trying to fix its problems with surveys and polling.
They are not happy with how poorly their polls have fared (who would be?), and they have teams of smart people trying to figure out what is wrong. Given their high profile, they are making the process and findings of their investigations public, and we have much to learn from that.
Last week they released their findings from an extensive review, which involved outside experts as well as internal ones. It is fascinating to read, because they identify 13 suspects in their survey process that every company doing survey research should always be thinking about:
June 6th, 2013
The Excel coding error, courtesy of The Roosevelt Institute
Or maybe we should say: why vendors SHOULD do everything twice BEFORE their work hits your desk, rather than after you send it back because you found errors. When it comes to something as complex and exacting as market research or public opinion polling, there are almost certainly mistakes the first time around. If a company does not have processes to validate data and deliverables, those mistakes, unfortunately, end up with you.
Recent errors in an economics paper that laid the foundation for Europe’s austerity programs provide a dramatic and painful example, as outlined by Paul Krugman of the New York Times:
May 30th, 2013
As market researchers pay more and more attention to the need for compelling data visualizations, Hans Rosling’s work with interactive data is becoming a catalyst for a fascinating new type of data visualization: animated statistical graphs. He uses them to show world social and economic trends over time, and the effect is exceptionally powerful. Here is a video of his TED talk, well worth watching, in which the first eleven minutes focus on interactive graphs:
Others are following his lead, though not always to the same superb effect. Here is another animated statistical graph, this one documenting drone strikes and fatalities in Pakistan:
May 17th, 2013
If you have ever been called to participate in a phone survey, you probably know the routine where you hear a question and then jump in with an answer. A good interviewer will remind you that she needs to read the entire question and all the answer options just to be sure that you offered the best response option.
We can’t do this with self-administered online surveys, but there is a way to minimize error associated with respondents jumping to conclusions: Put all clarifying instructions before asking the question, not after. A recent study published in Public Opinion Quarterly documented that if you put instructions before the question, respondents spend more time answering because they are reading the instructions and answering more carefully. Not surprisingly, their answers are more accurate.
Here is an example of a typical question that has clarifying instructions after the question:
May 9th, 2013
I rarely miss the world of academia after having jumped ship twelve years ago, but one thing I always appreciated was that teaching involved distilling ideas down to their simplest and most essential form. Students often led the way for me, asking questions that forced me to clarify, deepen, and condense. So it was again in Professor Alan Malter’s market research class for MBA students last month at the University of Illinois at Chicago.
My annual task is to offer a business-side view of market research. But this year I asked students to imagine that you, dear readers—Versta Research’s valued clients—were standing in front of them instead of me. What would they want to ask you about your work and about client-side research?
May 1st, 2013
It’s our job to deliver bad news as well as good news, right? To tell clients what they’re doing wrong so they can fix their problems and leap to the next level of profitability, right? Why would they spend money collecting data if they just wanted to hear how much customers love them? In fact, why would they want to hear how much customers love them, if the research says otherwise?
A recent study published in the Journal of Consumer Research suggests some answers that may surprise you.
April 26th, 2013
It is hard to find an appropriate use for Google Surveys, because, as we outlined in a review article last fall, its capabilities are limited. But last week we needed a quick incidence test of how many U.S. adults own a certain type of investment product. Google Surveys seemed perfect. It was not fast, by the way. It took five days to collect data from 200 respondents. Google says this is because we asked a screening question before asking about product ownership. Even so, this survey took longer than a standard omnibus.
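An incidence test like the one described above is simply a proportion estimated from a sample, and its precision depends on the sample size. A minimal sketch of the arithmetic, assuming hypothetical counts (the post does not report how many of the 200 respondents owned the product; 30 is an invented illustration):

```python
import math

def incidence_estimate(owners, respondents, z=1.96):
    """Point estimate and normal-approximation margin of error
    for an incidence (a proportion) from a simple random sample.
    z=1.96 corresponds to a 95% confidence level."""
    p = owners / respondents
    moe = z * math.sqrt(p * (1 - p) / respondents)
    return p, moe

# Hypothetical: 30 of 200 respondents own the product.
p, moe = incidence_estimate(30, 200)
print(f"Incidence: {p:.1%} \u00b1 {moe:.1%}")  # prints "Incidence: 15.0% ± 4.9%"
```

With only 200 respondents, even a mid-range incidence carries a margin of error of several points, which is usually acceptable for a quick feasibility check but not for precise sizing.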
But what struck me most about my trial run with Google Surveys was the Creepy Factor. It made me realize in a most uncomfortable way that Google tracks everything I do. I knew this already, and I follow ongoing discussions about online privacy. I have a personal Gmail account, a G+ page, and I use Google as the starting point for almost everything I do on the Internet. But it was never so creepy and apparent until I fielded a Google survey. How was it creepy?
April 19th, 2013
We’re all for ditching research when you have strong intuitions about your business and have no pressing questions that need to be answered. But too many business leaders are now following in the footsteps of Steve Jobs, who was notorious for believing that consumers can’t tell you what they want. Aggressively ignoring market research has become a strange point of pride for some.
Did Ron Johnson, formerly of Apple and now formerly of J.C. Penney, do any research before running his new company into the ground? It’s doubtful. Even a super simple research effort like watching people enter stores from parking lots (“Look, they’re all clutching coupons!”) might have given him some insight about how his new strategy would fare. In fact he didn’t even have to do research. He could have asked his managers to review research published over the last twenty years to help him understand how consumers respond to various pricing strategies.
Here is an excellent resource that should have been his starting point:
April 11th, 2013
This is my dog Lancelot on his first day at his new home
If you have ever done “clicker training” with a dog, you know how amazingly effective positive rewards are in training, versus the old-fashioned method of “correction” and negative feedback. Identify and reward the behavior you want, and you can teach an old dog new tricks within hours. It works for people too, which is why so many HR and business experts talk about the power of praise in teaching and motivating employees. I can relate to this. I love my work most when our clients offer generous praise and tell us that we exceeded their expectations.
But I also worry: Is the work really good? Is it as good as it can be? Is it finding its way up to other managers and decision makers, and is it helping them, too? Maybe this reflects professional insecurity, but new research published in the Journal of Consumer Research shows that such worry is common as people gain expertise in their fields. Quoting the study published a few months ago: