March 15th, 2017
If you are a market research professional, I urge you to look for continuing education beyond the MR industry groups such as the Insights Association, Quirk’s, etc. and instead look to a more academic organization like AAPOR, the American Association for Public Opinion Research.
Yes, their conferences will be the most boring you ever attend. (Ever been to an academic conference?) And their positions on issues facing the industry are painfully slow to evolve. (It took the organization years to accept the idea that online polling can work.)
But there is substance and depth to everything that AAPOR does. No hot trends or fading fads. Just enduring, substantive issues that are the core of what we do in market research.
March 8th, 2017
At first I thought this was the dumbest survey question ever.
Why ask people what device they are using to take the survey, when you know from the platform meta-data what device they are using? Yes, even lame self-service survey platforms can (and should) capture information about device, operating system, IP address, start time, exit time, and a host of other data that most people ignore (but should not!) because it does not seem relevant to their survey.
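To make the point concrete, here is a minimal sketch of the kind of paradata a platform can log for free alongside each response. The heuristic and the record fields are illustrative assumptions, not any particular platform's API:

```python
def classify_device(user_agent: str) -> str:
    """Classify a respondent's device from the HTTP User-Agent string.

    A deliberately crude heuristic; real platforms use richer signals,
    but even this much is available without asking the respondent.
    """
    ua = user_agent.lower()
    if "ipad" in ua or "tablet" in ua:
        return "tablet"
    if "iphone" in ua or "mobile" in ua or "android" in ua:
        return "mobile"
    return "desktop"

# Hypothetical paradata record logged alongside one survey response
record = {
    "user_agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 10_2 like Mac OS X)",
    "ip_address": "203.0.113.7",
    "start_time": "2017-03-08T14:02:11",
    "exit_time": "2017-03-08T14:09:45",
}
record["device"] = classify_device(record["user_agent"])
```

With the device stored in the data, there is no need to burn a survey question asking for it.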
Then I realized (well, hoped) that our friends at Amtrak were actually geniuses.
March 1st, 2017
It amazes me that so many marketing research surveys are not responsive and optimized for mobile devices. I understand why they are not optimized, as decent mobile surveys are hard to design, and no, all the junk platforms that tout mobile optimization are not good options. But if you’re hiring a research or consulting firm, shouldn’t they be at the forefront of making your survey easy and accessible for the people you rely on (your respondents!) to supply data?
The answer is yes. And the other thing they ought to be doing is helping you monitor academic research-on-research to keep abreast of how mobile capabilities affect surveys, and therefore how they affect your data and research findings.
Here are some of the latest nuggets of insight we came across in a recent issue of Survey Practice:
February 22nd, 2017
It was so weird, right after the election, to see a media frenzy heaping praise and attention on the one poll that got it wrong. The USC/LA Times poll predicted a Trump win, with 47% of voters choosing Trump and 44% choosing Clinton. Right after the election, media headlines trumpeted its success:
“How One Pollster Saw Trump’s Win Coming”
“The USC/LA Times Poll Saw What Other Surveys Missed”
“What Can Be Learned From Only National Poll That Consistently Showed Donald Trump Leading”
And just last week, Donald Trump praised the poll for doing a great job and presumably getting it right.
The problem is that the USC/LA Times poll is one of the few polls that got it wrong.
February 15th, 2017
It bugs me that companies would use customer feedback surveys not for information or learning, but to manipulate customers into buying from them again. On the other hand, how many companies inundate us with those zillions of customer satisfaction surveys because they truly care about learning? What they really care about are their “metrics,” their dashboards, and their bonuses.
So here goes: I’ll tell you one way to manipulate your customers into liking you more (and I’ll resist telling you, “Don’t do this, because it is kind of obnoxious.”)
February 8th, 2017
It all depends on what counts as a “question,” and also on how complex those questions are. A grid question with multiple rows takes longer to answer than a simple yes-or-no question. A question that allows you to select multiple items from a laundry list of options takes longer than a standard 4-point scale. Ranking, in order, the top three to five items from that laundry list will take even longer.
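The arithmetic behind that reasoning can be sketched with per-item timings. The seconds below are illustrative assumptions for the sake of the example, not measured benchmarks:

```python
# Assumed average seconds per question type (illustrative only)
SECONDS_PER_TYPE = {
    "yes_no": 5,
    "scale_4pt": 7,
    "multi_select": 15,   # pick-any from a laundry list
    "grid_row": 6,        # per row of a grid question
    "rank_top_items": 25, # rank top 3-5 from a long list
}

def estimate_minutes(questions):
    """Estimate survey length from a list of (question_type, count) pairs."""
    total_seconds = sum(SECONDS_PER_TYPE[qtype] * count
                        for qtype, count in questions)
    return round(total_seconds / 60, 1)

survey = [("yes_no", 10), ("scale_4pt", 8), ("grid_row", 12),
          ("multi_select", 2), ("rank_top_items", 1)]
print(estimate_minutes(survey))  # about 3.9 minutes
```

The point of the sketch is that a “20-question” survey can range from a few minutes to well over fifteen, depending entirely on the mix of question types.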
Not that anyone is vying for the prestigious honor of being a survey-timing-guru, but we did feel proud to discover that Versta Research is now cited in the Federal Register as providing authoritative guidance on survey length. A notice filed by the Department of Transportation in Vol. 81, No. 193, October 5, 2016 says:
February 1st, 2017
Suppose your colleague left her job tomorrow and you needed to recreate, and double check, all of the work she produced for a research study just completed. You probably couldn’t do it.
Think of every step along the way: exactly how the survey was programmed, how data were cleaned, why certain respondents and not others were cut, how data were tabulated and bucketed into groups, how analyses were generated, how a segmentation algorithm was finalized, how charts were populated, and how conclusions were formulated.
Ideally, all of this should be documented. Indeed, the most unsexy, and therefore neglected, but critically important trend in marketing and research analytics today is this: reproducible research.
“The term reproducible research refers to the idea that the ultimate product of research is the paper along with the laboratory notebooks and full computational environment used to produce the results in the paper such as the code, data, etc. that can be used to reproduce the results and create new work based on the research.”
How does one do it? Here is a handy list of ten simple rules to follow for reproducible research, recently published by a team of collaborating academic researchers from Norway and the U.S.:
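To give a flavor of what those rules look like in practice, here is a minimal sketch of one of them: scripting every data step, rather than editing a spreadsheet by hand, so a colleague can rerun and audit the work. The file structure and the cleaning rule are hypothetical:

```python
import random

random.seed(20170125)  # fix random seeds so any sampling is repeatable

def clean(rows):
    """Scripted cleaning: every exclusion rule lives in code, not in a
    one-off manual edit, so the cut is documented and reproducible."""
    kept, dropped = [], []
    for row in rows:
        # Hypothetical rule: drop respondents who finished implausibly fast
        if float(row["duration_seconds"]) < 120:
            dropped.append(row)
        else:
            kept.append(row)
    return kept, dropped

rows = [
    {"id": "1", "duration_seconds": "95"},
    {"id": "2", "duration_seconds": "480"},
]
kept, dropped = clean(rows)
```

If your colleague left tomorrow, this script (plus the raw data) is the answer to “why was respondent 1 cut?” — no reconstruction required.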
January 25th, 2017
Research Lessons from the Polling Mess
I have been surprised by the number of people—including friends and even veteran researchers within the market research industry—who believe that the outcome of the 2016 presidential election proved once and for all that surveys and polling do not work.
Why am I surprised? Because it is simply not true.
In the Versta Research Winter 2017 Newsletter we lay out the evidence. It’s not complicated, and I’m pretty sure you will agree.
January 18th, 2017
Reflecting a shift in how applied research is being organized and used in businesses today, 2017 marks the end of two important market research industry groups, and the beginning of a new one. Gone is the Council of American Survey Research Organizations (CASRO), founded 37 years ago. Gone is the Marketing Research Association (MRA), founded 60 years ago. The two have merged into a new group called the Insights Association.
At Versta Research we were one of many who voted in favor of the merger. But I’m a wee bit down on the name “Insights Association.”
January 11th, 2017
Almost certainly more than you think. If you are purchasing access to survey respondents from panel providers, or from survey software providers, or from any other source beyond your own list (and even then, how carefully are you controlling access to your survey?) you are probably getting fraudulent data from automated bots or from survey-taker farms.
We experienced this problem just last week, and here is what we learned:
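Two of the simplest signals for catching fraudulent completes are implausibly fast finishing times and many completes arriving from a single IP address. Here is a sketch of those checks; the thresholds and field names are illustrative assumptions, not a recipe we endorse for every study:

```python
from collections import Counter

def flag_suspect_respondents(responses, min_seconds=120, max_per_ip=3):
    """Flag likely bots or survey-farm completes using two crude signals:
    speeding (too-fast completion) and duplicate IP addresses."""
    ip_counts = Counter(r["ip"] for r in responses)
    flags = {}
    for r in responses:
        reasons = []
        if r["duration_seconds"] < min_seconds:
            reasons.append("speeder")
        if ip_counts[r["ip"]] > max_per_ip:
            reasons.append("duplicate_ip")
        if reasons:
            flags[r["id"]] = reasons
    return flags

responses = [
    {"id": 1, "ip": "198.51.100.1", "duration_seconds": 45},
    {"id": 2, "ip": "198.51.100.2", "duration_seconds": 510},
]
print(flag_suspect_respondents(responses))
```

Flagged cases still deserve a human look before deletion: a fast finish can be a legitimate respondent, and a shared IP can be one household or office.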