Versta Blog Makes Public Opinion Quarterly

May 31st, 2017

This familiar sentence opens the lead article of Public Opinion Quarterly’s 2017 special issue on survey research:

“Telephone surveys are dead, claim some survey researchers, and should be replaced with nonprobability samples from Internet panels.”

Hmm, I’ve said something similar to that in the past, I thought to myself. I guess I’m not the only one. And two pages later:

“…some claim that the persistent low response rates attained from phone probability sample surveys render such probabilistic approaches “almost dead” (Hopper 2014). As a possible replacement, many have touted that “online [opt-in] surveys…work really well” (Hopper 2014)…”

Seeing “Hopper” still didn’t register, as I thought how funny it was that another person with my name was toiling away in the halls of academia on the same issues I care about. It was not until I flipped to the back references that I realized, “Hey, they’re citing our blog!” Read the rest of this entry »

Five Cautions for Crowdsourcing

May 24th, 2017

There is a small academic niche in market research that relies heavily on extremely cheap crowdsourcing for data. But before getting up in arms about the reliability and validity of such data, you should know this: They used to rely heavily on students enrolled in their college classes for data. Remember having to work six hours as a “laboratory subject” for Psych 101? Turns out it is cheaper, faster, and easier to find volunteers for these experiments on Amazon’s Mechanical Turk. The Journal of Consumer Research reports that 43% of the behavioral research it published last year was “conducted on the crowdsourcing website Amazon Mechanical Turk (MTurk).”

Market researchers are always looking for new ways to make work cheaper, faster, and easier. So if you are considering crowdsourcing as an option, take a look at JCR’s most recent tutorial, “Crowdsourcing Consumer Research.” It assesses “the reliability of crowdsourced populations and the conditions under which crowdsourcing is a valid strategy for data collection,” and then proposes “specific guidelines for researchers to conduct high-quality research via crowdsourcing.”

Here are five important guidelines they offer, highlighted here because they have clear relevance for all types of sampling, not just crowdsourcing: Read the rest of this entry »

Good Reasons to Ask Bad Questions

May 10th, 2017

In the Versta Research spring newsletter, Build a Better Customer Satisfaction Survey, we mentioned—but did not speak to—the seventh question in our newly developed survey for clients. It was added at the last minute. We put it right at the top, so it is the first question you see.

If you haven’t yet test-driven the survey (please do—it gives you a peek into the overall aesthetics and functionality of how we build all manner of surveys), here is that seventh question:

A Not-Very-Useful Question

Why didn’t we address this question in the article? Because the focus of that article was on super-duper useful questions. And unlike the other six questions, this one is not terribly useful. In one sense, it is a bad question. Read the rest of this entry »

Build a Better Customer Satisfaction Survey

April 26th, 2017

We have never deployed a customer satisfaction survey for our own customers. Why? Because most CX surveys are not designed to help customers.

But we’ve just taken inspiration from a unique survey from Skype for Business, in order to Build a Better Customer Satisfaction Survey. It’s our feature article in this quarter’s newsletter. It highlights the serious problems with most CX surveys, and offers a strategy to fix them. (Hint: Dump the NPS question!)

What’s wrong with customer satisfaction surveys today? To answer that, I reflected on why I, as a customer, almost never fill them out: Read the rest of this entry »

Try Using Tables Instead of Charts

April 19th, 2017

One memorable lesson from reading Edward Tufte’s books about visual displays of quantitative information is that charts are not the only way to display data. Indeed, they are sometimes a worse way. We should always at least consider a table as a superior alternative:

Tables are clearly the best way to show exact numerical values, although the entries can also be arranged in semi-graphical form. Tables are preferable to graphics for many small data sets. A table is nearly always better than a dumb pie chart; the only worse design than a pie chart is several of them, for then the viewer is asked to compare quantities located in spatial disarray both within and between pies.

So when you find yourself stuck trying to visualize data graphically, either because it is too simple or too ridiculously complex, try using a table instead. And remember that beautiful, compelling data tables that instantly communicate information require as much thoughtful design as charts do.

Here are a few tips for designing tables that Turn Data into Stories™: Read the rest of this entry »

A New KPI for Jargon-Free Research

April 12th, 2017

When Dilbert starts making fun of the most common words in market research—actionable insights—it’s time to consider letting them go.

And now that nearly every marketing and research professional, every market research firm, and every data analysis tool touts a unique ability to offer actionable insight, I think we can safely agree that the phrase is overused and almost meaningless.

Let go of your jargon.

Why? Writer and storyteller Stephanie Buck offers three great reasons:

  1. There is usually a better way to say things
  2. Buzzwords lose meaning
  3. Using jargon doesn’t make us sound smarter

And how do you let it go? She suggests three strategies: Read the rest of this entry »

Please Do Not Gamify Your Surveys

April 5th, 2017

Which of these two questions do you find more annoying, question A or question B?

A. Foodie TV has announced a competition to select the best destination for food. What can your city do to win?

B. In terms of the restaurant choices, what is important for a city to offer?

Now which of these two questions do you find more annoying, A or B? Read the rest of this entry »

Statistically Significant Sample Sizes

March 29th, 2017

There are no magic numbers for sample size. There is no such thing as a statistically significant sample. There is no such thing as a statistically significant sample size.

Unfortunately, those two words—statistically significant—are bandied about with such abandon that they are quickly losing their meaning. Even people who should know better (the data wonks at Google Surveys should know better, right?) are saying ridiculous things as they promise to help you “generate statistically significant results.”
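To see why a sample size alone confers no "significance," consider the standard margin-of-error formula for a proportion. This is a minimal sketch of my own (not drawn from Google Surveys or the manual cited below): precision improves smoothly as the sample grows, and there is no threshold at which results suddenly become significant.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% confidence margin of error for an observed proportion p
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Precision improves gradually with n -- there is no magic cutoff
for n in (100, 400, 1000, 2500):
    print(f"n = {n:>4}: +/- {margin_of_error(0.5, n):.1%}")
```

Note that halving the margin of error requires roughly quadrupling the sample; nowhere along the way does a sample "become" statistically significant.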

Here is a useful passage from the Reference Manual on Scientific Evidence, compiled by the Federal Judicial Center and the National Research Council: Read the rest of this entry »

Data Geniuses Who Predict the Past

March 22nd, 2017

If there is one thing I hope you remember from your college statistics class, it’s this: Correlation does not imply causation. This is especially important to remember in our world of big data. Any large dataset will have hundreds of thousands of correlations, but most of those correlations will reflect purely random occurrences that mean nothing.

Welcome to the world of “predictive analytics” – a fancy term for statistical efforts to sift through lots of data looking for correlations, most of which mean nothing.
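A quick simulation makes the point concrete. This sketch (entirely made-up data, not from the article) generates a few dozen unrelated variables, correlates every pair, and counts how many clear a conventional 5% significance bar purely by chance:

```python
import random
import statistics

random.seed(42)

# 50 unrelated "metrics," each observed 30 times -- pure noise
n_vars, n_obs = 50, 30
data = [[random.gauss(0, 1) for _ in range(n_obs)] for _ in range(n_vars)]

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sx, sy = statistics.stdev(x), statistics.stdev(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / ((len(x) - 1) * sx * sy)

# |r| > 0.36 is roughly the two-sided 5% cutoff for n = 30
spurious = sum(
    1
    for i in range(n_vars)
    for j in range(i + 1, n_vars)
    if abs(pearson(data[i], data[j])) > 0.36
)
pairs = n_vars * (n_vars - 1) // 2
print(f"{spurious} of {pairs} pairs look 'significant' by chance alone")
```

At a nominal 5% false-positive rate, roughly one in twenty of the 1,225 pairs will look "significant" even though every variable is random noise, which is exactly the trap waiting in any large dataset mined for correlations.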

And for today’s statistics lesson, welcome to the world of higher education, where administrators may have forgotten that correlation does not imply causation. A recent feature article in the New York Times described how big data is being used to predict success among college students: Read the rest of this entry »

Dump the Hot Trends for These 7 Workshops

March 15th, 2017

If you are a market research professional, I urge you to look for continuing education beyond the MR industry groups such as the Insights Association, Quirk’s, and the like, and instead look to a more academic organization like AAPOR, the American Association for Public Opinion Research.

Yes, their conferences will be the most boring you ever attend. (Ever been to an academic conference?) And their positions on issues facing the industry are painfully slow to evolve. (It took the organization years to accept the idea that online polling can work.)

But there is substance and depth to everything that AAPOR does. No hot trends or fading fads. Just enduring, substantive issues that are the core of what we do in market research. Read the rest of this entry »