April 19th, 2017
One memorable lesson from reading Edward Tufte’s books about visual displays of quantitative information is that charts are not the only way to display data. Indeed, they are sometimes a worse way. We should always at least consider a table as a superior alternative:
Tables are clearly the best way to show exact numerical values, although the entries can also be arranged in semi-graphical form. Tables are preferable to graphics for many small data sets. A table is nearly always better than a dumb pie chart; the only worse design than a pie chart is several of them, for then the viewer is asked to compare quantities located in spatial disarray both within and between pies.
So when you find yourself stuck trying to visualize data graphically, either because it is too simple or too ridiculously complex, try using a table instead. And remember that beautiful, compelling data tables that instantly communicate information require as much thoughtful design as charts do.
Here are a few tips for designing tables that Turn Data into Stories™:
April 12th, 2017
When Dilbert starts making fun of the most common words in market research—actionable insights—it’s time to consider letting them go.
And now that nearly every marketing and research professional, every market research firm, and every data analysis tool touts a unique ability to offer actionable insight, I think we can safely agree that the phrase is overused and almost meaningless.
Let go of your jargon.
Why? Writer and storyteller Stephanie Buck offers three great reasons:
- There is usually a better way to say things
- Buzzwords lose meaning
- Using jargon doesn’t make us sound smarter
And how do you let it go? She suggests three strategies:
April 5th, 2017
Which of these two questions do you find more annoying, question A or question B?
A. Foodie TV has announced a competition to select the best destination for food. What can your city do to win?
B. In terms of the restaurant choices, what is important for a city to offer?
Now which of these two questions do you find more annoying, A or B?
March 29th, 2017
There are no magic numbers for sample size. There is no such thing as a statistically significant sample. There is no such thing as a statistically significant sample size.
Unfortunately, those two words—statistically significant—are bandied about with such abandon that they are quickly losing their meaning. Even people who should know better (the data wonks at Google Surveys should know better, right?) are saying ridiculous things as they promise to help you “generate statistically significant results.”
Here is a useful passage from the Reference Manual on Scientific Evidence, compiled by the Federal Judicial Center and the National Research Council:
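The statistics back this up: the precision of a survey estimate improves smoothly as the sample grows, with no threshold at which results suddenly become "significant." A minimal sketch of the point (my own illustration, using the standard normal approximation for a proportion, not anything from the manual):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Precision improves gradually as n grows; nothing magical
# happens at any particular sample size.
for n in (100, 400, 1000, 2000):
    print(f"n = {n:4d}: +/- {margin_of_error(n):.1%}")
```

Doubling the sample from 1,000 to 2,000 shaves less than a point off the margin of error; there is no n at which an estimate flips from "not significant" to "significant."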
March 22nd, 2017
If there is one thing I hope you remember from your college statistics class, it’s this: Correlation does not imply causation. This is especially important to remember in our world of big data. Any large dataset will have hundreds of thousands of correlations, but most of those correlations will reflect purely random occurrences that mean nothing.
Welcome to the world of “predictive analytics” – a fancy term for statistical efforts to sift through lots of data looking for correlations, most of which mean nothing.
And for today’s statistics lesson, welcome to the world of higher education, where administrators may have forgotten that correlation does not imply causation. A recent feature article in the New York Times described how big data is being used to predict success among college students:
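The claim about big data is easy to demonstrate: correlate enough purely random variables with each other and you will reliably find hundreds of "significant" relationships, every one of them meaningless. A minimal sketch (my own illustration, not from the Times article):

```python
import numpy as np

# Simulate a "big data" set: 1,000 observations of 200 purely
# random, unrelated variables.
rng = np.random.default_rng(0)
data = rng.standard_normal((1000, 200))

# Compute all pairwise correlations (200 * 199 / 2 = 19,900 pairs).
corr = np.corrcoef(data, rowvar=False)
upper = corr[np.triu_indices_from(corr, k=1)]

# Count pairs that look "significant" at the 5% level
# (|r| > 1.96 / sqrt(n) for n = 1,000). By construction, every
# one of them is a purely random occurrence that means nothing.
threshold = 1.96 / np.sqrt(1000)
print(f"{int((abs(upper) > threshold).sum())} of {upper.size} "
      f"random pairs look 'significant'")
```

Roughly 5% of the nearly 20,000 pairs, about a thousand correlations, clear the significance bar despite being pure noise. That is what sifting a large dataset for correlations buys you.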
March 15th, 2017
If you are a market research professional, I urge you to look for continuing education beyond the MR industry groups such as the Insights Association, Quirk’s, etc. and instead look to a more academic organization like AAPOR, the American Association for Public Opinion Research.
Yes, their conferences will be the most boring you ever attend. (Ever been to an academic conference?) And their positions on issues facing the industry are painfully slow to evolve. (It took the organization years to accept the idea that online polling can work.)
But there is substance and depth to everything that AAPOR does. No hot trends or fading fads. Just enduring, substantive issues that are the core of what we do in market research.
March 8th, 2017
At first I thought this was the dumbest survey question ever.
Why ask people what device they are using to take the survey, when you know from the platform meta-data what device they are using? Yes, even lame self-service survey platforms can (and should) capture information about device, operating system, IP address, start time, exit time, and a host of other data that most people ignore (and should not ignore!) because it does not seem relevant to their survey.
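For the record, inferring device type from that meta-data is trivial. A naive sketch, assuming you can read the HTTP User-Agent header (the function and patterns here are my own illustration; real platforms should use a maintained parsing library):

```python
import re

def device_from_user_agent(ua: str) -> str:
    """Crude device classification from an HTTP User-Agent header.
    Tablets are checked first because their UAs often also say 'Mobile'."""
    if re.search(r"iPad|Tablet", ua, re.IGNORECASE):
        return "tablet"
    if re.search(r"Mobi|iPhone|Android", ua, re.IGNORECASE):
        return "mobile"
    return "desktop"

# A respondent on an iPhone is identified without asking them anything.
print(device_from_user_agent(
    "Mozilla/5.0 (iPhone; CPU iPhone OS 10_2 like Mac OS X) "
    "AppleWebKit/602.3.12 (KHTML, like Gecko) Mobile/14C92"
))  # → mobile
```

So a survey question asking about device is, at best, redundant with data the platform already has.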
Then I realized (well, hoped) that our friends at Amtrak were actually geniuses.
March 1st, 2017
It amazes me that so many marketing research surveys are not responsive and optimized for mobile devices. I understand why they are not optimized, as decent mobile surveys are hard to design, and no, all the junk platforms that tout mobile optimization are not good options. But if you’re hiring a research or consulting firm, shouldn’t they be at the forefront of making your survey easy and accessible for the people you rely on (your respondents!) to supply data?
The answer is yes. And the other thing they ought to be doing is helping you monitor academic research-on-research to keep abreast of how mobile capabilities affect surveys, and therefore how they affect your data and research findings.
Here are some of the latest nuggets of insight we came across in a recent issue of Survey Practice:
February 22nd, 2017
It was so weird, right after the election, to see a media frenzy heaping praise and attention on the one poll that got it wrong. The USC/LA Times poll predicted a Trump win, with 47% of voters choosing Trump and 44% choosing Clinton. Right after the election, media headlines trumpeted its success:
“How One Pollster Saw Trump’s Win Coming”
“The USC/LA Times Poll Saw What Other Surveys Missed”
“What Can Be Learned From Only National Poll That Consistently Showed Donald Trump Leading”
And just last week, Donald Trump praised the poll for doing a great job and presumably getting it right.
The problem is that the USC/LA Times poll is one of the few polls that got it wrong.
February 15th, 2017
It bugs me that companies would use customer feedback surveys not for information or learning, but to manipulate customers into buying from them again. On the other hand, how many companies inundate us with those zillions of customer satisfaction surveys because they truly care about learning? What they really care about are their “metrics,” their dashboards, and their bonuses.
So here goes: I’ll tell you one way to manipulate your customers into liking you more (and I’ll resist telling you, “Don’t do this, because it is kind of obnoxious.”)