Newsletters

JULY 2015


Dear Reader,

Earlier this year we trumpeted the idea of corporate research groups helping internal business partners do their own research. Sometimes it makes tons of sense, and sometimes not. So when it comes time for something more complicated, how do you educate a client about what matters and what doesn't in terms of quality and rigor?

We decided that a five star rating system might help. Add a Star to Your Survey outlines key differences between one star surveys and five star surveys and everything in between. Share it with your colleagues and internal clients. Remind them that the differences are real. Help them understand when and how those differences really matter.

Other items of interest in this newsletter include:

   Make Your Incentive Drawings Legal
   Merging Zip Codes with Census Data
   And You Thought Focus Groups Were Expensive!
   Try to Untangle This Knot of Numbers
   What If People Say No to Your Research?
   When Net Promoter Scores Don't Make Sense
   The New Patriotism: Filling Out Census Surveys
   Talking about Infographics at LIMRA
   Define Millennials Using These Years
   Beautiful (and Mostly Accurate) Venn Diagrams
   Quick-Testing Surveys for Mobile
   Yes, You Can Use Grids on Mobile Surveys

We are also delighted to share with you:

   Versta Research in the News

This section features, among other news, information about our LIMRA presentation on infographics and the latest news about research we did for Wells Fargo and Teva Pharmaceuticals.

As always, feel free to reach out with an inquiry or with questions you may have. We would be pleased to consult with you on your next research effort.

Happy summer,

The Versta Team

 Add a Star to Your Survey

Surveys have gotten easier and easier, and worse and worse, and perhaps the two go hand in hand.

With so many powerful and inexpensive tools available, even people challenged by science and statistics can now distribute polls, crunch numbers, extract metrics and report “insights.” So they churn out lots and lots of surveys (and we do, too). Some of the results are good, some are okay, many are terrible. While I like to think that none of our work is terrible, sometimes we are tasked with the impossible: trying to fix or improve a survey when other parts of the work are fundamentally flawed.

But it's worth understanding what makes for great surveys versus okay surveys versus terrible ones, because we learn from the disasters as well as the triumphs. The disasters are one star surveys. Surprisingly, many organizations do them, and we stumble upon them every day. The triumphs are five star surveys. Few organizations do them (and for good reasons), but they provide superlative standards that we can emulate.

In between the one star surveys and the five star surveys there are two star, three star, and four star surveys as well. Here are the defining characteristics of each, proceeding from best to worst:

★★★★★ 5 Star Surveys

Five star surveys are the gold standard for what all surveys could be if only we had the time and money. Think U.S. Bureau of Labor Statistics, or the U.S. Census Bureau's American Community Survey. Several large, academic research centers, most often funded through large federal grants, produce surveys of this caliber. They are characterized by:
  • Rigorous scientific protocols with surveys that are painstakingly designed, tested, and executed in all phases of work including sampling, questionnaire wording, scale construction, recruitment, and data management.

  • Academic expertise that draws from the very top tiers of social scientists who are defining and advancing the boundaries of shared knowledge.

  • Important topics that are widely useful to both private businesses and public institutions, and that establish critical benchmarks for other research and measurement.

  • Huge budgets and long timelines, as in millions of dollars and many months if not years of planning and fieldwork to execute.
Even if you never come close to fielding a five star survey yourself, it is important to know about them. They provide amazing public resources you can use in your own work, and they set methodological standards against which all other surveys should be measured.

★★★★☆ 4 Star Surveys

Four star surveys are the five star wannabes, and that's a good thing. They come as close as they can, with $30K instead of $3M, and with eight-week timelines instead of two years. They are characterized by:
  • Specialized expertise with personnel trained in academic five star research and who continue to follow and apply important trends in social scientific methods and knowledge.

  • Applied rigor that balances theory and practice, and carefully weighs the costs and benefits of competing priorities when it comes to sampling and fieldwork.

  • Relevant topics that are carefully linked to internal business objectives, or that answer specific questions of general public interest.

  • Affordable budgets and timelines but not super cheap and super fast, because the guiding needs are rigor, credibility, and defensibility.
If you are serious about investing in research, have important decisions to make, or important markets to reach, then a four star survey ought to be your goal. Who does four star surveys? The Pew Research Center, Versta Research, Gallup, Harris, Ipsos, GfK, NORC, and other research groups (maybe your group, too?) that have a sincere focus on bringing scientific social research to business.

★★★☆☆ 3 Star Surveys

Three star surveys offer rough, quick, and dirty quantitative views of a market. They can be valuable in the right circumstances. But most three star surveys masquerade as being more, in part because those who are conducting the surveys are not fully versed in the statistical and fieldwork issues that underlie more rigorous research. We consult for many firms that do these surveys, and we are always surprised that few understand how flimsy they are.

Three star surveys are characterized by one or more of the following:
  • Weaker expertise because the firms involved usually specialize in marketing, brand innovation, strategic planning, advertising, qualitative research, or technology.

  • Narrowly focused on issues that are specific to an immediate need, but lacking relevant and robust data that can speak to larger business questions or objectives.

  • Cheaper and faster because there is minimal effort to build a careful design, manage sample, control fieldwork quality, and clean data.

  • Good enough (one hopes) but inexact, and often just plain wrong, because shortcuts are embedded in the design, errors are overlooked, and analysis is automated and simplified.
We did one of these surveys two months ago, but our objective was clear and limited: we needed a rough idea of brand awareness and purchase frequency to estimate costs for a larger survey about new products. Indeed, three star surveys can deliver a general sense of your market and give direction for better research (if needed), but please don't rely on them for decision-making or for building credibility.

★★☆☆☆ 2 Star Surveys

A survey we came upon recently described having to recruit “like crazy” to achieve “a statistically significant dataset.” How did they do it? By reaching out to all of their friends and contacts, which was cool because contacts turn into leads and sales. And then the survey results “drive traffic, shares, and likes” which was the goal of their research.

This is a two star survey. The authors get credit for picking a topic they believed in and wanted to learn about. But otherwise, two star surveys are characterized by:
  • Zero expertise as evidenced by nonsense statements about methods, samples, margins of error, and statistical significance.

  • Biased samples with an obsessive focus on size, no understanding of representation, and seemingly no awareness of multiple sources of error.

  • Numbers-focused, with statistics reported to absurd levels of precision, weak interpretations of what numbers might mean, and little or no depth about why the data matter.

  • Gimmickry that goes beyond genuine thought leadership and that moves dangerously toward SUGGING and FRUGGING (selling or fundraising under the guise of research), with surveys that are primarily tools to persuade and sell to people.
If you care about quality, two star surveys are, in some ways, more depressing than one star surveys. They perpetuate ignorance and misunderstanding. They bumble along, ignore best practices, and declare false conclusions. But they are wrapped in sophisticated language that sounds semi-real. Under no circumstances should you do them.

★☆☆☆☆ 1 Star Surveys

If two star surveys are so bad, is there anything left for a one star survey? Yes. One star surveys embody willful disregard, even mockery, of what surveys can and should be. They deliberately employ worst practices to grab attention from respondents or from an audience.

You know those offensive questionnaires you get from your congressional representatives asking if you support a strong America or a weak one? Those are one star surveys. They are characterized by:
  • Willful rejection of rigor with phony questions designed to elicit already-known answers, and with convenience-based or deliberately biased sampling.

  • Unimportant topics or questions about things like space aliens invading America during presidential elections, designed to be fun or humorous or simply to score political points.

  • Chart-junk graphics that are distorted, cartoonish, or silly, and that violate multiple principles of effective graphic design for the display of quantitative information.

  • Easy to execute, fast, and cheap because there are no governing principles of sampling, data quality, coding, or effective analysis.
Politicians aren't the only ones promulgating one star surveys. We saw one several weeks ago in the New York Times Magazine about cursing in conversation. I have the print version in front of me, but it seems to be absent from any kind of online archive. Maybe the New York Times staff, too, realized how embarrassingly bad it was?



The implications of all these stars matter. If your business partners are making decisions, you want them referencing rigorous data. If they are publicizing research for thought leadership, you want to bolster their credibility. A survey, by itself, does not guarantee either.

So how do you kick things up a notch and add a star to your survey? Put more expertise and depth of scientific knowledge behind it. Add another level of rigor to its design and execution. Focus a bit more on the importance or relevance of the topic being explored. Communicate the story that the data tell. Emulate five star surveys. Aim for four star surveys. Avoid most of the rest.

 Stories from the Versta Blog

Here are several recent posts from the Versta Research Blog. Click on any headline to read more.

Make Your Incentive Drawings Legal
If you offer survey respondents a chance to win a prize in a randomized drawing, your survey becomes subject to federal and state laws regarding sweepstakes.

Merging Zip Codes with Census Data
Here is an amazing, free resource if you ever need to link your market data to Census data: Quarterly zip code and census tract mapping files from Uncle Sam.
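
If you handle this kind of linkage in Python, pandas makes the join itself a short job. The sketch below is a minimal illustration only, assuming a hypothetical market file keyed by ZIP code and a crosswalk file with lowercase "zip" and "tract" columns; a real government crosswalk file will use its own column names, so adjust accordingly.

    import pandas as pd

    # Hypothetical file and column names, purely for illustration.
    # Read ZIP codes as strings so leading zeros are not lost.
    market = pd.read_csv("market_data.csv", dtype={"zip": str})
    crosswalk = pd.read_csv("zip_to_tract_crosswalk.csv",
                            dtype={"zip": str, "tract": str})

    # Left-join so every market record is kept even when no tract matches.
    # Note: one ZIP code can span several census tracts, so a market record
    # may appear once per matching tract after the merge.
    merged = market.merge(crosswalk[["zip", "tract"]], on="zip", how="left")

    # Flag rows with no match for manual review.
    print(merged["tract"].isna().sum(), "rows had no tract match")

    merged.to_csv("market_data_with_tracts.csv", index=False)

From there, census measures keyed by tract (income, age, and so on) can be joined onto the merged file in exactly the same way.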

And You Thought Focus Groups Were Expensive!
Looking for innovation in qualitative research? How about improvisational comedy? It might make you laugh. For sure it will make you cry when you see the price.

Try to Untangle This Knot of Numbers
Even the fanciest professors writing in the fanciest newspapers have trouble turning research data into clear and compelling stories, as this example illustrates.

What If People Say No to Your Research?
Every survey vendor should be well-versed in current knowledge about people who refuse to take surveys, and they should know how to address it. Here's the latest scientific review.

When Net Promoter Scores Don't Make Sense
New data shows that for many products and services, even ecstatic, loyal customers will give low recommend scores, limiting the utility of an NPS approach.

The New Patriotism: Filling Out Census Surveys
The Census Bureau's annual survey of Americans is always under attack from blathering politicians, but it is vitally important to all of us. Here's why it matters.

Talking about Infographics at LIMRA
Just back from the LIMRA conference: Here is a super snazzy infographic handout from our presentation on “How to Create Spectacular Infographics for Market Research.”

Define Millennials Using These Years
There are many conflicting, inconsistent ways to define generational cohorts like Millennials, Boomers, and Gen Xers. Here are some solid definitions we recommend.

Beautiful (and Mostly Accurate) Venn Diagrams
Venn diagrams are useful conceptually, but often inaccurate and difficult to assess visually. Here is a tool that makes excellent, mostly accurate Venn diagrams.
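
For readers who would rather script this than use a point-and-click tool (we are not assuming this is the tool the post describes), the Python package matplotlib-venn draws Venn diagrams whose regions are scaled to the counts you supply. The counts and labels below are made up solely for illustration.

    import matplotlib.pyplot as plt
    from matplotlib_venn import venn2  # pip install matplotlib-venn

    # Hypothetical counts: respondents aware of the brand only, purchasers
    # only, and respondents in both groups.
    venn2(subsets=(120, 80, 35), set_labels=("Aware of brand", "Purchased"))
    plt.title("Awareness vs. purchase (illustrative data)")
    plt.savefig("venn_example.png", dpi=150)

The companion function venn3 works the same way for three sets, though with three circles the region areas can only be approximately proportional.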

Quick-Testing Surveys for Mobile
Most Internet browsers have tools to simulate how web pages will look on mobile devices. Here is an example using Chrome to ensure that your surveys adapt correctly.


Yes, You Can Use Grids on Mobile Surveys
Don't get rid of grids just because you want to optimize for mobile. A good survey platform can adapt them brilliantly, as shown in this example from our platform.

 Versta Research in the News

LGBT Views on Money and Marriage
Versta Research conducted a survey of LGBT Americans for Wells Fargo in advance of the Supreme Court ruling on same-sex marriage.

Physicians and Patients Uncomfortable Discussing Opioid Abuse
A Versta Research survey about prescription drug abuse for Teva Pharmaceuticals, the American Academy of Pain Management, and the U.S. Pain Foundation offers insights about new challenges in pain management.

Third Wave of Clergy Health Study Completed
Versta Research has been working with the General Board of Pension and Health Benefits of the United Methodist Church since 2009 to understand and track clergy health. The 2015 research results are now available.

Insights on Infographics Presented at LIMRA
Versta Research teamed up with America's Health Insurance Plans, Unum, and Sun Life Financial to offer tips, tricks, and best practices on research infographics at the LIMRA 2015 Marketing & Research conference.
