Truly Amazing Stats about Phone Surveys

July 19th, 2017

I’ve been reading through a report from the AAPOR Task Force on The Future of U.S. General Population Telephone Survey Research, which I thought might be interesting and provocative, given how oddly reluctant the organization has been to embrace online methods. Only a few years ago did AAPOR (the American Association for Public Opinion Research) finally acknowledge that online, non-probability-based methods may be effective.

Alas, the conclusions of this report are exceptionally bland, with statements like: “There are many clients and researchers who still rely on the telephone for all or at least a part of the surveys that they sponsor and conduct. And, the Task Force anticipates that the telephone will remain an important mode for surveying the general public of the United States for many years to come.”

But buried within the 74-page report are some statistics that will make you blanch if you are thinking that a phone survey is the way to go, or if you have no other option but to conduct one (indeed, there are still some surveys that cannot be done online). Here are some of those amazing stats worth knowing:

Read the rest of this entry »

Help Save the Census Today

July 12th, 2017

It’s the week after Independence Day. You’re doing your best to re-focus on work after the holiday break. If only it could have lasted a few days longer! I can’t bring you a few more days of vacation, but I can offer this suggestion for making the holiday last just a wee bit longer: Bring some patriotic fervor to your work by reminding your colleagues and your representatives in Congress how crucial the Census is to your work.

Read the rest of this entry »

Lousy NYT Survey Makes Researcher Cringe

July 5th, 2017

Bad surveys bug me. Bad surveys touted in The New York Times bug me even more, as I expect only the best from them. Bad surveys on the front page of The New York Times with ridiculous and sensationalist findings bug me so much that they inspire blog posts.

A front page article on July 2, 2017 claimed that many workers “cringe” when working with people of the opposite sex, and that “nearly two-thirds say people should take extra caution around members of the opposite sex at work.”

How do they know? They asked people questions like this:

Read the rest of this entry »

How to Ask Gender on Surveys

June 28th, 2017

Gender used to be one of the easiest questions to write for a survey. There was male and there was female, so we simply asked: “Which are you?” But our culture has begun acknowledging the fluidity of gender identity and gender assignment, and now survey research must, too. Just this week a potential survey respondent sent us a note expressing a willingness to participate but explaining that the gender question was a sticking point, because neither male nor female applied.

But how should we start asking about gender? Read the rest of this entry »

Does Starting with a Story Bias Your Findings?

June 21st, 2017

Two audience members at our LIMRA talk (see Public Release Studies: Advice for the PR Team and PR Studies: Advice for the Research Team) asked questions about biased research findings. Doesn’t starting with the research story and writing dream headlines lead you down a path of finding or confirming what your internal client wants to hear?

I suppose it could bias your research, but in reality writing dream headlines is meant to be the business equivalent of writing out hypotheses in science. You need a clear statement of your research questions and what you expect to find. It is the only way you can design precise methods of inquiry and testing so that you end up with real answers instead of random (and biased) speculation.

And just like in science, you can have all the dream headlines you want. But after that, you need to purposefully design your survey to look for disconfirming evidence. This is the hallmark of scientific inquiry, and it is what makes for truly good research (see: A Quick Puzzle for Market Research Brains).

After you have your dream headlines, then, here is what you need to do: Make it easy for your respondents to tell you the opposite of what you “want” to hear. In other words, make it easy for them to say no.

Here is an example from a recent survey we did. Read the rest of this entry »

PR Studies: Advice for the Research Team

June 14th, 2017

Research for PR is definitely not “research lite,” as my former boss used to think. It is the opposite, requiring more attention to rigor than strategic research does. Why? Because no other research gets such intense scrutiny from people outside our firm and from audiences beyond our clients. It can’t be “directional.” It has to be right. It has to withstand the glare of media attention and a skeptical public.

That’s a challenge we love, which I shared with a generous audience at LIMRA’s 2017 Marketing and Research Conference. Versta Research presented a session on Finding Stories: Building Better Research for PR Campaigns. We offered a “how to” on designing surveys that ensure compelling stories regardless of how the data fall out.

The presentation ended with five tips for the PR side and five tips for the research side of a campaign. Last week we provided a summary of advice for the PR team. Now it is the research team’s turn. If you are a research professional working with your PR team and hoping to turn data into stories, here are the most important things to keep front and center in your effort:

Read the rest of this entry »

Public Release Studies: Advice for the PR Team

June 7th, 2017

Just back from LIMRA’s 2017 Marketing and Research Conference, I am happy to report a successful and productive “meeting of the minds” from both the Marketing Side and the Research Side of this uniquely blended gathering of financial services professionals.

No matter what some may say, research is not part and parcel of marketing. Nor should it be. We have different skills and perspectives, and we are grounded in different disciplines.

Versta Research presented at this conference. Our focus was on Finding Stories: Building Better Research for PR Campaigns. We offered a “how to” on designing surveys that ensure compelling stories regardless of how the data fall out. The presentation ended with five tips for the PR side and five tips for the research side of a campaign. So in case you missed the conference or our presentation, here is a summary of advice for the PR team:

Read the rest of this entry »

Versta Blog Makes Public Opinion Quarterly

May 31st, 2017

This familiar sentence opens the lead article of Public Opinion Quarterly’s 2017 special issue on survey research:

“Telephone surveys are dead, claim some survey researchers, and should be replaced with nonprobability samples from Internet panels.”

Hmm, I’ve said something similar to that in the past, I thought to myself. I guess I’m not the only one. And two pages later:

“…some claim that the persistent low response rates attained from phone probability sample surveys render such probabilistic approaches “almost dead” (Hopper 2014). As a possible replacement, many have touted that “online [opt-in] surveys…work really well” (Hopper 2014)…”

Seeing “Hopper” still didn’t register; I thought how funny it was that another person with my name was toiling away in the halls of academia on the same issues I care about. It was not until I flipped to the back references that I realized, “Hey, they’re citing our blog!” Read the rest of this entry »

Five Cautions for Crowdsourcing

May 24th, 2017

There is a small academic niche in market research that relies heavily on extremely cheap crowdsourcing for data. But before getting up in arms about the reliability and validity of such data, you should know this: They used to rely heavily on students enrolled in their college classes for data. Remember having to work six hours as a “laboratory subject” for Psych 101? Turns out it is cheaper, faster, and easier to find volunteers for these experiments on Amazon’s Mechanical Turk. The Journal of Consumer Research reports that 43% of the behavioral research it published last year was “conducted on the crowdsourcing website Amazon Mechanical Turk (MTurk).”

Market researchers are always looking for new ways to make work cheaper, faster, and easier. So if you are considering crowdsourcing as an option, take a look at JCR’s most recent tutorial, “Crowdsourcing Consumer Research.” It assesses “the reliability of crowdsourced populations and the conditions under which crowdsourcing is a valid strategy for data collection,” and then proposes “specific guidelines for researchers to conduct high-quality research via crowdsourcing.”

Here are five important guidelines they offer, highlighted here because they have clear relevance for all types of sampling, not just crowdsourcing: Read the rest of this entry »

When to Use Multi-Check vs. Yes-No Questions

May 17th, 2017

Here are two ways you might ask a question to document multiple behaviors, purchases, interests, etc. They seem like they would be equivalent, but they are not.

The first is called a multi-check format:

Read the rest of this entry »
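For readers unfamiliar with the two formats, here is a minimal sketch of how they differ structurally. The brand names and question wording below are hypothetical illustrations, not examples from the post itself (which is truncated here): a multi-check question presents one list where respondents select all that apply, while the yes-no approach asks a separate forced-choice question about each item.

```python
# Hypothetical survey items illustrating the two formats.
brands = ["Brand A", "Brand B", "Brand C"]

# Multi-check format: one question, respondents check all that apply.
# Unchecked items are ambiguous -- "no" and "skipped" look identical.
multi_check = {
    "prompt": "Which of the following brands have you purchased "
              "in the past year? (Select all that apply)",
    "options": brands,
}

# Yes-no format: a separate forced-choice question for every item,
# so each item gets an explicit answer from each respondent.
yes_no = [
    {
        "prompt": f"Have you purchased {brand} in the past year?",
        "options": ["Yes", "No"],
    }
    for brand in brands
]
```

The structural difference matters for analysis: the multi-check version yields one question with optional selections, while the yes-no version yields as many questions as there are items, each with a definite response.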