
Not All Panels (and Panelists) Are Created Equal!

Why It’s Important to Ask Questions about Panel Quality.

Have you or one of your research vendors used a third-party panel for a survey research project lately? Do you tend to rely on one or two providers that you’ve had good experiences with in the past? Then read on…

Like many researchers, I have a couple of go-to providers that are usually first in line when I am putting together proposals for our clients. They’re at the top of the list because they can usually get me the number of completes I need, they’re cost competitive, and they’ve delivered in the past (note and apology to our non-research readers: “completes” is our jargon for “completed interviews”). Oh, and sending nice gift baskets during the holidays never hurts either.

But when it comes to selecting panel providers, falling into habits and relying on existing relationships can be dangerous. As the marketing research industry relies increasingly on panels for surveys, some research firms may not be doing enough due diligence to ensure that response quality is high enough for clients to base important business decisions on it.

A few things are at play:

Motivation. Panel providers incentivize their respondents with various rewards (and rightly so). But in some cases, panel members are so motivated by the rewards for completing surveys that they give little thought to the questions and simply click through to finish.

Legitimacy. This is particularly important when surveying business audiences or respondents in harder-to-reach segments. We’ve seen respondents claiming to be Operations Directors or Product Managers, only to find that they were actually massage therapists or elementary school teachers checking whatever box they thought would get them screened into a survey (no offence to either profession!).

Bots. A more recent and alarming concern is fraudulent survey responses, where users employ bots to complete hundreds or even thousands of online surveys with no human interaction. For example, check out this survey bot, http://www.ultimatesurveybot.com/, which claims to be “the single greatest automated cash machine of the decade!” On another site, a user boasts: “Here is survey bot attempting to complete a survey with no given information. I can tell you that this did work and running this on 6 Surveys a day for two weeks (fully automated of course) got me the total sum of £14.95p, with no user interaction whatsoever!”

With all these potential quality issues, I find it odd that so few of our research buyer clients ask which panel we are using or how we go about vetting the providers we work with to help ensure data quality.

Of course, when we receive the data and prep for analysis, we do what any respectable marketing research provider should do: check the data for speeders who complete the survey in an unreasonably short time; review open-ends to remove any respondents who provide nonsensical comments (just yesterday I removed a respondent for commenting about his former life as a dog – true story!); identify and eliminate straight-liners; look for discordant responses; and so on.
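For readers who like to see these screens concretely, the after-the-fact checks above can be sketched in a few lines of code. This is a minimal illustration, not our production process; the record layout, speeder cutoff, and junk-text list are all hypothetical.

```python
# Minimal sketch of post-fieldwork quality screens. The record layout,
# cutoff, and junk-text list below are hypothetical, for illustration only.

ASSUMED_MEDIAN_SECONDS = 600                  # assumed median completion time
SPEEDER_CUTOFF = ASSUMED_MEDIAN_SECONDS / 3   # common rule of thumb: < 1/3 of median

def is_speeder(resp):
    """Flag respondents who finished implausibly fast."""
    return resp["seconds"] < SPEEDER_CUTOFF

def is_straight_liner(resp):
    """Flag respondents who gave the identical answer to every grid item."""
    grid = resp["grid_answers"]
    return len(grid) > 1 and len(set(grid)) == 1

def is_junk_open_end(resp):
    """Crude screen for empty or throwaway open-ended comments."""
    text = resp["open_end"].strip().lower()
    return len(text) < 3 or text in {"asdf", "none", "n/a", "xxx"}

def clean(responses):
    """Drop any respondent who trips at least one screen."""
    return [r for r in responses
            if not (is_speeder(r) or is_straight_liner(r) or is_junk_open_end(r))]

responses = [
    {"seconds": 700, "open_end": "Good value overall", "grid_answers": [3, 4, 2, 5]},
    {"seconds": 90,  "open_end": "Liked it",           "grid_answers": [3, 4, 2, 5]},  # speeder
    {"seconds": 650, "open_end": "asdf",               "grid_answers": [3, 4, 2, 5]},  # junk text
    {"seconds": 800, "open_end": "It was fine",        "grid_answers": [4, 4, 4, 4]},  # straight-liner
]
print(len(clean(responses)))  # only the first respondent survives -> 1
```

In practice each screen would be tuned to the specific survey (a long questionnaire warrants a different speeder cutoff than a five-minute poll), but the shape of the logic is the same.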

However, that’s all after the fact. What we need to be doing is minimizing the risk of getting bad data in the first place.

What do we look for when evaluating panel providers on quality? We want to ensure that they are evolving their technologies to identify low-quality respondents and, especially, fraudsters, for instance by using IP geolocation and third-party identity verification technology. Other things we look for include:

  • Whether they require panelists to submit proof of identification and residence
  • Whether they assign panelists quality scores based on their completion history
  • What warnings and penalties their terms of service communicate to panel members for low-quality and fraudulent responses
  • What actions they actually take against low-quality members and fraudsters
  • Whether they compare demographic information collected in surveys against the panelist’s profile to ensure the two align (e.g. panelist age)
  • What traps, if any, they employ within surveys to detect fraudsters and respondents who answer without reading the questions
  • Whether they target sample to members who match the screening criteria (to reduce the risk that an unqualified member screens through)
  • Whether they need to partner with other panels to meet the target completions, and if so, who they partner with
  • Whether they in fact have their own panel (some so-called panel providers are actually just brokering a multitude of panels they don’t own or manage)
  • What the panel member experience is actually like (we check this by joining the panel ourselves)
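To make one item on that list concrete, the demographic-consistency check can be as simple as comparing an in-survey answer against the panel profile. The function below is a hypothetical sketch, not any provider’s actual implementation; the one-year tolerance simply allows for a profile that has gone slightly stale.

```python
# Hypothetical sketch of a demographic-consistency check: does the age a
# respondent reports in-survey agree with the age on file in their profile?

def age_mismatch(profile_age, survey_age, tolerance=1):
    """Return True when the two ages disagree beyond a small tolerance
    (a panel profile may be up to a year out of date)."""
    return abs(profile_age - survey_age) > tolerance

print(age_mismatch(52, 29))  # a 23-year gap is a red flag -> True
print(age_mismatch(41, 42))  # off by one year is acceptable -> False
```

A provider running checks like this across every profiled attribute, not just age, is in a much better position to catch the "Operations Director" who is really a massage therapist.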

To sum up, not all panels are created equal. Purchasers of marketing research services need to look beyond reasonable cost, feasibility, and the ability to meet timelines when selecting a provider. They also need to be wary of relying on past history and relationships, and should confirm that their panel providers are keeping up with advances in fraud detection and have active practices to manage panelist quality.

Written by Christine Sorensen

A Vice President at Phase 5, Christine Sorensen is a veteran researcher with expertise in both quantitative and qualitative techniques. Passionate about client service, Christine has extensive experience on studies across a range of topics, including brand and communications, interactive technology, customer satisfaction, and product development. Christine holds an MA in Communications and Culture from York-Ryerson Universities and a BA in Sociology from the University of Toronto.