Data quality is a persistent and critical theme in market research. As companies increasingly turn to multiple data sources to make business-critical decisions, the pressure is on to deliver quality outcomes from primary research.
Knowing that quality begins with survey responses, we at DM2, along with our partners at Rep Data and Research Defender, decided to tackle current issues head-on by conducting in-depth research-on-research. In the study, we assessed the efficacy of applying different screening and data quality techniques in a survey setting. Our hypothesis was that the sweet spot for delivering quality requires coupling soft skills, such as expert, consultative project management, with techniques such as agnostic sampling for representative audiences and advanced fraud mitigation methods.
For the study, we conducted a survey in early Q2 2021 among n=2,002 general-population (gen pop) consumers. Rep Data sourced sample equally from four of the research industry's larger online sample providers. Completes were evenly distributed across five cells, with each provider delivering n=100 to each cell under consistent age and gender quotas. This provided a basis for comparing data across five cells, each subjected to different quality assurance techniques, including Research Defender's proprietary digital fingerprinting, fraud identification, text analytics, and respondent-level tracking.
The five cells in the study included:
1. Raw data: untreated data to provide a baseline for comparison.
2. Data screened only with Research Defender's digital fingerprinting and fraud identification, a proprietary technique that examines potential respondents based on their past external activity before they engage in a survey.
3. Data screened only with Research Defender's respondent engagement and open-end response analysis tool, which measures and scores a respondent's engagement in real time by analyzing the overall quality and thoughtfulness of open-end responses.
4. Data treated on the backend only with Research Defender's respondent-level tracking solution, a method that tracks a respondent's rate of activity across the market research ecosystem and is useful for flagging professional survey takers.
5. Layered approach: data subjected to methods two, three, and four, all applied together.
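Conceptually, the layered cell admits a respondent only when every individual screen passes. The sketch below illustrates that logic; all thresholds, field names, and functions are hypothetical stand-ins, not Research Defender's actual scoring rules or API.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration only.
FRAUD_RISK_MAX = 0.3      # pre-survey fraud-risk ceiling (fingerprinting)
ENGAGEMENT_MIN = 0.6      # open-end engagement floor (text analytics)
WEEKLY_SURVEYS_MAX = 25   # activity cap flagging professional survey takers

@dataclass
class Respondent:
    fraud_risk: float      # from digital fingerprinting / fraud identification
    engagement: float      # from open-end response analysis
    weekly_surveys: int    # from respondent-level activity tracking

def passes_fingerprinting(r: Respondent) -> bool:
    return r.fraud_risk <= FRAUD_RISK_MAX

def passes_engagement(r: Respondent) -> bool:
    return r.engagement >= ENGAGEMENT_MIN

def passes_tracking(r: Respondent) -> bool:
    return r.weekly_surveys <= WEEKLY_SURVEYS_MAX

def passes_layered(r: Respondent) -> bool:
    # The layered cell (method five) requires all three screens to pass.
    return all(check(r) for check in
               (passes_fingerprinting, passes_engagement, passes_tracking))
```

The key point is the `all(...)` composition: a respondent who clears two screens but fails the third is still excluded, which is why the layered cell is stricter than any single-technique cell.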
We measured our results with a previously established quality scoring methodology that has context and benchmarks, something we've dubbed the "Qscore." This methodology leverages trackable, quality-oriented question sets that have been used for many years to assess respondent quality and characteristics. The longevity of these question sets yields significant benchmarks for the United States, drawn from 50K+ interviews in the past year alone. In addition, standard questions from sources such as the U.S. Census were included to provide a foundation for outside comparisons.
We calculated Qscores at the respondent level to provide a baseline for comparison, then reviewed aggregate scores by demo group, provider, and data quality technique. A key finding was that layering data quality techniques positively impacts research outcomes and can lead to a cleaner, healthier, and more efficient market research ecosystem.
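The aggregation step described above, respondent-level scores rolled up by cell and by provider, can be sketched as follows. The records, score values, and field layout here are invented for illustration; they are not the study's actual data or the Qscore formula.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical respondent-level records: (cell, provider, qscore).
records = [
    ("raw",     "provider_a", 62.0),
    ("raw",     "provider_b", 58.5),
    ("layered", "provider_a", 81.0),
    ("layered", "provider_b", 79.5),
]

def aggregate_qscores(rows, key_index):
    """Average respondent-level Qscores by the chosen grouping column."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key_index]].append(row[2])
    return {key: mean(scores) for key, scores in groups.items()}

by_cell = aggregate_qscores(records, 0)      # by quality technique
by_provider = aggregate_qscores(records, 1)  # by sample provider
# e.g. by_cell -> {"raw": 60.25, "layered": 80.25}
```

The same grouping function applies to any column (demo group, provider, or technique), which is what makes cell-to-cell comparisons straightforward once respondent-level scores exist.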
Adding to the quality equation, we found that unbiased sourcing delivers more representative results and that expert project management during fieldwork heads off common challenges in the data collection process. These findings underscore the complexity of the issues facing researchers today, and the deft balance of techniques needed during data collection and fieldwork to produce the very best research outcomes.
The post Quality Insights Depend on Quality Data Collection Processes first appeared on GreenBook.