Market research can be the gold standard for data collection
The market research industry is (and has been, for some time) grappling with a significant challenge: the erosion of data quality. This issue, as JD Deitch articulates in his e-book The Enshittification of Programmatic Sampling, arises from systemic market failures that are stifling trust, reducing participant engagement, and limiting access to reliable first-party data. While these failures create real risks, they also present a critical opportunity for meaningful change.
Encouragingly, a substantive and necessary conversation is already underway. I agree with much of what’s being said – but also believe there are lessons from other industries that we cannot afford to ignore.
Understanding the problem: a “lemon market” dynamic
The survey data market exhibits the hallmarks of what economist George A. Akerlof termed a “lemon market.” In such markets, buyers cannot distinguish between high- and low-quality goods, and sellers cannot credibly signal the quality of their offerings. This lack of transparency depresses prices, drives out high-quality providers, and fosters an environment of distrust.
For the research industry, this dynamic is exacerbated by the complex two-sided nature of the market. On one side, brands seek actionable consumer insights. On the other, research participants – the true source of first-party data – engage in what I call “labour-like” agreements, providing their data in exchange for compensation. The intermediaries that connect these two sides often operate with limited transparency, further compounding the problem.
Key turning points in programmatic sampling
Several historical developments have contributed to the current state of the market:
Web 2.0: As digital platforms became more engaging, the intrinsic appeal of online surveys waned, making it harder to attract participants
Expanding the pool: Intermediaries prioritised growing the participant base through techniques like river sampling, which reduced costs but introduced complexity and opacity into the sampling process
Chasing yield: The industry’s reliance on advanced algorithms and marketplaces to maximise yield has increased the liquidity of data, but at the expense of transparency and quality
These trends have created a system strained by misaligned incentives, declining participant engagement, and pervasive data quality issues.
Finding solutions: transparency and collaboration
To address these challenges, we must first tackle the information asymmetry at the heart of the lemon market. Transparency is the foundation for restoring trust between buyers, intermediaries, and participants. Just as independent standards have helped bring accountability to other industries, market research needs structures that provide clear, credible indicators of data quality.
By introducing independent signals of data quality, akin to credit ratings in financial markets or brand safety metrics in digital advertising, we can incentivise higher standards and foster a healthier ecosystem. A clearinghouse model, where data quality is openly evaluated and shared across the industry, could be a game-changer, mitigating information asymmetry while promoting trust and collaboration.
Moreover, the industry must address what Deitch describes as a “prisoner’s dilemma,” where firms hesitate to improve quality for fear of losing their competitive edge. This dilemma persists because of a lack of communication and coordination. A more open approach, where key stakeholders align around shared quality benchmarks, can help resolve this standoff and create industry-wide progress.
Reasons for optimism
Despite the challenges, I believe the industry is at a turning point. Several emerging trends give me hope:
Increased demand for high-quality data: The rise of generative AI and the need for robust training datasets are driving demand for reliable first-party data, creating new opportunities for market research
Innovative platforms and models: Companies are experimenting with vertically integrated platforms, improved participant incentives, and advanced data collection techniques to boost engagement and quality
Proven solutions from other industries: Finance, healthcare, and digital advertising have all implemented mechanisms to increase transparency, enforce standards, and protect data integrity. Market research can and should follow suit.
By adopting these approaches, we can position market research as the gold standard for first-party data collection, meeting the growing needs of modern businesses while rebuilding trust and elevating quality.
The road ahead requires bold thinking and collective action, but I am confident in our industry’s ability to rise to the challenge. Together, we can unlock new opportunities and ensure that market research continues to thrive in the years to come.
Bob Fawson is founder and chief executive of Data Quality Co-Op

Research Live is published by MRS.