
The data quality reckoning: why trust must be the industry’s North Star

The market research industry is at a crossroads. Trust in the quality of survey data is under serious pressure, and the consequences are already being felt.

At the most recent Market Research Society (MRS) Conference, Shaping Our Future, this theme came through loud and clear: confidence in data is slipping, and the industry must act quickly to restore it.

A crisis of confidence

Recent findings shared at the MRS event, drawn from the Insights Association Buyer Sentiment Score, show just how deep the cracks run:

  • 1 in 3 buyers report a crisis of confidence in the data they receive.
  • 40% of survey data is being discarded due to fraud or low-quality responses.
  • 1 in 5 participants abandon surveys because of poor design or user experience.

These concerns are not abstract. In conversations with clients and prospects, we are seeing budgets move away from traditional panels, not out of innovation but out of necessity, in the pursuit of the best data possible. And if we don’t address this crisis now, we risk further erosion in both trust and spend.

The synthetic data dilemma

At the same time, the industry is charging ahead with synthetic data, which the Synthetic Data Generation Business Report 2024 suggests will make up 25% of research spend by 2030. But here too, quality matters.

The phrase “garbage in, garbage out” has been bandied around quite a lot, and with poor-quality primary data come biased, untrustworthy synthetic outputs. We’ve seen real-world examples, like Amazon’s AI hiring tool, where flawed training data led to skewed, exclusionary outcomes.

But what does this mean for businesses? Synthetic data holds huge potential: developed and used correctly, it can give us more robust views of every audience, providing richer insight and sharper strategies, not least for niche and under-represented groups and markets. But to get there, we all need to work on turning the garbage going in into gold.

The call to action: industry-wide commitment to quality

What’s needed is clear: trustworthy data, underpinned by a unified, proactive approach from brands, agencies, panels and research platforms.

In response, a Global Data Quality Pledge has been created, and it is a step in the right direction for the industry: a chance for all parties involved to align on standards, share accountability and collaborate on solutions.

With everyone involved and on the same page, I think we’ll quickly reach a stage where trust improves and data is once again at the forefront of most industry decisions, even amongst the more challenging ‘non-data’ CEOs.

However, as an industry we need to make sure that:

  • We are confident in the quality controls of our sample providers
  • We understand how our panels are curated and verified
  • We are prepared to challenge low-cost options if they compromise quality

What Attest is doing and what comes next

At Attest, quality has always been core to how we operate, but we recognise that the challenge is evolving, and so must we. Some of what we’re already doing includes:

  • Rigorous quality checks covering open-text quality, speeding through surveys, duplicate responses, impossible answers, bot detection and more (a simple illustration of these kinds of checks follows this list).
  • Working with trusted aggregators and continuously reviewing our supply chain for integrity.
  • Encouraging clients to have open, constructive conversations about data quality, so we can help flag issues early and solve them together.
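
For readers curious what checks like these look like in practice, here is a minimal, purely illustrative sketch of two of them: speeding and duplicate open-text detection. It is not Attest’s actual implementation; the field names, thresholds and function are hypothetical, and real systems combine many more signals.

```python
from dataclasses import dataclass, field

# Hypothetical response record; real platforms capture far richer metadata.
@dataclass
class Response:
    respondent_id: str
    completion_seconds: float
    open_text: str
    flags: list = field(default_factory=list)

def flag_quality_issues(responses, median_seconds, min_speed_ratio=0.33):
    """Illustrative checks only: speeding and duplicate open-text answers.

    Thresholds are hypothetical; real checks would also cover bots,
    impossible answers, straightlining and open-text language quality.
    """
    seen_text = {}
    for r in responses:
        # Speeding: finishing far faster than the median suggests low engagement.
        if r.completion_seconds < median_seconds * min_speed_ratio:
            r.flags.append("speeding")
        # Duplicates: identical free-text answers across respondents are suspicious.
        key = r.open_text.strip().lower()
        if key and key in seen_text:
            r.flags.append("duplicate_open_text")
        else:
            seen_text[key] = r.respondent_id
    return [r for r in responses if r.flags]
```

In practice, responses flagged in this way would be reviewed, removed or replaced before results ever reach a client.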

But we’re also looking ahead:

  • We’re further exploring verified identity solutions to continue ensuring respondents are real, engaged people.
  • We’re considering how we can formally align with the Global Data Quality Pledge, and push our partners to do the same.
  • We’re building an internal framework to evaluate AI-generated data, testing for bias, trust and representativeness, because we know the future of research will involve human and machine collaboration.

The bottom line: quality isn’t cheap, but it’s priceless

As an industry, I think we have to become more comfortable talking about the value of quality in research. That means pushing back when “cheap data” feels too good to be true, and questioning findings that look a bit off. In most cases the data will be accurate, but holding everyone accountable is what keeps it that way.

It also involves educating clients on the risks of low-quality inputs, whether that’s the impact on internal decisions or strategies built on untrustworthy data. If additional time is needed to achieve good quality, that’s fine: your business will always appreciate it.

Finally, it means that we all need to invest in better standards. Gone are the days when a respondent should cost 30p per complete; if the industry expects good-quality respondents, it needs to pay for them.

It’s time to rebuild that foundation, one high-quality data point at a time.

Nicholas White

Head of Strategic Research 

Nick joined Attest in 2021, with more than 10 years’ experience in market research and consumer insights on both agency and brand sides. As part of the Customer Research Team, Nick takes a hands-on role helping customers uncover insights and opportunities for growth.
