Failing to Listen to the Research… on Research

Guest Post by Jeffrey Henning, 22 February 2019


As researchers, we often lament when our findings are ignored by the recipients of our studies. We advise against a product launch, and it goes ahead anyway. We suggest which priorities to feature in an upgrade, but politics wins the day instead of data. We determine the measures that are most predictive of customer retention, but NPS is picked instead.

Yet, as researchers, we’re not so different. We resist research on research that shows us what best practices we should follow. We use an outdated question format because it’s what we’ve always done. We ask panelists to re-supply their demographics, rather than use their profile data, valuing up-to-the-minute demographics at the expense of panelist satisfaction. We don’t weight online samples, because when we first started doing online surveys, the samples were too skewed to improve with weighting.

Nor am I any different. Reg Baker, the former executive director of the MRII, critiqued one of my questionnaires because I listed choices from negative to positive:

Terrible
Poor
Acceptable
Good
Excellent

He pointed out rich research on research showing that respondents expect such choices to range from most positive down:

Excellent
Good
Acceptable
Poor
Terrible

But it’s how I’ve done it for years, perhaps because I had internalized the numeric scale I used to present:

  1. Terrible
  2. Poor
  3. Acceptable
  4. Good
  5. Excellent

At least I gave in to the research showing that numbers lower the predictive reliability of such scales. I also remember thinking that negative-to-positive scales were good because they forced respondents to think through the negatives first and might keep ratings from being artificially inflated.
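To make the recommended practice concrete, here is a minimal sketch in Python of how a rating question might be defined so that respondents see the labels from most positive down, while the numeric codes are kept for analysis only and never shown. The helper names are hypothetical, not any survey platform’s API:

```python
# Labels in the order respondents expect (most positive first),
# paired with numeric codes used only at analysis time.
RATING_SCALE = [
    ("Excellent", 5),
    ("Good", 4),
    ("Acceptable", 3),
    ("Poor", 2),
    ("Terrible", 1),
]

def render_options(scale):
    """Return the labels to display, with no numbers shown to respondents."""
    return [label for label, _ in scale]

def code_response(scale, label):
    """Map a chosen label back to its numeric code for analysis."""
    return dict(scale)[label]

print(render_options(RATING_SCALE))         # ['Excellent', 'Good', 'Acceptable', 'Poor', 'Terrible']
print(code_response(RATING_SCALE, "Good"))  # 4
```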

Reg pointed me to a framework of heuristics that participants use to interpret survey questions based on their visual appearance:

  1. “Middle means typical.” – Participants treat the visual midpoint of a scale as representing the most typical or middle response.
  2. “Left and top mean first.” – Participants expect choices to follow a logical order, beginning with the leftmost or topmost, and responses are slower when this rule of thumb is violated.
  3. “Near means related.” – Items in a grid are interpreted as being more closely related than items on subsequent screens.
  4. “Up means good.” – When two rating questions are asked one above the other, the item rated on top generally outscores the item rated below it.
  5. “Like means close.” – Participants assume items are similar in concept if they are similar in visual appearance. For instance, when the two ends of a scale share the same hue rather than contrasting hues, participants assume the endpoints are conceptually closer.

This framework was developed by Roger Tourangeau, Mick P. Couper, and Frederick G. Conrad, who write, “Participants may assign meanings to visual cues, such as screen position, that are not actually intended to convey any meaning.” (Public Opinion Quarterly, Volume 77, Special Issue, 2013.)
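As a rough illustration of how the “left and top mean first” heuristic could be operationalized, here is a hypothetical lint-style check that flags an ordinal scale whose labels run negative-to-positive; the positivity ranking below is an assumption for illustration, not drawn from the framework itself:

```python
# Assumed positivity ranking for common ordinal labels (illustrative only).
POSITIVITY = {"Terrible": 1, "Poor": 2, "Acceptable": 3, "Good": 4, "Excellent": 5}

def check_option_order(labels):
    """Warn if a known ordinal scale lists its most negative option first,
    violating the 'left and top mean first' expectation."""
    scores = [POSITIVITY[label] for label in labels if label in POSITIVITY]
    if len(scores) >= 2 and scores == sorted(scores):
        return "warn: scale runs negative-to-positive; lead with the most positive label"
    return "ok"

print(check_option_order(["Terrible", "Poor", "Acceptable", "Good", "Excellent"]))  # warn
print(check_option_order(["Excellent", "Good", "Acceptable", "Poor", "Terrible"]))  # ok
```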

So a mix of habit and rationalization kept me doing things the wrong way. Even after Reg directed me to the literature, I’ve found the internalized habit tough to break.

It’s been humbling, and a reminder that what we were taught as fundamentals may now be obsolete. I recall my surprise when my mother retired in her 50s; she felt nursing had changed too much and that her ingrained habits ran counter to some of the new best practices.

And, of course, many in our industry entered the profession from a different domain, without any formal grounding in market research. Whether you are new to the field or an established veteran, I hope you’ll look at the rich training opportunities available. Your national research association most likely offers paid webinars and in-person workshops. In addition, the UGA/MRII has a new line of self-paced online courses: Principles Express. These on-demand courses, built around interactive exercises, can be completed in 9 to 12 hours; each is designed with the busy researcher in mind and reflects a unique combination of pragmatic business advice informed by research on research.

The MRII/University of Georgia Center for Continuing Education is a proud sponsor of NewMR and has enjoyed a long partnership with Ray Poynter. In fact, Ray has written two of the new Principles Express courses:

  • Introduction to Data Analysis introduces you to the critical concepts common to the analysis of quantitative research data, with special attention to survey data analysis.
  • Advanced Analytic Techniques serves as a primer for some of the more advanced statistical methods you may encounter as a researcher, with greater attention to techniques that are frequently used with secondary data. Topics include conjoint analysis, multiple regression, cluster analysis for segmentation, linear regression, perceptual mapping, and factor analysis.

We hope you’ll check them out.


Jeffrey Henning, PRC, is the new executive director of the MRII, where he is overseeing the rollout of the Principles Express line of on-demand MRX courses.
