Ensuring Quality

Visiting lecturer Luke Plonsky spoke on how to ensure high quality in research.

PROVO, Utah (September 24, 2015)—You wouldn’t buy a home built with low-quality materials and tools when another of higher quality was available for the same price. Likewise, researchers are wary of studies conducted with low-quality data and practices.

Luke Plonsky, assistant professor of applied linguistics at Northern Arizona University, visited Brigham Young University to give a series of lectures and encouraged researchers to question their methods and find the most efficient ways to produce high-quality studies.

But how do you determine a study’s quality? To answer this question, Plonsky spent years examining hundreds of research articles – not for their findings, but for the methods their authors had used to reach their conclusions. He looked at the decisions they made, the data they collected and how they reported their results.

Eventually, Plonsky arrived at his own, two-part definition of quality: “(a) adherence to standards of contextually appropriate methodological rigor in research practices and (b) transparent and complete reporting of such practices.” In other words, using methods appropriate to the situation and being honest about how they (and the data) were used.

Too often, researchers employ methods that are far more sophisticated than their studies require. Plonsky identified null hypothesis significance testing (NHST) in particular as an unreliable practice in which a massive amount of data is collected only to be reduced to a crude yes/no answer. He then quoted theorist Bruce Thompson, who described the process as one in which “tired researchers, having collected data on hundreds of subjects, then conduct a statistical test to evaluate whether there were a lot of subjects, which the researchers already know, because they collected the data and know they are tired.”
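To see the critique in action, consider the following Python sketch (a hypothetical illustration, not drawn from the lecture; the sample size and group values are invented). With enough subjects, a t-test declares even a trivial difference “significant,” while the effect size shows how little that verdict means in practice.

```python
# Illustrative simulation of the NHST critique: a huge sample makes a
# negligible difference "statistically significant."
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000                                           # a very large sample per group
group_a = rng.normal(loc=100.0, scale=15.0, size=n)
group_b = rng.normal(loc=100.5, scale=15.0, size=n)   # mean barely higher

t_stat, p_value = stats.ttest_ind(group_a, group_b)

# Cohen's d: the standardized effect size the yes/no answer hides.
pooled_sd = np.sqrt((group_a.var(ddof=1) + group_b.var(ddof=1)) / 2)
cohens_d = (group_b.mean() - group_a.mean()) / pooled_sd

print(f"p = {p_value:.4g}")        # "significant" only because n is enormous
print(f"Cohen's d = {cohens_d:.3f}")  # a negligible effect in practical terms
```

Reporting the effect size alongside the p-value, rather than the yes/no verdict alone, preserves some of the information the data actually contain.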

At the other extreme, though, are researchers using methods that aren’t sophisticated enough. “Often we run multiple statistical tests on the same data that each account for independent variables, because that’s what we’re comfortable with, when actually we could be using a statistic that takes all of our independent variables into account in the same procedure,” explained Jesse Egbert, assistant professor in BYU’s Department of Linguistics and English Language. “This practice of running many univariate analyses on the same data can increase our chances of arriving at the wrong conclusions.”
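To make that concrete (again a hypothetical sketch, not an example from the lecture; the variable names and the statsmodels library are my choices), the snippet below tests five unrelated predictors against a purely random outcome. Tested one at a time at α = .05, the chance of at least one false positive is roughly 1 − 0.95^5, or about 23 percent; a single multiple regression evaluates all five predictors in one procedure with one omnibus test.

```python
# Illustrative contrast: many univariate tests vs. one multivariate procedure.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, k = 200, 5
X = rng.normal(size=(n, k))   # five independent variables, all unrelated...
y = rng.normal(size=n)        # ...to a purely random outcome

# Naive approach: five separate univariate regressions, one test each.
# Each test risks its own false positive, so the family-wise error inflates.
for j in range(k):
    fit = sm.OLS(y, sm.add_constant(X[:, j])).fit()
    print(f"x{j}: p = {fit.pvalues[1]:.3f}")

# One procedure instead: a multiple regression over all predictors at once,
# summarized by a single omnibus F-test.
full = sm.OLS(y, sm.add_constant(X)).fit()
print(f"omnibus F-test: p = {full.f_pvalue:.3f}")
```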

Research thus becomes a balancing act: using methods simple enough that they don’t overcomplicate the data, but sophisticated enough that the data is fully utilized. New methods will continue to be developed and gain popularity, but no single solution will ever meet the needs of every research project. The highest-quality studies, according to Plonsky, are those that carefully tailor their practices to best analyze their unique data.

Whatever those practices are, however, Plonsky stressed the need for transparency. In an anonymous survey, an alarming percentage of researchers confessed to questionable reporting practices: roughly one in six admitted to altering their data, whether to achieve desired results or statistical significance, or due to pressure from funders.

Research practices will keep evolving as researchers learn how best to use their data, understanding that their methodological choices will ultimately decide what they find.

—Samuel Wright (B.A. American Studies ’16)


Samuel covers the Department of Linguistics and English Language for the College of Humanities. He is a junior pursuing a degree in American studies with a minor in editing.