Fast-Food-Loving Cornell Prof Faces Ethical Scrutiny

Critical researchers cite data inconsistencies and instances of “self-plagiarism.”

Cornell Food and Brand Lab's Brian Wansink (Associated Press)

Update, September 20, 2018: Brian Wansink has announced his retirement, and Cornell University’s internal investigation has “found that Professor Wansink committed academic misconduct in his research and scholarship, including misreporting of research data, problematic statistical techniques, failure to properly document and preserve research results, and inappropriate authorship.” More here.

Update, September 19, 2018: The Journal of the American Medical Association has retracted six articles about studies that included Wansink’s work after Cornell University informed the editors, “We regret that, because we do not have access to the original data, we cannot assure you that the results of the studies are valid.”

Update, April 6, 2017: After an internal investigation, Cornell University found errors in Wansink’s work but no misconduct, Retraction Watch reports.

In 2014, I profiled Brian Wansink, a behavioral psychologist who studies how our surroundings affect our eating habits. Wansink runs Cornell University’s Food and Brand Lab, a prolific group known for its clever dining research. One widely cited study, for example, found that people who kept their breakfast cereal in a cabinet weighed 21 pounds less on average than those who kept it on the counter; another showed that diners who sat near a restaurant’s entrance were 73 percent less likely to order dessert than those who sat in the restaurant’s interior.

I wasn’t the only one who thought Wansink’s work was cool. His research—some 200 studies since 2005—regularly makes headlines. But in January, a team of researchers reanalyzed the data from four of the Food and Brand Lab’s studies about pizza and turned up what appear to be serious problems: The researchers spotted 150 data inconsistencies. As Columbia University statistician Andrew Gelman put it in a blog post: “Although the four papers were all based on the same data, they differed in all sorts of detail, which suggested that the authors opportunistically used data exclusion, data coding, and data analysis choices to obtain publishable (that is, p less than .05) results.”

In a blog post on Thursday, one of the researchers, University of Groningen Ph.D. student Nick Brown, pointed to what appear to be several instances of self-plagiarism in Wansink’s writing. Brown also found that the data from two of Wansink’s studies (one from 2001, another from 2003) “appear to be almost identical, despite purportedly reporting the results of two completely different studies.”

Wansink declined to comment on the accusations. Instead, he pointed to a statement on the lab’s website, where he writes, “We are currently conducting a full review of studies in question, preparing comprehensive data which will be shared and establishing new standards for future operations at the lab which will include how we respond to requests for research information.”

The statement also notes that Wansink has enlisted a Food and Brand Lab member who wasn’t involved in the studies to reanalyze the data in question. This move has raised some eyebrows in the scientific community: Why not hire an independent researcher? Here’s how Wansink answered that question in a Q&A with the scientific integrity watchdog blog Retraction Watch:

That’s a great question, and we thought a lot about that. In the end, we want to do this as quickly and accurately as possible—get the scripts written up, state the rationale (i.e., why we made particular choices in the original paper), and post it on a public website. Also, because this same researcher will also be deidentifying the data, it’s important to keep everything corralled together until all of this gets done.

But before we post the data and scripts, we also plan on getting some other statisticians to look at the papers and the scripts. These will most likely be stats profs who are at Cornell but not in my lab. We’ve already requested one addition to [the Institutional Review Board (IRB)], so that’s speeding ahead.

But even though someone in my lab is doing the analyses, like I said, we’re going to post the deidentified data, the analysis scripts (as in, how everyone is coded), tables, and log files. That way everyone knows exactly how it’s analyzed and they can rerun it on different stats programs, like SPSS or STATA or SAS, or whatever. It will be open to anyone. I’m also going to use this data for some stat analysis exercises into one of my courses. Yet another reason to get it up as fast as possible—before the course is over.

In the same Q&A, Wansink defended his work on methodological grounds. “These sorts of studies are either first steps, or sometimes they’re real-world demonstrations of existing lab findings,” he said. “They aren’t intended to be the first and last word about a social science issue. Social science isn’t definitive like chemistry. Like Jim Morrison said, ‘People are strange.’ In a good way.”

Cornell has declined to intervene. In a statement to New York magazine, John J. Carberry, the university’s head of media relations, wrote, “While Cornell encourages transparent responses to scientific critique, we respect our faculty’s role as independent investigators to determine the most appropriate response to such requests, absent claims of misconduct or data sharing agreements.”

I’ll be tracking this story and will post updates as they occur.

Thank you!

We didn't know what to expect when we told you we needed to raise $400,000 before our fiscal year closed on June 30, and we're thrilled to report that our incredible community of readers contributed some $415,000 to help us keep charging as hard as we can during this crazy year.

You just sent an incredible message: that quality journalism doesn't have to answer to advertisers, billionaires, or hedge funds; that newsrooms can eke out an existence thanks primarily to the generosity of its readers. That's so powerful. Especially during what's been called a "media extinction event" when those looking to make a profit from the news pull back, the Mother Jones community steps in.

The months and years ahead won't be easy. Far from it. But there's no one we'd rather face the big challenges with than you, our committed and passionate readers, and our team of fearless reporters who show up every day.
