How A Budget Squeeze Can Lead To Sloppy Science And Even Cheating

Apr 14, 2017
Originally published on April 14, 2017 3:44 pm

A funding crunch for scientific research is creating incentives for scientists to cut corners and even occasionally to cheat.

This is one of the findings in a new report about scientific integrity from the National Academies of Sciences, Engineering, and Medicine.

Sometimes scientists adopt sloppy practices that can lead to false conclusions. This can hamper progress in science. And taxpayer dollars are on the line.

Consider the story of a genetics lab at the University of Wisconsin. Mary Allen was a graduate student in that lab in 2005. One postdoctoral researcher had been laid off because of a funding shortage, and the professor in charge of the lab was scrambling to keep the laboratory afloat by seeking more grants.

But Allen and her five fellow graduate students noticed that a grant document didn't accurately describe work that had been previously done in the lab.

"We weren't certain it was falsification," Allen says. "It could have been a mistake. The results sounded slightly better than they really were."

Allen and her fellow graduate students faced a difficult decision.

"If it's really falsification, we may not have the ability to keep going in grad school, or they may ask us to go to another lab and start new projects," she says. "Either way, that would be a huge hit to everybody's career."

The graduate students decided to talk to the department chairman about the issues they'd found. Ultimately the professor quit, and later pleaded guilty to scientific misconduct. This story was first reported by Science magazine in 2010.

The students in the lab liked their professor and thought she was doing good work. So why did the professor tinker with the grant report in the first place?

"I think one of the reasons she did it was she was under so much stress about getting funding for the students," Allen says. So, "she decided tweaking the data a little to make it look better would allow her to get a grant and therefore fund us."

The former professor did not respond to NPR's request for comment.

Stories of outright misconduct like this are rare in science. But the pressures on scientists manifest in many more subtle ways.

If people are working as hard as they can and as smart as they can, they may look for other ways to get a further edge to succeed in their careers, says social scientist Brian Martinson at the HealthPartners Institute in Minneapolis.

"Some proportion of people might find themselves making bad decisions and cutting corners," he says.

Martinson has surveyed university scientists and asked them about behaviors that he calls undesirable. These can include sloppy data handling, lax protection of patient privacy and bending other rules.

"Almost half of the scientists who responded to our survey said they had engaged in at least one of those activities in the prior three years," Martinson says. And many said they had violated multiple standards.

"When you get people engaging in that many kinds of consistent, undesirable practices, this can certainly undermine the quality of the work," he says, "and therefore the ability to reproduce it."

Many studies that get published in the biomedical literature can't be reproduced in other labs. This slows progress in medical research, because scientists spend a lot of time chasing down false leads. That hampers efforts to understand disease and develop treatments.

Some of this is unavoidable, simply because scientists are exploring the edges of knowledge. But there's plenty of room for improvement.

"If you've got people who are cutting corners, if you've got people who are doing things to undermine the quality of research, you've got to ask why," Martinson says.

Sometimes scientists simply don't know better. Occasionally scientists cheat deliberately. But often these behaviors are driven by bad incentives in the system.

"I think what we're really talking about here is human nature," says C.K. Gunsalus, director of the National Center for Professional and Research Ethics at the University of Illinois. She and Martinson both served on the National Academies' committee on research integrity.

"If you're in an environment that has very high stakes and very low chance of success, those are two of the predictors of environments in which people are going to cheat," Gunsalus says.

That's exactly the environment where many scientists find themselves today. There are strong career incentives to bend the rules, by exaggerating accomplishments in a grant proposal, for example.

"One of the rules in life is if you reward bad conduct you're going to get a lot more bad conduct," Gunsalus says, "because even people with Ph.D.s can figure out what you're rewarding and say, 'Ooh! If that's what it takes to get ahead, I can do that.' "

But if scientists see everyone else playing by the rules, they are more likely to do so as well. That's why Gunsalus, who swoops into troubled academic departments to fix dysfunction, looks to see whether the leaders are setting a good example. If they are, others are likely to follow.

"People do fundamentally care about the rigor and integrity of research because that's how progress happens," she says. "I mean, you can't scam the facts or nature, right?"

And in addition to scientific progress and tax dollars, careers are at stake here.

Mary Allen says only three of the six grad students in her uprooted lab ended up getting Ph.D.s, despite the many years all of them had put in. It took her 8 1/2 years to complete hers.

Allen recently got a job as a research assistant professor at the University of Colorado, Boulder. So now she finds herself in the same position as her former professor — on the quest for scarce grant funding.

In 2005, when the misconduct took place, "it was the worst funding NIH had seen," Allen says, "and we've only seen it go downhill. So it's even worse than it was before."

Copyright 2017 NPR. To see more, visit http://www.npr.org/.

ARI SHAPIRO, HOST:

Federal money for science has been shrinking. Under the Trump administration's budget proposal, it could be squeezed even further. And when scientists are under financial pressure, they sometimes cut corners in ways that can hurt us all. NPR's Richard Harris reports.

RICHARD HARRIS, BYLINE: Mary Allen was a graduate student in a lab at the University of Wisconsin that was facing financial pressure. The year was 2005. One post-doctoral researcher was laid off, and the professor in charge of the lab was scrambling for more funding. But Allen and her fellow grad students saw something that looked fishy in a grant document.

MARY ALLEN: At the time it was discovered, we weren't certain it was falsification. It could have been a mistake.

HARRIS: The grant report didn't accurately describe work that had been previously done in the lab.

ALLEN: And the results sounded slightly better than they really were.

HARRIS: Allen and her fellow graduate students faced a difficult decision.

ALLEN: If it's really falsification, we may not have the ability to keep going in grad school, or they may ask us to go to a different lab and start new projects. Either way, that would be a huge hit to everybody's career.

HARRIS: Long story short, the professor quit and later pleaded guilty to scientific misconduct. Allen liked her professor and thought she was a good scientist. The professor told her students she fudged the report to keep her lab going.

ALLEN: So I think one of the reasons she did it was that she was under so much stress about getting funding for the students and being afraid she wouldn't be able to fund us that she decided tweaking the data a little to make it look better would allow her to get a grant and therefore fund us.

HARRIS: Outright misconduct and fraud are apparently rare in science, says social scientist Brian Martinson at the HealthPartners Institute in Minneapolis. But the pressures on scientists appear frequently in many more subtle ways.

BRIAN MARTINSON: If people are working as hard as they can, as many hours as they can physically put in and they're working as smart as they can, then, if they need to get a further edge to try to succeed in their careers, some proportion of people might find themselves making bad decisions and cutting corners.

HARRIS: Martinson has surveyed university scientists and asked them about behaviors he calls undesirable, such as whether they skipped some rules and regulations about data handling, patient privacy, record keeping and so on.

MARTINSON: Almost half of the scientists who had responded to our survey said to us that they had engaged in at least one of those activities in the prior three years.

HARRIS: And many said they had violated multiple standards.

MARTINSON: When you get people engaging in that many undesirable practices, you go, this can certainly undermine the quality of the work that's coming out of it and therefore the ability to reproduce it.

HARRIS: And those faulty studies can slow progress as medical scientists work to understand the causes of disease and search for treatments. So it's having an invisible effect on all of us.

MARTINSON: If you've got people who are cutting corners, if you've got people who are doing things that are undermining the quality of research, you have to ask yourself why.

C K GUNSALUS: I think that what we're really talking about here is human nature.

HARRIS: C.K. Gunsalus runs the National Center for Professional and Research Ethics at the University of Illinois. She and Martinson were both on a National Academies committee on scientific integrity, which released a report this week.

GUNSALUS: If you're in an environment that has very high stakes and very low chance of success, those are two of the predictors of environments in which people are going to cheat.

HARRIS: And that's exactly the environment where scientists find themselves today. There are strong career incentives to bend the rules, say, by exaggerating accomplishments in a grant proposal.

GUNSALUS: One of the rules in life is that if you reward bad conduct, you're going to get a lot more bad conduct because even people with Ph.D.s can figure out what you're rewarding and, say, oh, if that's what it takes to get ahead, I can do that.

HARRIS: But if scientists see everyone else playing by the rules, they are more likely to do so as well. That's why Gunsalus, who swoops into troubled academic departments to fix dysfunction, looks to see whether the leaders are setting a good example. If they are, others are likely to follow.

GUNSALUS: People do fundamentally care about the rigor and integrity of research because that's how progress happens. I mean you can't really scam the facts or nature, right?

HARRIS: And it's not just tax dollars and scientific progress at stake. Careers matter, too. Mary Allen says only three of the six grad students in her uprooted lab ended up getting Ph.D.s. It took her eight and a half years. She just recently got a job as a research assistant professor at the University of Colorado, Boulder. So now she finds herself in the same position as her former professor - on the quest for scarce grant funding.

ALLEN: It was the worst funding NIH had seen, and we've only seen it go downhill. So (laughter) it's even worse than it was before.

HARRIS: And she knows full well what is at stake. Richard Harris, NPR News. Transcript provided by NPR, Copyright NPR.