If you've got the data, you have got to publish it.
July 5, 2016 9:42 PM

Early in his career, cardiologist Peter Wilmshurst was offered several years' salary to sit on bad data. He explains to Spiegel Online why he declined:
The patients that took part in our research did this on the expectation that their data would be used. In this study the participants had had cardiac catheterization. They had risked their lives for the study! So you cannot hush up the results!

The interview covers a range of topics. His take on the mentoring of young doctors is not flattering:
As a junior doctor what you see is that money is very, very important. Many doctors judge their value to society by how much they earn. Like bankers. If they haven't got enough to drive a Porsche, then they're not a very good doctor.
The Guardian has more from the libel suit mentioned in the interview, which lasted four years and threatened to bankrupt him.

Via Andrew Gelman.
posted by mark k (30 comments total) 36 users marked this as a favorite
 
FWIW I work in pharma, and can get riled up over the many unfair criticisms of the industry. But shameful behaviors like this, accurately described, are even more bile-inducing.
posted by mark k at 9:48 PM on July 5, 2016 [10 favorites]


SPIEGEL: Do you know other cases where things ended well for a whistleblower?

Wilmshurst: No, almost none.
So true... So depressing...
posted by Nerd of the North at 11:32 PM on July 5, 2016 [4 favorites]


SPIEGEL: Do you know other cases where things ended well for a whistleblower?

Wilmshurst: No, almost none.
That's by design, not some random badness.
posted by el io at 11:48 PM on July 5, 2016 [10 favorites]


One of the worst parts is how this money makes people mistrust science and scientists. We're really trying to help improve things for everyone, and the morally vacuous corporate stooges who were given science degrees somehow get to speak for all of us! It's beyond frustrating.
posted by a lungful of dragon at 11:51 PM on July 5, 2016 [8 favorites]


He's an excellent researcher as well as a person of great integrity and almost incredible strength of character.

Back in 2000, according to his Wikipedia article, he published his discovery that a patent foramen ovale could cause migraines, which I think is a marvelously deep insight.
posted by jamjam at 11:51 PM on July 5, 2016 [4 favorites]


"Honesty is the best policy."

The other thing with dishonesty is that it tends to form webs. Everyone knows dirt on everyone else. Squeal on someone, and you might find yourself being unraveled next. It takes a long history of integrity (or nerves of steel) to blow the whistle on someone.
posted by mantecol at 12:22 AM on July 6, 2016 [4 favorites]


If the person running your lab is crooked, you're toast; you either play along or get escorted off the property.

A coffeeshop buddy is a bit bitter after being escorted off the property, and decided to go get a linguistics degree because his future job prospects in the field were toast.

In completely unrelated news, the Co-Editor in Chief of Skeletal Muscle and Canada Research Chair in Molecular Genetics had a paper retracted from the journal he edits for blots “from unrelated samples”.
posted by sebastienbailard at 1:08 AM on July 6, 2016 [1 favorite]


The part I find really surprising is the idea that you might be paid a bonus for a positive result:

[Y]ou may put into the contract arrangements for a bonus in case of a positive result. (grins) This may help the researcher interpret the data in a more favorable way for you.

I genuinely don't understand why anyone would sign a contract like that unless (a) they are actively corrupt, (b) they are in dire financial straits, or (c) they are mind-bogglingly stupid. There's no way in hell I would trust myself to design a study or analyse the results if I knew I had a financial stake in a particular outcome. It's hard enough to ignore publication bias and file drawer effects as it is, but adding actual cash bonuses into the mix? This is madness.
posted by langtonsant at 1:55 AM on July 6, 2016 [8 favorites]
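
Since publication bias and the file-drawer effect come up repeatedly in this thread, here is a minimal sketch of why selective publication is so corrosive. It is a toy simulation, not anything from the interview, and every parameter in it (study count, arm size, significance threshold) is an illustrative assumption.

# A toy file-drawer simulation: run many small trials of a drug with zero
# true effect, "publish" only the significant positive results, and look at
# what the published literature then claims. All parameters are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

TRUE_EFFECT = 0.0   # the drug does nothing at all
N_STUDIES = 1000    # independent trials run across many labs
N_PER_ARM = 30      # patients per treatment/control arm
ALPHA = 0.05        # conventional significance threshold

published = []
for _ in range(N_STUDIES):
    control = rng.normal(0.0, 1.0, N_PER_ARM)
    treated = rng.normal(TRUE_EFFECT, 1.0, N_PER_ARM)
    _, p = stats.ttest_ind(treated, control)
    # The file drawer: only "positive" results ever reach a journal.
    if p < ALPHA and treated.mean() > control.mean():
        published.append(treated.mean() - control.mean())

print(f"true effect:           {TRUE_EFFECT}")
print(f"studies published:     {len(published)} / {N_STUDIES}")
print(f"mean published effect: {np.mean(published):.2f}")

With these settings roughly 2-3% of the null studies clear the bar, and their average "effect" comes out at over half a standard deviation: a respectable-looking benefit for a drug that does nothing. No individual researcher has to lie for the published record to mislead.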


I assume that the vast majority of people who accept such contracts are actively corrupt. However, researchers also have a financial incentive to produce negative results, as they might then be offered bribes to abstain from publishing those results (as the interviewee was). So even if one is presented with a contract which includes a bonus clause, one may be unsure whether it is in one's financial interest to obtain a positive or negative result. One may therefore accept the contract with a clear conscience.
posted by Abelian Grape at 2:09 AM on July 6, 2016 [4 favorites]


...and for the Nth time today, I am reminded that I am a terribly naive person. Even after reading the article that possibility would never have occurred to me. I am so glad that my university employs lawyers who actually do think about these things.
posted by langtonsant at 2:21 AM on July 6, 2016 [1 favorite]


SPIEGEL: Do you know other cases where things ended well for a whistleblower?

Wilmshurst: No, almost none.
We had an academic case of whistle-blowing that turned out extraordinarily well for the whistleblowers turn up on the blue not that long ago. Then-graduate students David Broockman and Joshua Kalla got a paper in Science for their efforts and are now starting promising academic careers that were clearly helped rather than hindered by the clear, beautiful, and effective takedown that they authored. I don't know about the context that doctors in research work in, but these guys had their entire field scrambling to laud and be associated with them once it became clear they were right.
posted by Blasdelb at 3:02 AM on July 6, 2016 [4 favorites]


Ben Goldacre of "Bad Science" is part of a wide campaign to make it compulsory to publish all trial results. The mistakes that have been made, and the lives that will have been lost or shortened, due to the failure to publish trials considered not positive enough by those funding them, are shocking.
posted by greenish at 4:11 AM on July 6, 2016 [11 favorites]


> I genuinely don't understand why anyone would sign a contract like that unless (a) they are actively corrupt, (b) they are in dire financial straits, or (c) they are mind-bogglingly stupid.

If you want to be an academic researcher, you want to be at an R1 school. If you want to be a tenured researcher at an R1 school these days, you have to prove that your work can be sufficiently funded through research grants and intellectual property income. The system has devolved to favor the corruptible.
posted by at by at 4:12 AM on July 6, 2016 [10 favorites]


A coffeeshop buddy is a bit bitter after being escorted off the property, and decided to go get a linguistics degree because his future job prospects in the field were toast.

It's all going to seem like a great story, once he starts banking that fat linguistics paycheck.
posted by thelonius at 4:16 AM on July 6, 2016 [9 favorites]


> I genuinely don't understand why anyone would sign a contract like that unless (a) they are actively corrupt, (b) they are in dire financial straits, or (c) they are mind-bogglingly stupid.

> If you want to be an academic researcher, you want to be at an R1 school. If you want to be a tenured researcher at an R1 school these days, you have to prove that your work can be sufficiently funded through research grants and intellectual property income. The system has devolved to favor the corruptible.


In other words, Because Capitalism.
posted by Thorzdad at 6:01 AM on July 6, 2016 [6 favorites]


The system has devolved to favor the corruptible.

Of course, lots of research is funded through government grants -- the NIH in the US, for example.

But this funding is constantly under threat. People want scientists doing research that isn't funded by corporate interests, because of the potential for corruption, but they don't want to pay for it.

The bonus clause is interesting to me, because when I submit an IRB application, I have to disclose any financial conflict of interest. I've never checked that box (no industry cares what I do). It's depressing that even though everyone is aware of the problem, some scientists still receive pay based on their results.

Like, it's a huge, ongoing conversation that the way academic research is rewarded already fosters corruption - people don't like it, but it's very hard to change without completely overhauling the system. To make pay explicitly dependent on results is mind-boggling to me.

that fat linguistics paycheck

*cries*
posted by Kutsuwamushi at 6:02 AM on July 6, 2016 [5 favorites]


Peter Wilmshurst was offered several years' salary to sit on ~~bad data~~ good data that showed negative results.
posted by tempestuoso at 6:14 AM on July 6, 2016 [12 favorites]


There's no way in hell I would trust myself to design a study or analyse the results if I knew I had a financial stake in a particular outcome. It's hard enough to ignore publication bias and file drawer effects as it is, but adding actual cash bonuses into the mix? This is madness.

This is merely tipping. Research is already set up with extremely powerful incentives favoring the production of positive results. The real corruption and madness happen well before the bonus. Maybe even before the researchers get their first tenure-track job.
posted by srboisvert at 6:18 AM on July 6, 2016 [2 favorites]


Every time a story like this gets published, my friends in research pass it around, advertising their availability for high-priced corruption. Two years' salary? Hell yes!

I'm pretty sure they're joking.
posted by Tell Me No Lies at 7:08 AM on July 6, 2016 [1 favorite]


You'd think academia would have a similar Code of Ethics.

I think that it does; see my above comment about the IRB. That is not to say that it's being enforced.

But yes, as srboisvert says, the conflict of interest starts before we even begin talking about money. Your success as a researcher depends on publishing positive results. And it's not just the ambitious or greedy who are affected - it's literally up or out. The cost of honesty might be losing your career.

We talk about these things; everyone I know agrees there's a problem. But fixing it requires making drastic changes to how academic research is evaluated - and no one knows how to do that. So we rely on scientists to be honest despite these pressures.

Honestly I'm surprised there's not more corruption.
posted by Kutsuwamushi at 7:20 AM on July 6, 2016 [2 favorites]


I genuinely don't understand why anyone would sign a contract like that unless (a) they are actively corrupt, (b) they are in dire financial straits, or (c) they are mind-bogglingly stupid.

I don't think most people are actively corrupt, at least in the plain sense of that phrase. It's more insidious. If anything, I think most people are confident enough in their own integrity, and in the objective nature of science, to think that the perverse incentive is not going to influence them. This means entering into a contract where the other party gives you more if the collaboration succeeds (not really that unusual) can be justified and then becomes normal. And that makes it easier to take more steps down that road, of course.

There's no way in hell I would trust myself to design a study or analyse the results if I knew I had a financial stake in a particular outcome.

Assuming you design studies currently, you already have a financial stake in the outcome. You don't build a career out of a series of studies that spend a lot of money and fail to detect something. This is why there's a reproducibility problem in academic research as well.
posted by mark k at 7:21 AM on July 6, 2016 [2 favorites]
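
mark k's link from career incentives to the reproducibility problem can be made concrete with the same kind of toy simulation (again, every parameter is an illustrative assumption, not anything from the article): give the drug a weak real effect, publish only the lucky significant studies, and then try to replicate each one at the same sample size.

# A toy replication exercise: with a weak true effect and small samples, the
# studies that get published are the lucky overestimates, so exact
# replications succeed far less often than the inflated published effects
# would suggest. All parameters are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
TRUE_EFFECT = 0.2   # weak but real benefit
N_PER_ARM = 30      # patients per treatment/control arm
ALPHA = 0.05        # conventional significance threshold

def run_trial():
    """Run one two-arm trial; return (significant-and-positive?, effect)."""
    control = rng.normal(0.0, 1.0, N_PER_ARM)
    treated = rng.normal(TRUE_EFFECT, 1.0, N_PER_ARM)
    _, p = stats.ttest_ind(treated, control)
    effect = treated.mean() - control.mean()
    return (p < ALPHA and effect > 0), effect

# "Publish" only the significant positive originals...
originals = [eff for sig, eff in (run_trial() for _ in range(5000)) if sig]
# ...then attempt one exact replication per published study.
successes = sum(run_trial()[0] for _ in originals)

print(f"true effect:              {TRUE_EFFECT}")
print(f"mean published effect:    {np.mean(originals):.2f}")
print(f"replication success rate: {successes / len(originals):.0%}")

The published effects come out roughly three times the true one, and only about one replication in eight or nine "works", even though every single study was run honestly; the selection alone produces the reproducibility gap.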


I'm the person at my university who gets called when that little "do you have a conflict of interest?" box gets checked. Which means that this conversation is a) fascinating to me and b) probably not something I can participate in much, to my grave sorrow, 'cause I try not to drag my work life too far onto Metafilter.

I can say that while I have never seen a contract tying specific financial reward to positive research outcome (holy shit, would my university's legal counsel never sign off on that), mark k is echoing a general sense I've certainly heard from many a researcher. The "sure, I realize that this would be problematic if someone else did it. But I'm a good person and I know that I am not going to be swayed by [whatever situation] so it's no problem!"

I'm getting very good at helping to explain to people how "but I'm a good person!" isn't a defense if you or your science are being discredited because of an undisclosed and unmanaged conflict, but it's still startling to me how often I need to do that.

(Public service announcement to researchers: Stuff you know about yourself, in your head, is not in and of itself an adequate conflict of interest management strategy. If you think for any reason you should maybe check the yes-I-have-a-COI box but you're not sure? Check the box. Or find your university's version of me and give her a call. She exists to help protect you and your science, and she's probably screwing around on Metafilter right now and needs to get back to work anyway.)
posted by Stacey at 8:05 AM on July 6, 2016 [14 favorites]


One of the other problems is the sunk cost of training. If you're an MD, you can afford to piss off a drug company -- worst case you'll go back to treating patients. If you're an academic PhD who has already sunk 13+ years into your schooling, then get blacklisted before you get an industry job or tenure, you'll have a difficult time transitioning to another reasonable profession.
posted by benzenedream at 8:15 AM on July 6, 2016


This puts me in mind of the inversion of morality that happens when things fall apart. The times when scruples or ethics or morality become handicaps or signs of weakness, things to be disparaged, while atavistic cynicism is elevated and gangsters become heroes. Like when Yugoslavia fell apart.
posted by Pembquist at 9:00 AM on July 6, 2016 [2 favorites]


If you're an academic PhD who has already sunk 13+ years into your schooling, then get blacklisted before you get an industry job or tenure, you'll have a difficult time transitioning to another reasonable profession.

This also applies to getting caught conflicting your interests too egregiously, which is at least partially why there appears to be less corruption in academia than one would expect.

There is also a serious element of Mutually Assured Destruction in academia self-policing.
posted by srboisvert at 9:09 AM on July 6, 2016 [1 favorite]


Reading these sorts of articles gives me some very painful flashbacks to when I was asked to basically fudge data by my boss. Well, or "to find positive results" where there weren't any. I could have blown the whistle, but they could have just fired me for "incompetence" rather than actually deal with the dozens of nonsensical datasets and forged reports with no data to speak of that I uncovered. I held my moral ground, was reprimanded, and fell into a dark depression after this.
posted by Young Kullervo at 12:10 PM on July 6, 2016 [1 favorite]


The slippery slope he talks about is why I give the side eye when people complain about safety training, or IRB, or IACUC, or EHS requirements. Like, why do you think you get to not have to fill out this paperwork just because you're busy and doing good work and you wouldn't violate any regulations anyway? It makes me think you are a bad actor.

Regulations and codes of conduct are important! For God's sake, science can kill you, or the people who receive the results of your fake trial! Or do you think it's just fine to put your acetone down the drain? Grrrrrrrrrr
posted by Made of Star Stuff at 8:49 PM on July 6, 2016 [1 favorite]


BentFranklin: "You'd think academia would have a similar Code of Ethics."

In Australia at least it does. The Australian Code for the Responsible Conduct of Research is very frequently referenced in various research policies and processes adopted by universities. Pretty much every aspect of the situation described in the OP is covered by it: Section 7 obligates researchers to "withdraw" from situations that present a conflict of interest (which rules out financial bonuses for specific results) and requires institutions to have CoI management policies; Section 4.4 specifically obligates researchers to report "all" findings (which rules out the selective disclosure that the company wanted); etc etc. Even though it's not strictly binding, the Code is pretty much required reading if you're an academic in Australia, because every university is "guided" by it, and much of the policy documentation is best understood in relation to the Code. The universities I've worked at all have extensive institutional support for this - they have people like Stacey to discuss CoI matters, as well as IP lawyers to handle potential disputes, statistical consultants to discuss the merits of an analysis plan, etc. Is that not the case elsewhere?
posted by langtonsant at 11:02 PM on July 6, 2016 [1 favorite]


They exist, but pressure to publish positive results forces people to gloss over or ignore them entirely.
posted by Young Kullervo at 7:33 AM on July 7, 2016 [1 favorite]


About 20 years ago I participated in an asthma research trial, and my reaction to the medication was the complete opposite of what was expected. I remember the researchers telling me that my case would have to be reported as an exception. Being a shy, insecure 18-year-old, I was very apologetic, thinking my weird lungs had wrecked their project, but they were all no, no, it's really important that we got this result, and even more important that it's reported.

It wasn't until years later that I realised they'd taught me that being wrong is fine, but you need to be honest about it.
posted by kitten magic at 9:16 PM on July 7, 2016




This thread has been archived and is closed to new comments