Follow-up to our discussion of Simonsohn’s fraud paper

Here are a few links following up on things that came up in our discussion today.

1. For those of you who are interested in learning more about the Diederik Stapel fraud case, here is a collection of links to commentaries about it.

2. Regarding Dirk Smeesters: I mentioned that when the case was first reported at Retraction Watch, one of his collaborators posted a lengthy comment about it, including his own involvement and some pretty raw emotions. (See also here at Wired.)

4. Some interesting parallels between Simonsohn’s work and that of an earlier fraud detective, NIH biologist Walter Stewart.

*****

The paper we discussed today dealt with detecting outright fraud, that is, fabrication of data out of nothing. But some other issues that came up relate to broader aspects of research integrity and quality…

5. I mentioned today that there has been some work on the problem of improbable strings of successful replications in multi-study papers, which can be interpreted as evidence of publication bias. Some important statistical work on this subject was done by Ioannidis and Trikalinos. Lately Greg Francis has been applying their method to papers in psychology; see the “Statistics” section of his publications page. (Note, though, that Francis’s use of the I&T method has generated some controversy: most of his papers have drawn rebuttals, and Simonsohn wrote a critique of how Francis applies the method.) But I think an absolutely terrific treatment of this issue is an in-press paper by Uli Schimmack titled The Ironic Effect of Significant Results on the Credibility of Multiple-Study Articles. (That paper might be a good one to read for a future meeting.)
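To make the logic behind these excess-significance arguments concrete: if each study in a multi-study paper is run with modest statistical power, then (assuming the studies are independent) the probability that every single one comes out significant is just the product of the powers. A very small product means a perfect record of successes is itself improbable. Here is a minimal sketch; the function name and the power values are my own made-up illustration, not anything from the papers linked above:

```python
from functools import reduce

def prob_all_significant(powers):
    """Probability that every study in a set succeeds, assuming the
    studies are independent and each is run with the given power."""
    return reduce(lambda a, b: a * b, powers, 1.0)

# Hypothetical example: a 10-study paper where each study has ~60% power
# (a fairly generous estimate for much of the psychology literature).
p = prob_all_significant([0.6] * 10)
# 0.6 ** 10 is about 0.006, so a flawless 10-for-10 record would itself
# be a surprising outcome, hinting at selective reporting.
```

The real I&T test and Schimmack’s index are more sophisticated (they estimate power from the reported effect sizes and compare observed to expected counts of significant results), but this is the core intuition.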

6. More broadly on the issue of cutting corners and questionable research practices, there are many, many resources out there. Here are a few recent papers dealing with these issues:

– An article on “False-Positive Psychology,” coauthored by Simonsohn, which argues that undisclosed flexibility in data collection and analysis (“researcher degrees of freedom”) makes it possible to present almost any result as significant. (See also this discussion by the Psych Science editorial board on whether to implement the paper’s recommendations, and some similar recommendations by Russ Poldrack about flexibility in fMRI analysis.)

– A survey of the prevalence of various questionable research practices.

– An analysis of Daryl Bem’s ESP paper that speculates about some of the ways he could have nudged his data (and suggests that many of them may be commonly used).
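One of the researcher degrees of freedom discussed in that literature is optional stopping: peeking at the data every few subjects and stopping as soon as the test comes out significant. A quick simulation shows how much this inflates the false-positive rate even when the null is true. This is my own toy sketch (a one-sample z-test with known variance, batches of 10 up to n = 50), not an analysis from any of the papers above:

```python
import math
import random

def optional_stopping_sim(n_sims=2000, batch=10, max_n=50, seed=1):
    """Simulate a true-null experiment, comparing (a) one test at the
    final sample size with (b) testing after every batch and counting
    a 'hit' if ANY look is significant. Returns (final_rate, peek_rate)."""
    crit = 1.959964  # two-sided z critical value for alpha = .05
    rng = random.Random(seed)
    hits_final, hits_peeking = 0, 0
    for _ in range(n_sims):
        data = []
        significant_somewhere = False
        while len(data) < max_n:
            data.extend(rng.gauss(0.0, 1.0) for _ in range(batch))
            # z-test of the mean against 0, with known sd = 1
            z = (sum(data) / len(data)) * math.sqrt(len(data))
            if abs(z) > crit:
                significant_somewhere = True
        if significant_somewhere:
            hits_peeking += 1
        if abs(z) > crit:  # z from the final look only
            hits_final += 1
    return hits_final / n_sims, hits_peeking / n_sims
```

With five looks at the data, the any-look rejection rate comes out well above the nominal 5% (in the ballpark of 13–14%), while testing only once at the end stays near 5%. Combine a few such degrees of freedom and, as the False-Positive Psychology paper argues, almost anything can be made significant.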
