



This all seems pretty predictable. See, e.g., Meehl:


“Perhaps the easiest way to convince yourself is by scanning the literature of soft psychology over the last 30 years and noticing what happens to theories. Most of them suffer the fate that General MacArthur ascribed to old generals—They never die, they just slowly fade away.


But in fields like personology and social psychology, this seems not to happen. There is a period of enthusiasm about a new theory, a period of attempted application to several fact domains, a period of disillusionment as the negative data come in, a growing bafflement about inconsistent and unreplicable empirical results, multiple resort to ad hoc excuses, and then finally people just sort of lose interest in the thing and pursue other endeavors.”

As I wrote almost two years ago (https://groups.google.com/forum/#!topic/openscienceframework/JtabKEvqE44), RRRs may reset the system, but then the same cycle starts again. They may even (unintentionally, I presume) exacerbate this process.

I tried to think about how to stop the devastating process Meehl described, and proposed a short research and publication format in a PDF file in the Open Science Framework discussion-group post linked above.

I also used power-posing as an example of the kind of theory Meehl may have been writing about, and wrote down and linked to my best solution here:


Tony Freitas

This is very helpful background information (e.g., it appears to indicate that *none* of the reviewers who influenced the decision had read the pre-registration). We're discussing this paper today in a graduate seminar, so this review will be very helpful to share with the class.


Mr. Freitas wrote:

"This is very helpful background information (e.g., it appears to indicate that *none* of the reviewers who influenced the decision had read the pre-registration)."

Perhaps Mr. Freitas (or somebody else) could suggest that reviewers be required to tick a box indicating that they have read and verified the pre-registration information.

This seems pretty important and useful, especially compared to some of the other boxes reviewers might have to tick.

I find it hard to imagine that reviewers don't (or didn't) have to tick a "checked and verified the pre-registration" box, but the comment by Mr. Freitas above makes me wonder...

Side note: shouldn't such a box be part of every journal's review process, now that pre-registration is becoming a "thing"?
