Sunday, May 6, 2012

Reproducibility

Here's something interesting, from the Chronicle of Higher Education:

If you’re a psychologist, the news has to make you a little nervous—particularly if you’re a psychologist who published an article in 2008 in any of these three journals: Psychological Science, the Journal of Personality and Social Psychology, or the Journal of Experimental Psychology: Learning, Memory, and Cognition.

Because, if you did, someone is going to check your work. A group of researchers have already begun what they’ve dubbed the Reproducibility Project, which aims to replicate every study from those three journals for that one year. The project is part of Open Science Framework, a group interested in scientific values, and its stated mission is to “estimate the reproducibility of a sample of studies from the scientific literature.” This is a more polite way of saying “We want to see how much of what gets published turns out to be bunk.”

I don't think readers will be surprised if I suggest that it's not just psychology that is worth checking in this way.

(H/T Alan, by email)


Reader Comments (16)

another useful quote
'Recently, a scientist named C. Glenn Begley attempted to replicate 53 cancer studies he deemed landmark publications. He could only replicate six. Six!'

Given that result for cancer, what hope for climatology?

May 6, 2012 at 8:35 AM | Unregistered Commenterconfused

Love to see the results for the climate scientologists' work.

May 6, 2012 at 9:24 AM | Registered Commentermangochutney

In theory they should already have been checked so it's not unfair to set an exceptionally high pass bar.

May 6, 2012 at 9:50 AM | Unregistered CommenterDavid

Why Most Published Research Findings Are False
John P. A. Ioannidis

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1182327/?tool=pubmed

May 6, 2012 at 10:02 AM | Unregistered CommenterWill Nitschke

Unlike "peer review", replication is the absolute core of science.

But the interesting question is what kind of "replication" is needed? It is one thing to conduct the same piece of research using the same (described) methods and get the same answers - it is a different thing to ask whether the methods themselves are valid and the results meaningful.

To get close to our fine Bishop's heart here, consider a replication of Mann's original 1998 hockey stick paper. Let's leave aside the small problem of the methods being inadequately described and the data sets used either ill-described or unavailable. If a researcher did follow the techniques described, using the same data, then the results would likely be the same. "Replication successful", right?

Well, er, no... The methodology was erroneous. The data used were both cherry-picked and of dubious meaning. Replicated, and still a pile of nonsense.

Of course the curious thing is that Mann himself and a series of other researchers have themselves done a series of "replications" since 1998 - loudly trumpeted as supporting Mann's original work. The fact that these replications used some combination of the poor methodology and the poor data of the original, makes them good replications of precisely the same kind of dismal science. Some of these replications are even comedic in their twisting of data to meet preconceived ideas - such as the famous treatment of the Tiljander sediments.

The Chronicle of Higher Education article leaves open the question of how replication will be done - but it is a notorious problem in the area of Medical Science and in the case of drug testing it has led to all kinds of complex protocols, which still are not good enough in weeding out all the poor research, although they are a good start...

How, for example, would anyone go about replicating a paper that is based on the results of running some suitably complex global climate model? If the code of the model is available, re-run the model again? Or would it require building of a new model?

May 6, 2012 at 10:08 AM | Unregistered CommenterMike Edwards

I have a slightly different perspective on the need to reproduce other people's experiments. There is a lot of stuff published these days, but how much of it is important in that it becomes the basis or foundation of future work? If the foundation is weak, what gets built upon it won't be sound. Scientists publish for a variety of reasons, including advancing their fields, career advancement, and obtaining funding for follow-on work. In other words, these scientists will likely be living in the house they build. Reproducibility is important, but good science is self-correcting -- eventually. Irreproducible results won't support the follow-on work. The best way to get away with shoddy work is to keep it obscure. But you won't build a career or much funding on that.

May 6, 2012 at 11:07 AM | Unregistered CommenterSean

I once went to examine a PhD thesis Somewhere in England. I found that the ideas seemed to have been plagiarised from a thesis by one of my own students. Our work stood up rather well to the replication, except that the copyist hadn't done it half so well. It only goes to show.

May 6, 2012 at 11:39 AM | Unregistered Commenterdearieme

Sean - I know you are right in the world of proper science.

I am not sure you are right about Climate Science, where all you needed was to buy in to the right story. The world has been flush with funding for the AGW meme, and much shoddy work has delivered a well-paid career and continued funding on the back of it.

In fact some have admitted doubts about AGW, but know that applying for some funding to research even a mildly sceptical point of view will lead to no food on the table. That situation is very corrupting.

May 6, 2012 at 12:43 PM | Registered Commenterretireddave

Climate science will be corrected. It's just that there are so many natural climate cycles, some lasting decades, others longer. You can look real smart for a decade or two, then not so much. The cooling scare in the '70s came on at the END of a cooling cycle, just as the warming scare came along at the turn of the millennium, at the end of a warming period. I guess people like to draw straight lines and claim they can predict the future. And that's the rub: making predictions, which is what science is all about. The public has already tired of the science, and the bills are starting to come due for climate mitigation strategies. Mother Nature is also hard at work doing her experimental validations, and she will show who the charlatans are.

May 6, 2012 at 1:45 PM | Unregistered CommenterSean

In regard to the post by author “confused”, s/he refers to an article titled, “Junk science? Most preclinical cancer studies don't replicate.” Link: http://tinyurl.com/82hxkym

May 6, 2012 at 2:16 PM | Unregistered Commenterdonobo

"how much of it is important in that it becomes the basis or foundation of future work? If the foundation is weak, what gets built upon it won't be sound.... scientists will likely be living in the house they build. Reproducibility is important but good science is self correcting -- eventually. Irreproducible results won't support the follow-on work."

" C. Glenn Begley attempted to replicate 53 cancer studies he deemed landmark publications. He could only replicate six."

As a cancer patient I don't necessarily want to be living in a house with a shoddy foundation, designed and built by scientists who are not themselves patients.

As a taxpayer I don't necessarily want to be paying into a government catastrophe-avoidance program, and curtailing my lifestyle choices, where the catastrophe research was designed and built to the standards of 47 of those 53 cancer research studies -- especially when the leadership of the catastrophe affirmers seem to avoid taxes and have an even more lavish lifestyle than mine.

All that said, I do tend to agree that sooner or later shoddy work is revealed. Often, catastrophically. I just wish the climate work would reveal itself soon, and in non-catastrophic fashion.

May 6, 2012 at 7:41 PM | Unregistered Commenterpouncer

Reproducibility also implies that experimentation is possible to begin with, which is seldom the case in climate science. Papers in climate science fields often focus on an analysis of data sets combined with arguments that are circumstantial in nature. That doesn't necessarily mean you are doing bad science. But it does create confusion in the minds of the public. To them one peer reviewed study is approximately equivalent to any other. It does not matter that one might be based on replicated experimentation and another is based on an interesting hypothesis that one has no practical way to test.

May 7, 2012 at 3:38 AM | Unregistered CommenterWill Nitschke

I agree with Mike Edwards. Replication is better than nothing, but psychology research is riddled with bad methodology. Tiny or unrepresentative samples, biased survey questions, dodgy use of statistics - you name it, it abounds in that field.

If they don't cover validation as well as replication, we may not see much progress via this work.

May 7, 2012 at 4:40 AM | Unregistered Commenterjohanna

There could be thousands of man-hours of work available to scientists checking on the replication of the slabs of dubious 'science of climate' published over the last quarter-century... no, not thousands... millions. Replication is the core of science.

May 7, 2012 at 8:54 AM | Unregistered Commenterntesdorf

If "replication is the core of science" how does one "replicate" ideas concerning the big bang and other cosmological theories, models of the internals of stars, evolutionary theory, palaeontology, and so on? Even much of medicine and psychology and psychiatry cannot be experimental due to ethical and moral concerns.

Therefore that might be a somewhat simplistic way of looking at science. However, if you qualified your claim to "replication is the core of experimental science" then I would agree with you. But not all science can be experimental in form.

May 7, 2012 at 12:06 PM | Unregistered CommenterWill Nitschke

Will,

In science, theories are not replicated - they are tested or proved by means of experiments or observations. In one sense, theories are two a penny (I don't intend to slight the work of theorists in any way - some of these people are the greatest geniuses the world has ever seen); a theory has to earn its salt by making predictions which are then tested by experiment or observation. And it is usually necessary for those experiments to be replicated before a theory gains credibility - one observation or one experiment is never enough. Without predictions that can be tested, theories sit in a kind of never-land - witness string theory at present: an elegant theory unable to make clear predictions that can be put to the test.

It is interesting to observe the reaction to an observation or experimental result that goes against a prevailing theory - in most cases it is assumed that the experiment is wrong and the call goes out for the result to be reproduced or replicated. Only when repeated observations pile up against the theory do new hypotheses get created - witness the "dark matter" hypothesis to explain the observed anomalous motions of stars which challenge the standard theory of gravitation.

You ask how replication is possible in a science like climate science which is not experimental. Well, for such sciences, there are observations - measurements of the system under study. What then needs to be done is that all such basic observational data should be made publicly available, and any interpretations based on those observations should have their methods clearly described. Both the data and the methods can then be subjected to alternative analysis - these represent attempts to reproduce the initial findings, and may show up problems.

May 9, 2012 at 10:35 PM | Unregistered CommenterMike Edwards
