The problem with data
| Content Provider | Semantic Scholar |
|---|---|
| Author | Aarssen, Lonnie W. |
| Copyright Year | 2015 |
| Abstract | The progress of science requires inspiration. Some researchers find this only from data. "Show me the evidence", they say. Many peer-reviewed publications in science, however, have no data. They involve a different kind of inspiration: proposals for original ideas or new hypothesis development. These are found within the 'Forum', 'Perspectives', 'Opinion' and 'Commentary' sections of many journals, and in some journals, like IEE, devoted entirely to new ideas and commentary. I have always been particularly drawn to the honesty and beauty in this creative brand of enquiry. And so I am puzzled to hear it often dismissed out of hand with pejorative labeling, like 'hand-waving' and 'just-so stories'. Many—especially among the elites and self-appointed guardians of established theory—would have us believe that only 'evidence-based' practice and products can be taken seriously as legitimate sources of inspiration and discovery. This is plainly arrogant and wrongheaded. The scientific method means doing whatever is necessary to get good answers to questions worth asking. And so data collection that is not guided by interesting, novel, and important ideas is usually boring at best. At worst, it is a waste of research grant funds. But published data are plagued with an even more serious problem: we never know how much to trust them. A few minutes of Google searching under the terms "research bias", "scientific misconduct", "publication bias", and "retractions" shows that the follies of faith in published data have come sharply and painfully into the public spotlight in recent years. The latest bad news is particularly troubling: most published studies are not reproducible (Baker 2015, Bartlett 2015, Begley et al. 2015, Jump 2015). The statistical implication from this is unavoidable: it means that the results of at least half of all empirical research that has ever been published, probably in all fields of study, are inconclusive at best. They may be reliable and useful, but maybe not. Mounting evidence in fact leans toward the latter (Ioannidis 2005, Lehrer 2010, Hayden 2013). Moreover, these inconclusive reports, I suspect, are likely to involve mostly those that had been regarded as especially promising contributions—lauded as particularly novel and ground-breaking. In contrast, the smaller group that passed the reproducibility test is likely to involve mostly esoteric research that few people care about, or so-called 'safe research': studies that report merely confirmatory results, designed to generate data that were already categorically expected, i.e. studies that aimed to provide just another example of support for well-established theory—or if not the latter, support for something that was already an obvious bet or easily believable anyway, even without data collection (or theory). A study that anticipates only positive results in advance is pointless. There is no reason for doing the science in the first place; it just confirms what one already knows must be true. This probably accounts for why the majority of published research remains uncited in the literature—or virtually so, attracting only a small handful of citations, many (or most) of which are self-citations (Bauerlein et al. 2010). Are there any remedies for this reproducibility problem? Undoubtedly some, and researchers are scrambling, ramping up efforts to identify them [see Nature Special (2015) on Challenges in Irreproducible Research, http://www.nature.com/news/reproducibility-1.17552]. Addressing them effectively (if it is possible at all) will require nothing short of a complete restructuring of the culture of science, with new and revised manuals of 'best practice' (e.g. see Nosek et al. |
| File Format | PDF / HTM / HTML |
| DOI | 10.4033/iee.2015.8.15.e |
| Volume Number | 8 |
| Alternate Webpage(s) | https://ojs.library.queensu.ca/index.php/IEE/article/download/6076/5748 |
| Alternate Webpage(s) | https://doi.org/10.4033/iee.2015.8.15.e |
| Language | English |
| Access Restriction | Open |
| Content Type | Text |
| Resource Type | Article |