Clinical trials directly influence clinical practice and policy decisions and are crucial in determining the effectiveness of treatments. However, their results can harm real patients if the underlying data are fabricated or contain errors.
About 2% of all researchers admit to having manipulated their data, yet systematic checks for such problems remain rare. A new Dutch Fulbright project proposal, published via the innovative Research Ideas and Outcomes (RIO) Journal, suggests new methods to tackle these issues and to apply them to results reported in the ClinicalTrials.gov database.
‘About 2% of all researchers admit to having manipulated their data. A new Dutch Fulbright project proposal suggests new methods to tackle this issue.’
Decisions based on bad data, whether clinical decisions made by medical doctors or policy decisions made by governmental institutions, can pose direct risks to treated patients and to the population in general. Beta-blockers, for instance, used to be prescribed to cardiac patients in order to decrease perioperative mortality. A subsequent meta-analysis, however, detected erroneous data in the underlying clinical trials, and it turned out that beta-blockers actually increased the risk of mortality.
The new Dutch Fulbright proposal, led by Chris HJ Hartgerink, Tilburg University, the Netherlands, and Dr. Stephen L George, Duke University, United States, puts forward statistical methods for detecting erroneous data, providing an additional quality-control filter for clinical trial results reported in the ClinicalTrials.gov database.
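The proposal's specific methods are not detailed in this announcement, but one family of checks often used in this research area looks at digit distributions: genuinely measured values tend to have roughly uniform terminal digits, so a strongly skewed last-digit distribution can flag data for closer inspection. The sketch below is purely illustrative, not the authors' actual procedure, and the function name and threshold are assumptions for the example.

```python
# Hypothetical sketch of terminal-digit analysis, one screen sometimes
# used to flag possibly problematic summary data. This is NOT the
# proposal's actual method, only an illustration of the general idea.
from collections import Counter

def terminal_digit_chi2(values):
    """Chi-square statistic of last-digit counts against a uniform
    distribution over the digits 0-9."""
    digits = [str(v).replace(".", "")[-1] for v in values]
    counts = Counter(digits)
    expected = len(digits) / 10  # uniform: each digit equally likely
    return sum((counts.get(str(d), 0) - expected) ** 2 / expected
               for d in range(10))

# Example: every value ends in 0 or 5, a pattern more consistent with
# rounding or invention than with raw measurement.
suspicious = [12.0, 3.5, 7.0, 9.5, 11.0, 2.5, 8.0, 4.5, 6.0, 10.5]
stat = terminal_digit_chi2(suspicious)
# Compare against the chi-square critical value for 9 degrees of
# freedom at alpha = 0.05 (approximately 16.92).
print(stat > 16.92)  # → True
```

A flagged result is only a prompt for human review: skewed terminal digits can also arise from legitimate rounding conventions, so such screens are a filter, not proof of misconduct.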
Unfortunately, misleading data are not only the product of bad practice; they can also result from human error or inadequate data handling. It is not clear how often such mistakes or manipulations occur, let alone their prevalence in any particular field of science. What is beyond doubt, however, is that additional methods and procedures for detecting bad data are needed in order to minimize the risk of bad decisions being taken when health and wellbeing are at stake.
"Detecting problematic data is a niche field with few experts around the world, despite its importance," further explains Chris HJ Hartgerink. "Systematic application remains absent, and this project hopes to push the field in that direction. New estimates of the prevalence of problematic data are welcome, because we currently rely on self-report measures, which suffer from human bias."
Recently submitted to Fulbright, the six-month project proposal has now been made openly accessible by its authors through RIO Journal, an innovative platform publishing all outputs of the research cycle, including project proposals, data, methods, workflows, software, project reports and research articles.
"A grant proposal is a research output like any other but is only rewarded when it results in funding," says Chris HJ Hartgerink. "We know that many good proposals are rejected and consequently not rewarded. Publishing the grant proposal shows the output, makes it rewardable and can help improve it by post-publication peer review."