Summary
A basic requirement in science is that published findings are sufficiently trustworthy. The current state-of-the-art strategy for assessing the trustworthiness of findings is to meta-analyze all known findings on a topic. Such a strategy, however, is seriously flawed because it does not account for the various forms of researcher bias that contaminate each study included in a meta-analysis. Consequently, it is currently not possible for researchers (or the public) to determine how much one should trust a published finding on the basis of traditional meta-analyses. To address this problem, we propose to develop three new meta-scientific instruments (or meters) to quantify the three most fundamental trustworthiness aspects of a study: method transparency, analytic robustness, and effect replicability. For each instrument, a corresponding metric (i.e., unit of measurement) will be developed to quantify the degree to which a published study exhibits that aspect. This will be achieved by applying the metrics within CurateScience.org, a web platform that tracks and quantifies the trustworthiness of studies in a crowdsourced, incremental fashion over time. These transparency instruments will revolutionize how meta-analyses are conducted, substantially improving the quality and validity of meta-analytic conclusions across scientific fields. This will accelerate cumulative knowledge development and the exploitation of scientific findings to develop solutions to the Horizon 2020 societal challenges facing European citizens. These new instruments will not only advance the field of meta-science but will also have the salutary effect of accelerating the uptake of open science practices among social science researchers. The proposed work is directly relevant to European policy objectives on Open Science and Research Integrity, which aim to increase the openness of, access to, and re-use of publicly funded research and data.
More information & hyperlinks
Web resources: | https://cordis.europa.eu/project/id/793669 |
Start date: | 01-09-2018 |
End date: | 31-08-2020 |
Total budget: | 160 800,00 EUR
Public funding: | 160 800,00 EUR
Cordis data
Status: | CLOSED
Call topic: | MSCA-IF-2017
Update Date: | 28-04-2024