Our Trust Markers
What kinds of papers do you check?
We analyze scientific research papers.
What key trust markers are you checking for in manuscripts?
We check for trust markers in three areas:

- Research signals: markers that differentiate a research paper from, for example, a commentary. For a high-quality research paper, we expect a clear study objective, section headers, and more.
- Transparency: checks of the authorship, affiliations, ethical statements, and funders present in a manuscript.
- Reproducibility: we know it's not always possible to share data and code, but they are among the most critical indicators of reproducibility.
| Trust in… | Trust Marker | Definition |
| --- | --- | --- |
| Reproducibility | Analysis Software | The specific software the author used to conduct their data analyses. |
| Reproducibility | Code Availability Statement | A statement that explains how, or whether, a study's code can be accessed (in its own section offset from the main body of text, or as part of the Disclosures or DAS). |
| Reproducibility | Data Availability Statement (DAS) | A statement (offset from the main text) detailing access to a study's data. Note: data availability information that appears only in a "Supplementary/Supporting Information/Materials" section is not a DAS, though it may relate to data location. |
| Reproducibility | Repositories | A location that stores, organizes, provides access to, and preserves data. Common repositories include Dryad Digital Repository, Figshare, Harvard Dataverse, and Zenodo. |
| Transparency | Author Contribution Statement | A statement detailing each author's role in the development and publication of the manuscript. |
| Transparency | Competing Interests Statement | A statement acknowledging any interests of the authors that may not be fully apparent and that could affect their judgment about the study topic, including funding, past or current employment, or stocks owned by an author. |
| Transparency | Ethical Approval Statement | A statement of where ethical approval for the study was obtained, particularly for studies with human or animal subjects. |
| Transparency | Funding Statement | A statement within the manuscript indicating whether or not the authors received funding for their research. |
Is this an integrity indicator?
In short, yes. This is a quality check for the trust markers that are typically used when determining a research publication’s integrity.
Are your trust marker checks automated?
Yes! Our trust marker checks are automated. We use human reviewers to train the algorithms and reduce potential biases, but the checks themselves are automated, so you can get results within seconds.
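To make the idea of an automated trust marker check concrete, here is a deliberately simplified sketch. This is a hypothetical illustration, not Ripeta's actual algorithm: it only looks for section keywords, whereas a production system would use trained models. The marker names and patterns below are assumptions for the example.

```python
import re

# Hypothetical, simplified illustration -- NOT Ripeta's actual pipeline.
# A naive keyword check for two reproducibility trust markers in raw
# manuscript text: a Data Availability Statement and a Code Availability
# Statement.
MARKER_PATTERNS = {
    "data_availability_statement": re.compile(
        r"\bdata\s+availability(\s+statement)?\b", re.IGNORECASE
    ),
    "code_availability_statement": re.compile(
        r"\bcode\s+availability(\s+statement)?\b", re.IGNORECASE
    ),
}


def check_trust_markers(text: str) -> dict:
    """Return a True/False result per trust marker based on keyword presence."""
    return {name: bool(pattern.search(text)) for name, pattern in MARKER_PATTERNS.items()}
```

A real system must handle markers phrased without a standard heading, which is why keyword matching alone is too brittle and model training on human-labeled examples is needed.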
How were the trust markers determined?
Ripeta was created by researchers for researchers. In addition to drawing on our own expertise, we reviewed the wide range of guidelines published by societies, publishers, researchers, and others, and conducted a literature search to identify, at the highest level, the first-pass checks needed to assess quality.
For more information on the data collection process, see our paper “Repeat: a framework to assess empirical reproducibility in biomedical research” and our outline of our process.
What if I only want to know about the result of each of the trust markers?
We can give you the result of each trust marker. Even though we summarize the results for a macro-view of the main subject areas, you can see the individual trust marker scores as well.
Do you search for protocols?
Not yet. We are actively working on protocol detection and would appreciate your input.
Do you do pilots?
Absolutely! Pilots are a great way for you to see the benefits and what we can offer.
How do your products embed in a workflow?
In a funding agency workflow, Ripeta reviews how data management, data sharing, and open science policies are implemented in practice. Looking across a corpus of full-text articles published with funding from a particular agency (or sub-office), Ripeta extracts our key responsible reporting indicators and provides critical information about how funded researchers implement open policies.
Our tools can be embedded in a couple of places in a publisher’s workflow, but we customize this to best fit your workflow and priorities for publishing the highest quality research.
- Before article submission: We can work with you to create a button and journal submission pipeline within your editorial workflow platform so that the author can automatically check their manuscript before they submit it.
- After article submission: Automate your responsible reporting checks by integrating Dimensions Research Integrity preCheck, powered by Ripeta, into your editorial workflow platform. In this scenario, after the manuscript has been submitted, a copy is shared with Ripeta, where our automated tools check it for key responsible reporting trust markers. The results can be sent directly to the authors for improvement, directly to the editors for decision-making, or to both.
- After publishing: Checking for quality reporting trust markers after publication and making those results public will also increase trust in science. We can work with you to ensure the final published paper also has a shareable Ripeta report indicating how its results compare against the trust markers.
Can you help me understand the impact of my [publisher, institution, funder]’s public access policy for articles and data?
Our experts will assess your journal policy and work with you to identify the key elements you would like reviewed. The resulting report provides an in-depth analysis of your manuscripts as they relate to your publication policy requirements.
Whether submitting papers for publication or for funding, academic institutions want their work to be of the highest quality. We work with you to understand how well your papers adhere to your policy requirements.
For funding agencies, we work with you to gain a better understanding of how well your funded research adheres to your policy requirements.
What problems do you solve for…?
Publishers want to accelerate science without sacrificing accuracy, and to attract more citations. Through these automated checks, you can better understand the quality of the papers you receive or review. You can also feed that information back to authors and editors to streamline the process and, in turn, support better and faster science.
Institutions need higher-quality research produced more quickly. Ripeta supports this by helping their researchers through the review process.
Funders need to understand how their investments are working and how their policies are put into practice. This can be done in a couple of ways: by assessing your portfolio with Dimensions Research Integrity, and by helping your researchers use preCheck before they even publish.
Researchers do not always know what funders and publishers are checking for in a manuscript. We keep an eye on that so that scientists can keep an eye on their research. We help researchers report higher quality science.
What is the difference between the Dimensions Research Integrity preCheck and Dimensions Research Integrity Insights?
DRI preCheck provides an integrity check on one paper at a time, in real time, whereas DRI Insights aggregates results across many publications.
What is the accuracy of your checks?
We have over 90% accuracy for each of the indicators we use.
What are your team’s qualifications?
Ripeta was founded by Dr. McIntosh, who comes from a research background, and Cynthia Hudson-Vitale, who comes from data librarianship. Our origins in the research arena inform our work, and we are supported by a team of data scientists, engineers, marketing specialists, and researchers from a variety of backgrounds.
Where does the name “Ripeta” come from?
Ripeta comes from the Italian word for repeat: ripetere [ri-pè-te-re], pronounced /ripeˈtɛːre/.