What key scientific quality indicators are you checking for in manuscripts?
We check for quality indicators in three areas.
Research: We check for the signals of a research paper, to differentiate research from, for example, a commentary. For a high-quality research paper, we would expect a clear study objective, section headers, and more.
Professionalism: We check the authorship, affiliations, ethical statements, and funders presented in a manuscript.
Reproducibility: We know it’s not always possible to share data and code, but those are some of the critical indicators of reproducibility.
The indicators we check include the following:
Analysis Software
The specific software that the author used to conduct their data analyses.
Code Availability Statement
A statement that explains how or if one can access a study’s code (in its own individual section offset from the main body of text or part of Disclosures or DAS).
Data Availability Statement (DAS)
A statement (offset from main text) detailing access to a study’s data.
Note: If there is data availability information in a “Supplementary/supporting information/materials” section, it is not a DAS, though it may relate to “Data Location” (below).
Data Location
A location that stores, organizes, allows access to, and preserves data. Common repositories are Dryad Digital Repository, Figshare, Harvard Dataverse, and Zenodo.
Author Contribution Statement
A statement detailing each author’s role in the development and publication of the manuscript.
Competing Interests Statement
A statement acknowledging any interests of the authors that may not be fully apparent and that could affect their judgment about the study topic, such as funding, past or current employment, or stocks owned by one of the authors.
Ethical Approval Statement
A statement of where ethical approval for a study was obtained, especially for studies with human or animal subjects.
Funding Statement
A statement within the manuscript indicating whether or not the authors received funding for their research.
Is this an integrity indicator?
In short, yes. This is a quality check for the indicators that are typically used when determining a research publication’s integrity.
Are your scientific quality indicator checks automated?
Yes! Our quality indicator checks are automated. We use humans to train the algorithms to reduce potential biases, but the checks themselves run automatically, and you can get results within seconds.
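For intuition only, here is a minimal sketch of what an automated statement check could look like. Ripeta’s production checks are trained models, not keyword matching; the indicator names and patterns below are hypothetical illustrations:

```python
# Illustrative sketch only: Ripeta's real checks use trained algorithms,
# not this naive keyword matching. The patterns below are hypothetical.
import re

INDICATOR_PATTERNS = {
    "data_availability_statement": re.compile(
        r"\bdata availability( statement)?\b", re.IGNORECASE),
    "code_availability_statement": re.compile(
        r"\bcode availability( statement)?\b", re.IGNORECASE),
}

def check_indicators(manuscript_text: str) -> dict:
    """Report True/False for each indicator found in the manuscript text."""
    return {name: bool(pattern.search(manuscript_text))
            for name, pattern in INDICATOR_PATTERNS.items()}

sample = "Data Availability Statement: All data are deposited in Zenodo."
print(check_indicators(sample))
# {'data_availability_statement': True, 'code_availability_statement': False}
```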
How were the scientific quality indicators determined?
Ripeta was created by researchers for researchers. In addition to drawing on our own expertise, we reviewed the wide range of guidelines published by societies, publishers, researchers, and others, and we conducted a literature search to understand, at the highest level, which first-pass quality checks are needed.
For more information on the data collection process, see our paper “Repeat: a framework to assess empirical reproducibility in biomedical research” and our outline of our process.
What if I only want to know about the result of each of the indicators?
We can give you the result of each indicator. Even though we summarize the results for a macro-view of the main subject areas, you can see the individual indicator scores as well.
Do you search for protocols?
Currently, no. We are working on this at the moment and would appreciate your input.
Do you do pilots?
Absolutely! We love running pilots to help you understand the benefits and what we can offer.
How do your products embed in a workflow?
Funding agency
In a funding agency workflow, Ripeta tools review how data management, data sharing, and open science policies are implemented in practice. Looking across a corpus of full-text articles published with funding from a particular agency (or sub-office), Ripeta extracts our key responsible reporting indicators and provides critical information about how funded researchers implement open policies.
Publisher
Our tools can be embedded in several places in a publisher’s workflow, and we customize this to best fit your workflow and priorities for publishing the highest quality research.
Before article submission: We can work with you to create a button and journal submission pipeline within your editorial workflow platform so that the author can automatically check their manuscript before they submit it.
After article submission: Automate your responsible reporting checks by integrating Ripeta tools into your editorial workflow platform. In this scenario, after the manuscript has been submitted, a copy is shared with Ripeta, where our automated tools check it for key responsible reporting indicators. The results can be sent directly to the authors for improvement, directly to the editors for decision-making, or both. (A minimal integration sketch follows this list.)
After publishing: Checking for quality reporting indicators after publication and making those results public will also increase trust in science. We can work with you to ensure the final published paper also has a shareable Ripeta report indicating how its results compare against the quality indicators.
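As a rough sketch of the after-submission scenario above, an editorial platform could forward each manuscript to a checking service and route the results to authors or editors. The endpoint, payload, and response shape here are assumptions for illustration, not Ripeta’s published API:

```python
# Hypothetical integration sketch: the URL, payload, and response shape are
# illustrative assumptions, not a documented Ripeta API.
import requests

CHECK_ENDPOINT = "https://api.example.com/v1/check"  # placeholder URL

def check_on_submission(manuscript_path: str) -> dict:
    """Send a newly submitted manuscript for an automated indicator check."""
    with open(manuscript_path, "rb") as f:
        response = requests.post(CHECK_ENDPOINT, files={"manuscript": f})
    response.raise_for_status()
    # e.g. per-indicator results to route to authors, editors, or both
    return response.json()

results = check_on_submission("submission-12345.pdf")
print(results)
```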
Researchers
With our Chrome extension, researchers can get a ripetaReport for any paper with a DOI.* It’s as easy as downloading our extension from the Chrome Web Store!
*If you upload to a preprint server, it gets a DOI minted, but there are also private ways to do this.
Can you help me understand the impact of my [publisher, institution, funder]’s public access policy for articles and data?
Publisher
Our experts will assess your journal policy and work with you to understand the key elements you would like reviewed for a ripetaReport. The ripetaReport provides an in-depth analysis of your manuscripts as they relate to your publication policy requirements.
Institution
Whether papers are being submitted for publication or for funding, academic institutions want them to be of the highest quality. We work with you to understand how well your papers adhere to your policy requirements.
Funding agency
For funding agencies, we will work with you to gain a better understanding of how well your funded research adheres to your policy requirements.
What problems do you solve for…?
Publishers
Publishers want to publish accurate science faster and earn more citations. Through these automated checks, you can better understand the quality of the papers you are receiving or reviewing. You also have the option to give that information back to the authors and editors to streamline the process and, in turn, support better and faster science.
Institutions
Institutions need a way to produce better quality research at a faster pace. Ripeta does this by supporting their researchers in the review process with ripetaReview.
Funders
Funders need to understand how their investments are performing and how their policies are put into practice. This can be done in a couple of ways: assessing your portfolio with ripetaReport and helping your researchers by using ripetaReview before they even publish.
Researchers
Researchers do not always know what funders and publishers are checking for in a manuscript. We keep an eye on that so that scientists can keep an eye on their research. We help researchers report higher quality science.
What is the ripetaScore and how is it calculated?
The ripetaScore is a quick way to view the quality of the research.
The ripetaScore is made up of three areas of trust: research, professionalism, and reproducibility. Some of the indicators within these areas are weighted more heavily than others; e.g., reproducibility carries more weight than professionalism. We break down those indicators so that you can see where the most improvement is needed, or simply how you are doing in those areas.
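To make the weighting idea concrete, here is a minimal sketch. The actual ripetaScore formula and weights are not public; the numbers below are invented solely to show how weighted areas combine:

```python
# Invented weights: reproducibility deliberately outweighs professionalism,
# mirroring the weighting idea described above (not the real formula).
AREA_WEIGHTS = {"research": 0.3, "professionalism": 0.2, "reproducibility": 0.5}

def weighted_score(area_scores: dict) -> float:
    """Combine per-area scores (each between 0 and 1) into one number."""
    return sum(AREA_WEIGHTS[area] * score
               for area, score in area_scores.items())

paper = {"research": 0.9, "professionalism": 1.0, "reproducibility": 0.6}
print(round(weighted_score(paper), 2))  # 0.77
```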
How does the ripetaScore compare across disciplines? Is there any normalization?
We are still working to understand what the score means in different disciplines. We know the score will vary; for example, data sharing is less common in hospital medicine, but there are still other ways to strengthen the research.
Right now, we are still observing how these scores compare across disciplines. From what we have seen, the scores vary with the predominant discipline of the journals in which the papers appear.
What is the difference between the ripetaReview and ripetaReport?
ripetaReview provides an integrity check on one paper at a time, whereas the ripetaReport presents its check as an aggregation over many publications.
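As a rough illustration of that distinction, this sketch rolls invented per-paper indicator results (the ripetaReview view) up into corpus-level rates (the ripetaReport view):

```python
# Invented example data: each dict is one paper's indicator results.
papers = [
    {"data_availability_statement": True,  "ethical_approval": True},
    {"data_availability_statement": False, "ethical_approval": True},
    {"data_availability_statement": True,  "ethical_approval": False},
]

def aggregate(results: list) -> dict:
    """Share of papers reporting each indicator, as in a corpus-level report."""
    return {name: sum(paper[name] for paper in results) / len(results)
            for name in results[0]}

print(aggregate(papers))
# {'data_availability_statement': 0.667, 'ethical_approval': 0.667} (approx.)
```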
What is the accuracy of your ripetaReview?
We have over 90% accuracy for each of the indicators we use. See “What is the accuracy of your scientific integrity indicators?”
Error 500: Why won’t my DOI scan?
Learn which types of manuscript DOIs the ripetaReview Chrome extension scans, and what to do if your paper should scan but does not.
The ripetaReview Chrome extension demo is early in development, and due to current legal restrictions, we are limited in the licensing of the manuscripts we can accept.
The current version of the ripetaReview Chrome extension demo scans manuscripts with a Creative Commons license of CC BY or CC0.
While every journal is formatted differently, the licensing for the manuscript can usually be found in the copyright notice and in the sidebar of the article.
My manuscript fits the requirements, but still does not scan
If you have checked that your manuscript is either CC BY or CC0 and it still does not scan, there is a good chance that the article’s licensing is not correctly reflected in its metadata, or that our systems have a problem parsing and analyzing the paper. Either way, you can send your article to firstname.lastname@example.org and we can attempt to enter it into our system manually. Depending on our workload, this may take up to 3 business days.
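If you want to inspect the metadata yourself, Crossref’s public REST API reports the license URLs recorded for a DOI, which is often where the mismatch lies. A minimal sketch (the DOI below is a placeholder to replace with your own):

```python
# Checks which license URLs Crossref has on record for a DOI; a missing or
# non-Creative-Commons license here can explain a failed scan.
import requests

def crossref_license_urls(doi: str) -> list:
    """Return the license URLs recorded for a DOI in Crossref metadata."""
    r = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    r.raise_for_status()
    return [lic["URL"] for lic in r.json()["message"].get("license", [])]

urls = crossref_license_urls("10.1000/example")  # placeholder DOI
print(urls, any("creativecommons.org" in u for u in urls))
```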
ripetaReview Chrome Extension Data Permissions: Why can the extension “read and change” all your data?
When downloading the ripetaReview Google Chrome extension, you (like many of us at Ripeta) may have noticed, or been alarmed by, the data-sharing notice that appears before you agree to add the extension to your browser.
So, what does this notice mean? Why is it needed? And, most importantly, what does Ripeta do with your data?
What does the notice mean?
The data-sharing notice that appears before adding a Chrome extension is a generic message created by Google for most extensions in the Google Chrome Web Store, excluding simpler extensions such as Google Hangouts. It is designed to declare the permissions the application needs in order to function properly. On the back end, the permission is simply called “scripting.”
Why are the permissions needed?
The notice sounds daunting, and it should. The permissions notice gives consumers pause before downloading extensions from the internet. It allows users to reassess how much they trust the app and decide whether or not they want to permit access to their information. For more information on how and why Google Chrome uses permissions, check out this article on How-To Geek.
What does Ripeta do with your data?
Ripeta is a company built on, and for, trust. Our products and services are designed to scan and assess the trustworthiness of research and to provide users with the information they need to determine whether a resource should be questioned. Our values center on supporting transparent, clean practices among researchers, funders, institutions, publishers, and, most importantly, ourselves. Because of this, we promise not to use or change any of your data beyond the basic operating needs of the ripetaReview Chrome extension.
We also encourage you to reach out to our team with any further questions you may have about how and why we use your data. Visit our website at ripeta.com for more information.
What are your team’s qualifications?
Ripeta was founded by Dr. McIntosh, who comes from a research background, and Cynthia Hudson-Vitale, who comes from data librarianship. We originated in the research arena, which informs our work, and we are supported by a team of data scientists, engineers, marketing specialists, and researchers, all from various backgrounds.
Where does the name “Ripeta” come from?
Ripeta comes from the Italian word for repeat: ripetere [ripeˈtɛːre].
We focus on assessing the quality of reporting and the robustness of the scientific method.