Understanding Public Access Policy Impacts for Funding Agencies: A case study

February 11, 2021


In the last 10-15 years, many funding agencies in both the federal and private sectors have adopted public access and data sharing policies for funded research. Funding agencies worldwide understand the importance of broad public access to research results as a mechanism to advance scientific knowledge, increase research transparency, and support research integrity. While the policies are strong, the infrastructure needed to check compliance and track research outputs requires significant investments in people and/or significant technology development. Thus, the impact these policies have on the frequency of open access and shared outputs is not easily determined.

Within its suite of services, Ripeta can provide funders with a report on the frequency of well-established reporting practices in published articles and research outputs. In September 2020, Ripeta partnered with Wellcome to conduct a retrospective analysis of the change in public access and data sharing between 2016 and 2019. We were curious about the motivations, needs, and usefulness of the report, and asked David Carr, Programme Manager at Wellcome, to provide his perspective on the following questions:

Question & Answer

Q: What were your motivations for requesting the ripetaReport? What were you curious about that the ripetaReport provided you?

  • Wellcome has a strong long-term commitment to maximising the value of research outputs through openness. 
  • We are also committed to promoting rigour and integrity in research we support – making the data and software underlying research findings accessible for scrutiny and replication is a fundamental part of this. 
  • As part of our policy on managing and sharing data, software and materials (which we updated in 2017), we have an explicit requirement that data, software and materials underlying publications should be accessible to other researchers. 
  • In addition, our new Open Access policy (which comes into force in January 2021) requires the inclusion of a statement in all funded publications which describes how underlying data and code can be accessed.
  • Wellcome established a dedicated Open Research team in 2017 to spearhead its work to promote openness in research outputs. As part of its work, the Open Research team is actively seeking ways to monitor changes in practices and attitudes to openness over time.
  • We haven’t to this point been able to track the extent to which researchers are meeting requirements to provide access to the data and code underlying research findings. Ripeta’s approach provided an opportunity to gain insight into this, and to look at whether the picture had changed over time.
  • Specifically, given the increasing focus and momentum of open science over the last few years (including Wellcome’s own policies and activities), we were curious to see how far the situation was changing in practice.

Q: What insights did you gain from the ripetaReport? 

  • The general message for us was that, whilst there was evidence that the situation had improved between 2016 and 2019 (particularly in terms of data availability statements), there was still a very long way to go (still fewer than half of the papers we funded have such a statement).
  • The fact that the proportion of data availability statements indicating that data was deposited in a repository was relatively low, and unchanged at around 35%, was a really valuable insight.
  • The proportion of the papers indicating software availability had also increased but was still at a very low level (8%).
  • The relatively low proportion of papers referencing the identity of the analysis software used was also a useful finding, as naming the analysis software is a key requirement for reproducibility.

Q: What did you find surprising in the analysis within the ripetaReport?

  • I was surprised that the inclusion of data and software availability statements hadn’t risen slightly higher, and that the proportion of data availability statements including repositories hadn’t risen at all.  Also, I had assumed that including the identity of analysis software would be standard good practice – that it was present in fewer than half of papers in 2019 was a bit of an eye-opener.

Q: What actions are you planning to take as a result of the ripetaReport?

  • We included the report as a key feature of our annual progress stocktake, and will strongly consider including it as part of future reports.
  • We are providing more detailed guidance on data and software availability statements (it should go up on the website in early January), including some good practice examples.  This includes making clear that a generic “data available on request” statement is not good practice.
  • Although not directly sparked by the report, we have also initiated a new pilot to support our funded researchers in strengthening outputs management plans, which will help instill good practice in managing data and software from early on in our funded research.  This will include more actively encouraging researchers to consider the use of community repositories to deposit their data and software outputs.

The full version of the Wellcome 2016 & 2019 Transparency Report may be found here.