Section: Ecology
Topic: Ecology, Environmental sciences, Evolution

Code-sharing policies are associated with increased reproducibility potential of ecological findings

Corresponding author(s): Sánchez-Tójar, Alfredo (alfredo.tojar@gmail.com); Culina, Antica (Antica.Culina@irb.hr)

10.24072/pcjournal.541 - Peer Community Journal, Volume 5 (2025), article no. e37.


Abstract

Software code (e.g., analytical code) is increasingly recognised as an important research output because it improves transparency, collaboration, and research credibility. Many scientific journals have introduced code-sharing policies; however, surveys have shown alarmingly low compliance with these policies. In this study, we expanded on a recent survey of ecological journals with code-sharing policies by investigating sharing practices in a comparable set of ecological journals without code-sharing policies. Our aims were to estimate code- and data-sharing rates, assess key reproducibility-boosting features, such as the reporting of software versioning, and compare reproducibility potential between journals with and without a code-sharing policy. We reviewed a random sample of 314 articles published between 2015 and 2019 in 12 ecological journals without a code-sharing policy. Only 15 articles (4.8%) provided analytical code, with the percentage nearly tripling over time (2015–2016: 2.5%; 2018–2019: 7.0%). Data-sharing was higher than code-sharing (2015–2016: 31.0%; 2018–2019: 43.3%), yet only eight articles (2.5%) shared both code and data. Compared with a comparable sample of 346 articles from 14 ecological journals with a code-sharing policy, journals without a code-sharing policy showed 5.6 times lower code-sharing, 2.1 times lower data-sharing, and 8.1 times lower reproducibility potential. Despite these differences, the key reproducibility-boosting features of the two journal types were similar. Approximately 90% of all articles reported the analytical software used; however, for journals with and without a code-sharing policy, the software version was often missing (49.8% and 36.1% of articles, respectively), and exclusively proprietary (i.e., non-free) software was used in 16.7% and 23.5% of articles, respectively. Our study suggests that journals with a code-sharing policy have greater reproducibility potential than those without. Code-sharing policies are likely to be a necessary but insufficient step towards increasing reproducibility. Journals should prioritize adopting explicit, easy-to-find, and strict code-sharing policies to facilitate researchers' compliance and should implement mechanisms such as checklists to ensure adherence.

Metadata
DOI: 10.24072/pcjournal.541
Type: Research article
Keywords: replicability, reliability, robustness, generalizability, verification, replication, FAIR, checklist

Sánchez-Tójar, Alfredo 1; Bezine, Aya 1; Purgar, Marija 2, 3; Culina, Antica 2, 4

1 Department of Evolutionary Biology, Bielefeld University, Bielefeld, Germany
2 Ruđer Bošković Institute, Zagreb, Croatia
3 Department of Epidemiology, Rollins School of Public Health, Emory University, Atlanta, GA, USA
4 Netherlands Institute of Ecology, NIOO-KNAW, Wageningen, The Netherlands
License: CC-BY 4.0
Copyrights: The authors retain unrestricted copyrights and publishing rights

PCI peer reviews and recommendation, and links to data, scripts, code and supplementary information: 10.24072/pci.ecology.100778

Conflict of interest of the recommender and peer reviewers:
The recommender in charge of the evaluation of the article and the reviewers declared that they have no conflict of interest (as defined in the code of conduct of PCI) with the authors or with the content of the article.

Full text


Introduction

Sharing software code is essential for robust, reproducible, and impactful science (Peng, 2011; Borregaard & Hart, 2016; Lewis et al., 2018; Cole et al., 2024). Software code is used to process and analyze data, create figures, and even produce fully executable articles (Mislan et al., 2016; Lasser, 2020), and code complexity is increasing (Touchon & McCoy, 2016; Feng et al., 2020). Code helps with understanding and critically evaluating data analysis and, importantly, can be used and extended by others, allowing faster scientific progress (Cadwallader et al., 2022). The computational reproducibility of scientific findings (i.e., using the same code on the same data to reproduce the same results; Benureau & Rougier, 2018), a seemingly simple but difficult-to-achieve feature in modern science (e.g., Campbell et al., 2023; Kambouris et al., 2024), greatly improves when analytical code is available (Laurinavichyute et al., 2022).

Code availability has been slowly increasing in ecology (Maitner et al., 2024; Sperandii et al., 2024) and other fields (Cao et al., 2023; but see Serghiou et al., 2021), which is likely a consequence of several changes. First, software and software code are being recognised as essential research outputs (DORA: https://sfdora.org/read/; ReSA: https://www.researchsoft.org/; Jay et al., 2021). Second, training and guidelines on reproducible code and software management are more available to researchers (Donoho et al., 2009; McKiernan, 2017; Kohrs et al., 2023). Third, funders and journals have slowly but steadily introduced code-sharing policies. For example, the percentage of journals with a code-sharing policy increased rapidly for a subset of 96 ecological journals, from 15% in 2015 (Mislan et al., 2016) to 75% in 2020 (Culina et al., 2020). A larger survey of 275 journals in ecology and evolution found that 72% mandated or encouraged code-sharing as of 2024 (Ivimey-Cook et al., 2025). There is accumulating evidence that the mere existence of journal code-sharing policies likely increases code availability (Cadwallader et al., 2022; Fišar et al., 2024; Ivimey-Cook et al., 2025), even when policy compliance is alarmingly low. For example, only 27% of articles published between 2015 and 2019 in a subset of 96 ecological journals with a code-sharing policy shared their code (Culina et al., 2020), showing that policies are only partially effective if they are not enforced. In addition, policies that do not specify and require best sharing practices likely lead to low code reusability and, ultimately, low reproducibility of scientific findings.

Code-sharing does not necessarily translate into code that is easy to understand, adapt, or reuse. Technical challenges to code reuse range from dependencies on the original researcher's computational environment, such as the operating system and libraries used, to inadequate documentation on how to install, run, and use the code (Boettiger, 2015). Code can also easily rot after software updates, leading to changes in functionality, compatibility, and, ultimately, the reproducibility of the results (Hinsen, 2019). Although container technology such as Docker, which packages software and its dependencies into a standardized environment, has been suggested as a solution to improve portability and reproducibility (Boettiger, 2015; Grüning et al., 2018; Essawy et al., 2020; Trisovic et al., 2022), its adoption remains low (e.g., Venkatesh et al., 2022). At a minimum, the software and packages that are fundamental to the analyses should be stated and appropriately referenced in the main manuscript, and the version(s) used should be clearly reported (guidelines for software citation: Chue Hong et al., 2019). The remaining packages should be documented in stand-alone documentation (e.g., a README file or inline comments; Benureau & Rougier, 2018; Jenkins et al., 2023; Ivimey-Cook et al., 2023). Software and package citation is important not only for computational reproducibility but also for clarifying methodology and giving credit to software developers. In addition, code should ideally be written using free (i.e., non-proprietary) and open-source software (also known as FOSS; Ostermann & Granell, 2017), such as the R programming language (R Core Team, 2023), which is widely used in ecology (Lai et al., 2019; Culina et al., 2020; Kambouris et al., 2024). Furthermore, code should be shared in a permanent repository (e.g., Zenodo; European Organization For Nuclear Research, 2013) and assigned an open and permissive licence and a persistent identifier such as a DOI (Krafczyk et al., 2021; Kim et al., 2022; Jenkins et al., 2023). This is particularly important given the far-from-ideal rates of link persistence found for scientific code in fields such as astrophysics (Allen et al., 2018; Sperandii et al., 2024).
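
As a minimal illustration of this reporting step, the following R sketch (our illustration, not code from any surveyed article) collects the R and package versions that should accompany a manuscript or README; 'ggplot2' stands in for whatever packages an analysis actually uses:

```r
# Minimal sketch: collect software and package versions for reporting
# in a manuscript or README. 'ggplot2' is only an example package.
library(ggplot2)

R.version.string           # version of R itself, e.g. "R version 4.3.1 ..."
packageVersion("ggplot2")  # version of a single package

# Versions of all packages attached in the current session
si <- sessionInfo()
vapply(si$otherPkgs, function(p) p$Version, character(1))
```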

Our main goal was to examine whether the implementation of code-sharing policies by journals leads to higher rates of code-sharing and overall reproducibility potential. For that, we assessed the code-sharing and reporting features of 314 articles published in 12 ecological journals without a code-sharing policy, and compared them with those of a comparable sample of 346 articles published in 14 ecological journals with a code-sharing policy from Culina et al. (2020). We predicted that ecological journals without a code-sharing policy would have lower rates of sharing than journals with one. However, we had no clear expectation of whether the reporting of features associated with higher long-term reproducibility, such as the software used, its versioning and accessibility (free or not), and the location where code is shared, would differ between the two sets of journals. This is because code-sharing policies are often not explicit (Ivimey-Cook et al., 2025), and thus might not explicitly prompt authors to follow best practices; conversely, authors who share their code in the absence of a code-sharing policy may already be inclined to follow best practices. Finally, we anticipated that code availability and the reporting of features associated with higher long-term reproducibility would increase over time, regardless of the existence of code-sharing policies, given recent changes in scientific attitudes and norms and the rise of open science (Cao et al., 2023).

Methods

Our study design closely matches that of Culina et al. (2020), who surveyed 14 ecological journals with a code-sharing policy from 2015 to 2019. Readers are therefore referred to Culina et al. (2020) for further methodological details and, importantly, for the results for journals with a code-sharing policy. In this follow-up study, we aimed to identify 14 comparable ecological journals without a code-sharing policy for the same period (i.e., 2015–2019). For this, we used the set of 96 ecological journals originally assessed by Mislan et al. (2016) and subsequently reassessed by Culina et al. (2020), and identified 12 journals without a code-sharing policy as of 2020. This was accomplished by carefully reading the author guidelines and open research policies of these journals, as compiled by Culina et al. (2020). While we initially identified 24 potentially eligible journals (i.e., without a code-sharing policy), we later removed two review journals (‘Trends in Ecology and Evolution’ and ‘Annual Review of Ecology, Evolution, and Systematics’), nine journals that mentioned code as part of their data-sharing policy (‘Aquatic Microbial Ecology’, ‘Behavioral Ecology and Sociobiology’, ‘Ecology and Evolution’, ‘Global Change Biology’, ‘Journal of Soil and Water Conservation’, ‘Marine Ecology Progress Series’, ‘Microbial Ecology’, ‘Oryx’, and ‘Paleobiology’), and one journal that had been discontinued (‘Journal of the North American Benthological Society’). We judged the remaining 12 journals eligible, as their author guidelines did not mention programming code or other terms that could be interpreted as such (e.g., script, research artefacts) (i.e., no code-sharing policy by March 2020; see Table S1 in Culina et al., 2020): ‘Basic and Applied Ecology’, ‘Behavioral Ecology’, ‘Ecosystems’, ‘Freshwater Science’, ‘Frontiers in Ecology and the Environment’, ‘International Journal of Sustainable Development and World Ecology’, ‘Journal of Plant Ecology’, ‘Landscape Ecology’, ‘Oecologia’, ‘Oikos’, ‘Polar Research’, and ‘Wildlife Research’. Note that some of these journals might have adopted a code-sharing policy since the initial screening in March 2020; however, this would not affect our study, as we focused on articles published between 2015 and 2019. In addition, because our study was designed to allow a detailed comparison between our results for journals without a code-sharing policy and those for journals with one from Culina et al. (2020), we restricted our searches to 2015–2019. Consequently, the general figures on data- and code-sharing and reporting practices reported here might not be representative of current ones, for which we refer readers to recent surveys (Maitner et al., 2024; Sperandii et al., 2024).

We performed a search in the Web of Science Core Collection (databases covered: Science Citation Index Expanded (SCI-EXPANDED) since 1945, Social Sciences Citation Index (SSCI) since 1956, Arts & Humanities Citation Index (AHCI) since 1975, Emerging Sources Citation Index (ESCI) since 2017) in February 2022, and extracted all records published in the 12 journals during the same two temporal periods as Culina et al. (2020): (i) from the 1st of June 2015 to the 9th of December 2016 (N = 2499 records); and (ii) from the 1st of January 2018 to the 21st of May 2019 (N = 2275 records). We then took a random sample of 200 articles from each of these two periods (N = 400 in total) using the function ‘sample()’ in R v.4.3.1 (R Core Team, 2023). We screened their titles and abstracts for eligibility using the Rayyan software (Ouzzani et al., 2016). To meet our inclusion criteria, an article had to conduct a statistical analysis, develop and run a mathematical model, or conduct simulations. Following Culina et al. (2020), we excluded reviews, opinions, commentaries, and purely bioinformatics studies. In addition, we excluded two articles from the 2018–2019 subset that performed landscape analyses because we lacked the expertise to understand the analyses and software used. Each article was screened by two reviewers (AB, AST), and conflicts (~5%) were resolved collectively. In total, 314 nonmolecular articles passed the title-and-abstract screening, and their full texts were read in detail for data extraction. The screening process is presented in a PRISMA diagram (Figure A1; O’Dea et al., 2021).
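
For illustration, the sampling step can be sketched in R as follows (the file name and seed are hypothetical; the archived code on Zenodo contains the exact procedure used):

```r
# Sketch of the random sampling step; file name and seed are hypothetical.
records <- read.csv("wos_records_2015_2016.csv")  # 2499 Web of Science records

set.seed(1)                                       # for reproducibility
sampled <- records[sample(nrow(records), 200), ]  # 200 articles for screening
nrow(sampled)
```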

Data extraction for each article was conducted by two reviewers (AB and either AST, AC, or MP) to increase the reproducibility and reliability of the data extraction process. Any conflicts were resolved by involving a third reviewer and are marked and explained in the provided data (see ‘Data, scripts, code and supplementary information availability’). For each article, we recorded: (i) bibliographic information (title, authors, journal, publication year); (ii) the type of analyses conducted, since our interest was only in articles performing statistical analyses and/or simulations; (iii) whether code and data (if used) had been shared (levels: yes, no, partially); (iv) for instances of shared code, where it was shared (levels: repository, supplementary material, website) and the name of the repository (if used); and (v) several additional key reproducibility-boosting features, namely the software and additional package(s)/extension(s) (hereafter “package(s)”) used, the number of software and package(s) for which a version was provided, and whether the software used was free (i.e., non-proprietary; levels: yes, no, partially; where partially refers to having used both free and non-free software). Figures were generated using the R package ‘ggplot2’ v.3.5.1 (Wickham, 2016).

To test whether the reproducibility potential is higher in journals with versus without a code-sharing policy, we revisited, updated, and extended the dataset used for the analyses presented by Culina et al. (2020). Specifically, for the 346 nonmolecular articles included in Culina et al. (2020), we extracted the package(s) used and the number of software and package(s) for which version was provided. We also checked the variables of interest that had already been collected by Culina et al. (2020) for inconsistencies.

We made the following post hoc decisions when processing our data: (1) whenever packages or extensions were not reported, we recorded the number of packages as 0, even though the software used may not actually have any packages or extensions available, because it was not possible for us to verify the existence of packages or extensions for all the software reported; (2) for articles that shared some or all of their data only within figures (e.g., in a scatterplot), we classified them as not sharing their data; (3) we searched for software and package versions within the main article and supplementary text (whenever available), including in their reference lists; and (4) in some rare cases in which the article did not report the software used but we could infer it from the packages or extensions reported, we nonetheless recorded the software as “Not Stated”.
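
A sketch of how decisions (1) and (2) could be applied during data processing is shown below (the column names and values are hypothetical, not those of the archived dataset):

```r
# Sketch of post hoc recoding rules (1) and (2); columns are hypothetical.
library(dplyr)

articles <- tibble::tibble(
  id          = 1:3,
  n_packages  = c(2, NA, 5),                    # NA = packages not reported
  data_shared = c("yes", "figures_only", "no")
)

articles <- articles |>
  mutate(
    n_packages  = coalesce(n_packages, 0),                      # rule (1)
    data_shared = if_else(data_shared == "figures_only", "no",  # rule (2)
                          data_shared)
  )
```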

Results

Code- and data-sharing

We investigated 314 nonmolecular articles that performed statistical analyses or simulations and were published between 2015 and 2019 (2015–2016: 157 articles; 2018–2019: 157 articles) in 12 ecological journals without a code-sharing policy as of March 2020. In these 12 journals, the statistical analysis or simulation code underlying the research findings was shared in only 15 of 314 articles (4.8%). These 15 articles were accompanied by either seemingly all (10 articles, 3.2%) or some (five articles, 1.6%) of the code. The overall percentage of code shared increased approximately threefold between the two periods (2.5% versus 7.0% in 2015–2016 and 2018–2019, respectively; Figure 1a). At the journal level, the percentage of articles where code was shared ranged between 0% and 8.7% (median = 1.2%, mean = 3.1%; Table 1), indicating that not sharing code is a general phenomenon in ecological journals without a code-sharing policy. Of the 15 articles that shared code, 12 (80%) provided it as part of the article’s supplementary material, one (6.7%) on a website, and only two (13.3%) in a repository (i.e., Dryad).

Table 1 - Code- and data-sharing for 314 nonmolecular articles that conducted statistical analyses or simulations, published between 2015 and 2019 in 12 ecological journals without a code-sharing policy. Numbers in square brackets indicate the subset of articles that used data (the remainder did not, e.g., some simulation studies).

Journal | Articles sampled [using data] | Providing code (%) | Providing data (%)
Basic and Applied Ecology | 12 [12] | 0 (0.0%) | 2 (16.7%)
Behavioral Ecology | 44 [44] | 3 (6.8%) | 25 (56.8%)
Ecosystems | 23 [23] | 2 (8.7%) | 10 (43.5%)
Freshwater Science | 16 [16] | 0 (0.0%) | 1 (6.2%)
Frontiers in Ecology and the Environment | 4 [4] | 0 (0.0%) | 2 (50.0%)
International Journal of Sustainable Development and World Ecology | 6 [6] | 0 (0.0%) | 1 (16.7%)
Journal of Plant Ecology | 20 [20] | 0 (0.0%) | 4 (20.0%)
Landscape Ecology | 44 [44] | 1 (2.3%) | 21 (47.7%)
Oecologia | 79 [79] | 5 (6.3%) | 19 (24.1%)
Oikos | 42 [40] | 3 (7.1%) | 25 (62.5%)
Polar Research | 7 [7] | 0 (0.0%) | 4 (57.1%)
Wildlife Research | 17 [17] | 1 (5.9%) | 2 (11.8%)

In the 12 journals without a code-sharing policy, data were shared in 116 of the 312 nonmolecular articles that used data (37.2%). These articles were accompanied by either seemingly all (75 articles, 24.0%) or some (41 articles, 13.1%) of the data, and the overall percentage of shared data increased by approximately 40% over the 5-year period studied (31.0% vs. 43.3%, in 2015–2016 and 2018–2019, respectively; Figure 1b). Furthermore, at the journal level, the percentage of articles in which data were shared ranged between 6.2% and 62.5% (median = 33.8%, mean = 34.4%; Table 1), suggesting large differences in data-sharing across the 12 ecological journals without a code-sharing policy.

Altogether, only 8 articles (2.5%) seemingly shared both all data (if any were used) and all code, meaning that the potential for computational reproducibility in the 12 ecological journals without a code-sharing policy surveyed in our study may be as low as, or even lower than, 2.5% (Figure 2). This percentage is 8.2 times smaller than the corresponding percentage found in journals with a code-sharing policy (20.8%; Culina et al., 2020).
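
These figures follow directly from the reported counts; as a quick check in R (using the counts reported here and the percentage reported by Culina et al., 2020):

```r
# Reproducibility potential from the reported counts
without_policy <- 8 / 314       # this study: articles sharing all code and data
round(100 * without_policy, 1)  # 2.5%

with_policy <- 0.208            # 20.8%, reported by Culina et al. (2020)
round(with_policy / without_policy, 1)  # ~8.2 times higher with a policy
```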

Figure 1 - Code and data-sharing are uncommon in 12 ecological journals without a code-sharing policy. Percentage of nonmolecular articles surveyed that provided code (a) or data (b) for each of the periods studied (2015–2016:157 articles, 2018–2019:157 articles).

Figure 2 - Diagram visually representing the computational reproducibility potential of articles published between 2015 and 2019 in ecological journals without a code-sharing policy. The value corresponding to “Data incomplete” used in the diagram (56%) was obtained from Roche et al. (2015); whereas all the remaining values correspond to the survey presented in the current study. Original illustration by Szymek Drobniak.

Features boosting long-term reproducibility in journals with and without a code-sharing policy

Our survey showed that 88.2% of articles (N = 277) published in journals without a code-sharing policy stated the analytical software used (Figure 3a), a value that is almost identical to the 89.9% (N = 311) found for articles published in journals with a code-sharing policy (Culina et al., 2020; Figure 3b). For those stating the statistical software used, 63.9% (N = 177) of the articles published in journals without a code-sharing policy reported the version of all software used (Figure 3c), whereas that percentage was 50.2% (N = 156) for articles published in journals with a code-sharing policy (Figure 3d). The mean number of analytical software used was 1.27 (median = 1.00, range = 1–6) in journals without a code-sharing policy and 1.81 (median = 1.00, range = 1–14) in journals with a code-sharing policy. The reporting of software versioning remained slightly higher for journals without a code-sharing policy than for those with a code-sharing policy when expressed as the average percentage of software with version per article (without policy: median = 100%, mean = 67.5%, range: 0–100%; with policy: median = 100%, mean = 59.6%, range: 0–100%).

For articles stating that they used additional packages, 32.4% (N = 46) of the articles published in journals without a code-sharing policy provided the version of all packages used (Figure 3e), whereas that percentage was 19.5% (N = 40) for articles published in journals with a code-sharing policy (Figure 3f). The mean number of packages reported was 2.30 (median = 2.00, range = 1–10) in journals without a code-sharing policy and 2.41 (median = 2.00, range = 1–14) in journals with a code-sharing policy. The reporting of package versioning remained slightly higher for journals without a code-sharing policy than for those with a code-sharing policy when expressed as the average percentage of packages with version per article (without policy: median = 33.3%, mean = 45.1%, range: 0–100%; with policy: median = 0%, mean = 30.8%, range: 0–100%).
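
For clarity, the per-article versioning metric summarised in the last two paragraphs can be computed as in the following sketch (toy values, not the survey data):

```r
# Sketch: per-article percentage of packages with a reported version (toy values)
pkgs_used    <- c(3, 2, 5)  # packages used in each of three articles
pkgs_version <- c(3, 1, 0)  # of which a version was reported

pct_versioned <- 100 * pkgs_version / pkgs_used
median(pct_versioned)  # summarised as median...
mean(pct_versioned)    # ...and mean, as in the text
```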

For articles stating the statistical software used, 23.5% (N = 65) of articles published in journals without a code-sharing policy used exclusively non-free (i.e., proprietary) software (Figure 3g), compared to 16.7% (N = 52) of articles published in journals with a code-sharing policy (Figure 3h).

Discussion

Our results show that code-sharing is almost non-existent (5%) for nonmolecular articles published in ecological journals without a code-sharing policy, a rate about six times lower than in a comparable sample of articles from journals with a code-sharing policy. Data availability fared better: about one-third of the articles published in ecological journals without a code-sharing policy shared data, which corresponds to about half the rate observed in journals with a code-sharing policy. These low sharing rates lead to an extremely low reproducibility potential (less than 3%) for the results published in journals without a code-sharing policy. Importantly, this is likely an overestimate, because we also found that key reproducibility features (e.g., software name or versioning) were often lacking. Overall, our results confirm previous surveys in ecology and other fields: code-sharing is low, and while implementing a code-sharing policy likely increases code-sharing, it does not do so to the desired level. Below, we place our results in the context of findings within and across fields, discuss code-sharing and the importance of explicit policies, and provide suggestions for journals on how to improve code-sharing and the (long-term) reproducibility of scientific findings (Box 1).

Figure 3 - Features boosting long-term reproducibility in journals with (b, d, f, h; right-hand panels) and without (a, c, e, g; left-hand panels) a code-sharing policy. The red dashed line corresponds to the mean for the category coloured dark (i.e., Software reported, Software versions reported, Package versions reported, and Free software, respectively). “Partially free” in panels g and h refers to articles that used both free and non-free software.

Open science practices are on the rise. Most scientists already agreed with general open science norms and values decades ago (Anderson et al., 2007), but only recently have we started to see more evidence of scientists not only agreeing with but also adhering to such norms and values. For example, a recent survey in the social sciences found that the percentage of scientists who self-reported using open science practices increased from 49% in 2010 to 87% in 2020 (Ferguson et al., 2023; see also Borycz et al., 2023). Meta-research studies have confirmed that several transparency indicators, including, but not limited to, data- and code-sharing, are on the rise in ecology (Evans, 2016; Culina et al., 2020; Roche et al., 2022a; Maitner et al., 2024) and other fields (Heumüller et al., 2020; Cao et al., 2023; Colavizza et al., 2024; Sharma et al., 2024). Our survey detected similar trends in ecological journals without a code-sharing policy, with code-sharing nearly tripling from 2015–2016 (2.5%) to 2018–2019 (7.0%). Our results also support observations from previous meta-research studies that authors are more likely to share data than code, both in ecology (Culina et al., 2020) and in other fields (Bellomo et al., 2024; Sharma et al., 2024). Researchers may perceive greater risks and fewer benefits in sharing code compared to data, including unfamiliarity with best sharing practices, insecurity about code quality, fear of misuse or unsolicited appropriation of ideas, and excess preparation costs (Cadwallader & Hrynaszkiewicz, 2022; Gomes et al., 2022), coupled with a lack of incentives for code-sharing. This discrepancy might also be partly due to journal policies often placing a stronger emphasis on data- than on code-sharing (Page et al., 2022; Ivimey-Cook et al., 2025), and it is likely less evident in sub-disciplines that rely heavily on computational methods, such as computational biology and software engineering (Heumüller et al., 2020; Cadwallader et al., 2022).


Box 1 - How can journals increase code availability? Below is a list of suggestions for journals sorted by the ease of implementation. For more information, journals should consider contacting the journal liaison officer of the Society for Open, Reliable and Transparent Ecology and Evolutionary Biology (SORTEE; https://www.sortee.org/).

Introduce a code-sharing policy: this can range from simply mandating a code availability statement (Hamilton et al., 2023; Sharma et al., 2024) to encouraging or, in the best case, mandating code-sharing, ideally coupled with policy enforcement (Ivimey-Cook et al., 2025). Policies should be clearly written, explicit and easy to find, and ideally shared among journals within and/or across publishers (Christian et al., 2020).

Implement a reproducibility checklist: this should integrate a minimal list of code-sharing best practices, such as the use of persistent identifiers like DOIs, which ensure long-term accessibility and proper attribution (Gewin, 2016; Trisovic et al., 2022), and the reporting of all software used and their versions. Journals should also offer clear guidelines (and support) for authors on how to share code, report software, and adhere to reproducibility standards.

Review and verify code: ask authors to share their code upon first submission so that reviewers can access and review it. Encourage reviewers to use the code (and data) during their reviews (Ivimey-Cook et al., 2025). Consider officially integrating code review into the editorial process by appointing data and code editors to ensure code functionality and adherence to standards (Krafczyk et al., 2021).


Importantly, our results suggest that journals likely have a central role in increasing code-sharing rates: code-sharing was higher among nonmolecular articles published in journals with a code-sharing policy (27%) than among those published in journals without one (4.8%). A recent survey of meta-analyses in ecology and evolutionary biology detected similar patterns (21.2% and 9.1%, respectively; Kambouris et al., 2024). Previous studies also suggest a link between the introduction of code-sharing policies and a subsequent increase in code availability. For example, code-sharing increased from 53% in 2019 and 61% in 2020 to 87% in 2022 after the introduction of a mandatory code-sharing policy by PLOS Computational Biology (Cadwallader et al., 2022). Similarly, the percentage of initial submissions providing a link to data and/or code increased from 16.9% in 2021 to 42.6% in 2023 after Ecology Letters changed their sharing policy from simply requiring a statement to mandating (and enforcing) a link to data and code (Ivimey-Cook et al., 2025; for other examples in ecology and beyond, see Evans, 2016; Hamilton et al., 2023; Ellis et al., 2024; Bellomo et al., 2024; Sperandii et al., 2024). Regardless of whether journals have a code-sharing policy, we also detected an increasing trend in code availability over time, likely driven by other factors such as changing norms, better training, and better support for code-writing and sharing.

While having a policy helps to increase code-sharing, it is certainly not enough without enforcing compliance (Culina et al., 2020). Our previous survey of 14 ecological journals with a code-sharing policy indicated that the strictness of the policy did not affect code availability, because the percentage of articles sharing code was similar between journals with encouraged (mean and range: 29.7% [14–50%], three journals), mandatory (23.0% [22–38%], five journals), and encouraged/mandatory (24.3% [7–53%], six journals) policies (Culina et al., 2020). A recent survey in biomedical research found more promising rates, with up to 50% of articles published in eight journals with a code-sharing policy making their code available, and the likelihood of code-sharing being double in journals with mandatory compared to encouraged policies (Sharma et al., 2024). Overall, despite low compliance, which has been linked to factors such as difficult-to-find or unclearly written sharing policies (Christian et al., 2020), these examples suggest that even under low policy enforcement, policy interventions can shift research practices towards greater openness. Indeed, implementing a code-sharing policy is a positive step forward, even when resources for enforcing such a policy and reviewing code (e.g., by appointing data and code editors) are not yet available.

We found that features boosting long-term reproducibility, such as using free software and reporting its version, were similar between journals with and without a code-sharing policy, suggesting that, although code-sharing policies seem to increase code availability, they might not improve software reporting unless they are more explicit about best practices. Versions of the statistical software and packages were often missing, and approximately a tenth of the articles did not even state the software used. Reporting software and package versions is important for several reasons. First, version information can help to understand and solve technical issues related to software dependencies, which are among the most frequently encountered factors hindering computational reproducibility (Laurinavichyute et al., 2022; Kellner et al., 2024; Samuel & Mietchen, 2024). Different versions of software and/or packages can lead to inconsistencies in results and even to code rot, which occurs when the code relies on specific versions of software or packages that are no longer available or have undergone significant changes (e.g., deprecated functions), rendering the code incompatible with current operating systems (Boettiger, 2015; Laurinavichyute et al., 2022). Second, software reporting standards are key not only for computational reproducibility (i.e., obtaining the same results using the same input data and code) but also for analytical reproducibility (i.e., obtaining the same results by writing fresh code from the written methodological descriptions when data but not code are available; Kambouris et al., 2024), and thus should be prioritized by authors and journals alike (Box 1). Finally, about one-fifth of the articles exclusively used non-free (i.e., proprietary) software. Reproducibility is hindered when code relies on proprietary software that requires a license or subscription: proprietary software restricts access to its source code and is inaccessible to researchers who cannot afford it, ultimately limiting independent verification and the ability to build upon original research (Ostermann & Granell, 2017; Benureau & Rougier, 2018; Konkol et al., 2019; Laurinavichyute et al., 2022). Ideally, the code used for a study should be peer reviewed to ensure its completeness, reusability, and reproducibility prior to manuscript acceptance (Ivimey-Cook et al., 2023). Until code review becomes the norm, authors, reviewers, and editors should ensure that the minimum requirements for reproducibility are met, which can be facilitated by checklists and by policies explicitly linked to best practices. In addition, we advocate for proper software and package citation to give credit to software developers and to incentivise software and package development (e.g., using the R package ‘grateful’; Rodriguez-Sanchez & Jackson, 2024).
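
As an example of that last suggestion, a minimal use of ‘grateful’ might look like the sketch below (the output options follow the package documentation, but the argument values shown are illustrative):

```r
# Sketch: cite all R packages used in a project with 'grateful'
# (Rodriguez-Sanchez & Jackson, 2024). Argument values are illustrative.
library(grateful)

# Scan the project for packages and return formatted citations as a paragraph
cite_packages(output = "paragraph", out.dir = tempdir())
```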

Our study has several limitations. Although we matched journals by time period and drew them from a seemingly representative list of ecological journals, journals with a code-sharing policy were more likely to also have a data-sharing policy (Ivimey-Cook et al., 2025), which may increase code-sharing simply by increasing the visibility of sharing in general. Indeed, it is fair to assume that some of the 12 journals without a code-sharing policy studied here also lacked a data-sharing policy between 2015 and 2019, which may partially account for their lower code-sharing as a by-product. In addition, journals with and without a code-sharing policy may have differed in other transparency indicators or predictors of computational reproducibility, such as the existence of reporting checklists, differences in prestige, or the type of research published. Despite these potential limitations, our study adds to the mounting evidence that journal policies are an important stepping stone towards increasing code availability. Finally, a potentially important factor for increasing data- and code-sharing that we did not explore is the funding source: research funded by competitive grants tends to have higher code- and data-sharing rates, presumably because these funding bodies often make sharing a mandate or strong recommendation in their grant conditions (Tan et al., 2024). Thus, we call not only on journals to introduce code-sharing policies but also on funders to push more strongly for mandating data- and code-sharing, regardless of whether they currently have the mechanisms necessary to enforce those policies.

Conclusions

In sum, our study adds to the mounting evidence that code-sharing policies increase code availability, which ultimately enhances the reproducibility potential of scientific findings. Specifically, our study suggests that based on code and data availability, the computational reproducibility potential is about eight times lower in ecological journals without a code-sharing policy (2.5%) compared to those with one (21%). Importantly, however, these should be considered ceiling values, as we also found that software reporting needs improvement to allow reproducibility, and previous studies have found that open code (Obels et al., 2020; Laurinavichyute et al., 2022; Henderson et al., 2024) and data are often incomplete and difficult to use due to poor documentation (Roche et al., 2015; Roche et al., 2022b). The perceived costs and benefits of sharing code and data have been thoroughly studied, dissected, and discussed elsewhere (Soeharjono & Roche, 2021; Gomes et al., 2022; Borycz et al., 2023; Nguyen et al., 2023; Koivisto & Mäntylä, 2024). Low sharing and reporting are key factors that increase research waste in ecology (Purgar et al., 2022) and other fields (Chalmers & Glasziou, 2009), and as such, more efforts are needed to reduce research waste (see additional suggestions in Grainger et al., 2020; Buxton et al., 2021; Purgar et al., 2024). Here, we call on all journals and funders to introduce data and code-sharing policies, even if they currently do not have the resources or mechanisms necessary to enforce them.

Author contributions

Alfredo Sánchez-Tójar: conceptualisation (equal); data curation (lead); formal analysis (lead); investigation (equal); methodology (lead); project administration (equal); software (lead); supervision (equal); visualization (equal); writing – original draft (lead); writing – review and editing (equal). Aya Bezine: data curation (equal); investigation (equal); writing – review and editing (equal). Marija Purgar: data curation (equal); investigation (equal); validation (equal); writing – review and editing (equal). Antica Culina: conceptualisation (equal); data curation (equal); investigation (equal); project administration (equal); supervision (equal); visualization (equal); validation (equal); writing – original draft (equal); writing – review and editing (equal).

Acknowledgements

We would like to thank Ilona van den Berg and Simon Evans for their original contribution to generating the dataset of articles from journals with a code-sharing policy reanalyzed here, and the PCI recommender (Ignasi Bartomeus) and two reviewers (Francisco Rodríguez-Sánchez, Verónica Cruz-Alonso) for comments that helped us improve this contribution. Preprint version 4 of this article has been peer-reviewed and recommended by Peer Community In Ecology (https://doi.org/10.24072/pci.ecology.100778; Bartomeus, 2025).

Funding

This study was supported by the Corona-Unterstützung support program (Bielefeld University) granted to Alfredo Sánchez-Tójar, and by the Croatian Science Foundation under project number HRZZ-IP-2022-10-2872 to Antica Culina. Marija Purgar acknowledges funding from the Croatian Science Foundation (DOK-2021-02-6688) and the Fulbright U.S. Student Program. The views expressed are solely the author’s and not those of Fulbright, the U.S. Government, or the Croatian-American Fulbright Commission.

Conflict of interest disclosure

Alfredo Sánchez-Tójar, Marija Purgar, and Antica Culina are officers of the Society for Open, Reliable, and Transparent Ecology and Evolutionary Biology (SORTEE). Aya Bezine declares no competing interests.

Data, scripts, code and supplementary information availability

All data and code used for this study are available in Zenodo (https://doi.org/10.5281/zenodo.15077314; Sánchez-Tójar et al., 2025).

Appendix

Figure A1 - PRISMA diagram detailing the screening procedure and the final number of articles included.


References

[1] Allen, A.; Teuben, P. J.; Ryan, P. W. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics, The Astrophysical Journal Supplement Series, Volume 236 (2018) no. 1, p. 10 | DOI

[2] Anderson, M. S.; Martinson, B. C.; De Vries, R. Normative Dissonance in Science: Results from a National Survey of U.S. Scientists, Journal of Empirical Research on Human Research Ethics, Volume 2 (2007) no. 4, pp. 3-14 | DOI

[3] Bartomeus, I. Ensuring reproducible science requires policies, Peer Community in Ecology (2025), p. 100778 | DOI

[4] Bellomo, R. K.; Zavalis, E. A.; Ioannidis, J. P. A. Assessment of transparency indicators in space medicine, PLOS ONE, Volume 19 (2024) no. 4, p. e0300701 | DOI

[5] Benureau, F. C. Y.; Rougier, N. P. Re-run, Repeat, Reproduce, Reuse, Replicate: Transforming Code into Scientific Contributions, Frontiers in Neuroinformatics, Volume 11 (2018), p. 69 | DOI

[6] Boettiger, C. An introduction to Docker for reproducible research, ACM SIGOPS Operating Systems Review, Volume 49 (2015) no. 1, pp. 71-79 | DOI

[7] Borregaard, M. K.; Hart, E. M. Towards a more reproducible ecology, Ecography, Volume 39 (2016) no. 4, pp. 349-353 | DOI

[8] Borycz, J.; Olendorf, R.; Specht, A.; Grant, B.; Crowston, K.; Tenopir, C.; Allard, S.; Rice, N. M.; Hu, R.; Sandusky, R. J. Perceived benefits of open data are improving but scientists still lack resources, skills, and rewards, Humanities and Social Sciences Communications, Volume 10 (2023) no. 1, p. 339 | DOI

[9] Buxton, R. T.; Nyboer, E. A.; Pigeon, K. E.; Raby, G. D.; Rytwinski, T.; Gallagher, A. J.; Schuster, R.; Lin, H.-Y.; Fahrig, L.; Bennett, J. R.; Cooke, S. J.; Roche, D. G. Avoiding wasted research resources in conservation science, Conservation Science and Practice, Volume 3 (2021) no. 2, p. e329 | DOI

[10] Cadwallader, L.; Hrynaszkiewicz, I. A survey of researchers’ code sharing and code reuse practices, and assessment of interactive notebook prototypes, PeerJ, Volume 10 (2022), p. e13933 | DOI

[11] Cadwallader, L.; Mac Gabhann, F.; Papin, J.; Pitzer, V. E. Advancing code sharing in the computational biology community, PLOS Computational Biology, Volume 18 (2022) no. 6, p. e1010193 | DOI

[12] Campbell, T.; Dixon, K. W.; Handcock, R. N. Restoration and replication: a case study on the value of computational reproducibility assessment, Restoration Ecology, Volume 31 (2023) no. 8, p. e13968 | DOI

[13] Cao, H.; Dodge, J.; Lo, K.; McFarland, D. A.; Wang, L. L. The Rise of Open Science: Tracking the Evolution and Perceived Value of Data and Methods Link-Sharing Practices, 2023 | DOI

[14] Chalmers, I.; Glasziou, P. Avoidable waste in the production and reporting of research evidence, The Lancet, Volume 374 (2009) no. 9683, pp. 86-89 | DOI

[15] Christian, T.-M.; Gooch, A.; Vision, T.; Hull, E. Journal data policies: Exploring how the understanding of editors and authors corresponds to the policies themselves, PLOS ONE, Volume 15 (2020) no. 3, p. e0230281 | DOI

[16] Chue Hong, N.; Allen, A.; Gonzalez-Beltran, A.; de Waard, A.; Smith, A. M.; Robinson, C.; Jones, C.; Bouquin, D.; Katz, D.; Kennedy, D.; Ryder, G.; Hausman, J.; Hwang, L.; Jones, M.; Harrison, M.; Crosas, M.; Wu, M.; Löwe, P.; Haines, R.; Edmunds, S.; Stall, S.; Swaminathan, S.; Druskat, S.; Crick, T.; Morrell, T.; Pollard, T. Software Citation Checklist for Authors, Zenodo (2019) | DOI

[17] Colavizza, G.; Cadwallader, L.; LaFlamme, M.; Dozot, G.; Lecorney, S.; Rappo, D.; Hrynaszkiewicz, I. An analysis of the effects of sharing research data, code, and preprints on citations, PLOS ONE, Volume 19 (2024) no. 10, p. e0311493 | DOI

[18] Cole, N. L.; Kormann, E.; Klebel, T.; Apartis, S.; Ross-Hellauer, T. The societal impact of Open Science: a scoping review, Royal Society Open Science, Volume 11 (2024) no. 6, p. 240286 | DOI

[19] Culina, A.; Van Den Berg, I.; Evans, S.; Sánchez-Tójar, A. Low availability of code in ecology: A call for urgent action, PLOS Biology, Volume 18 (2020) no. 7, p. e3000763 | DOI

[20] Donoho, D. L.; Maleki, A.; Rahman, I. U.; Shahram, M.; Stodden, V. Reproducible Research in Computational Harmonic Analysis, Computing in Science & Engineering, Volume 11 (2009) no. 1, pp. 8-18 | DOI

[21] Ellis, D. A.; Towse, J.; Brown, O.; Cork, A.; Davidson, B. I.; Devereux, S.; Hinds, J.; Ivory, M.; Nightingale, S.; Parry, D. A.; Piwek, L.; Shaw, H.; Towse, A. S. Assessing computational reproducibility in Behavior Research Methods, Behavior Research Methods, Volume 56 (2024) no. 8, pp. 8745-8760 | DOI

[22] Essawy, B. T.; Goodall, J. L.; Voce, D.; Morsy, M. M.; Sadler, J. M.; Choi, Y. D.; Tarboton, D. G.; Malik, T. A taxonomy for reproducible and replicable research in environmental modelling, Environmental Modelling & Software, Volume 134 (2020), p. 104753 | DOI

[23] European Organization For Nuclear Research Zenodo, 2013 | DOI

[24] Evans, S. R. Gauging the Purported Costs of Public Data Archiving for Long-Term Population Studies, PLOS Biology, Volume 14 (2016) no. 4, p. e1002432 | DOI

[25] Feng, X.; Qiao, H.; Enquist, B. J. Doubling demands in programming skills call for ecoinformatics education, Frontiers in Ecology and the Environment, Volume 18 (2020) no. 3, pp. 123-124 | DOI

[26] Ferguson, J.; Littman, R.; Christensen, G.; Paluck, E. L.; Swanson, N.; Wang, Z.; Miguel, E.; Birke, D.; Pezzuto, J.-H. Survey of open science practices and attitudes in the social sciences, Nature Communications, Volume 14 (2023) no. 1, p. 5401 | DOI

[27] Fišar, M.; Greiner, B.; Huber, C.; Katok, E.; Ozkes, A. I.; the Management Science Reproducibility Collaboration Reproducibility in Management Science, Management Science, Volume 70 (2024) no. 3, pp. 1343-1356 | DOI

[28] Gewin, V. Data sharing: An open mind on open data, Nature, Volume 529 (2016) no. 7584, pp. 117-119 | DOI

[29] Gomes, D. G. E.; Pottier, P.; Crystal-Ornelas, R.; Hudgins, E. J.; Foroughirad, V.; Sánchez-Reyes, L. L.; Turba, R.; Martinez, P. A.; Moreau, D.; Bertram, M. G.; Smout, C. A.; Gaynor, K. M. Why don't we share data and code? Perceived barriers and benefits to public archiving practices, Proceedings of the Royal Society B: Biological Sciences, Volume 289 (2022) no. 1987, p. 20221113 | DOI

[30] Grainger, M. J.; Bolam, F. C.; Stewart, G. B.; Nilsen, E. B. Evidence synthesis for tackling research waste, Nature Ecology & Evolution, Volume 4 (2020) no. 4, pp. 495-497 | DOI

[31] Grüning, B.; Chilton, J.; Köster, J.; Dale, R.; Soranzo, N.; Van Den Beek, M.; Goecks, J.; Backofen, R.; Nekrutenko, A.; Taylor, J. Practical Computational Reproducibility in the Life Sciences, Cell Systems, Volume 6 (2018) no. 6, pp. 631-635 | DOI

[32] Hamilton, D. G.; Hong, K.; Fraser, H.; Rowhani-Farid, A.; Fidler, F.; Page, M. J. Prevalence and predictors of data and code sharing in the medical and health sciences: systematic review with meta-analysis of individual participant data, BMJ (2023) | DOI

[33] Henderson, A. S.; Hickson, R. I.; Furlong, M.; McBryde, E. S.; Meehan, M. T. Reproducibility of COVID-era infectious disease models, Epidemics, Volume 46 (2024), p. 100743 | DOI

[34] Heumüller, R.; Nielebock, S.; Krüger, J.; Ortmeier, F. Publish or perish, but do not forget your software artifacts, Empirical Software Engineering, Volume 25 (2020) no. 6, pp. 4585-4616 | DOI

[35] Hinsen, K. Dealing With Software Collapse, Computing in Science & Engineering, Volume 21 (2019) no. 3, pp. 104-108 | DOI

[36] Ivimey-Cook, E. R.; Pick, J. L.; Bairos-Novak, K. R.; Culina, A.; Gould, E.; Grainger, M.; Marshall, B. M.; Moreau, D.; Paquet, M.; Royauté, R.; Sánchez-Tójar, A.; Silva, I.; Windecker, S. M. Implementing code review in the scientific workflow: Insights from ecology and evolutionary biology, Journal of Evolutionary Biology, Volume 36 (2023) no. 10, pp. 1347-1356 | DOI

[37] Ivimey-Cook, E. R.; Sánchez-Tójar, A.; Berberi, I.; Culina, A.; Roche, D. G.; Almeida, R. A.; Amin, B.; Bairos-Novak, K. R.; Balti, H.; Bertram, M.; Bliard, L.; Byrne, I.; Chan, Y.-C.; Cioffi, W. R.; Corbel, Q.; Elsy, A. D.; Florko, K. R. N.; Gould, E.; Grainger, M.; Harshbarger, A. E.; Hovstad, K. A.; Martin, J. M.; Martinig, A. R.; Masoero, G.; Moodie, I. R.; Moreau, D.; O'Dea, R. E.; Paquet, M.; Pick, J. L.; Rizvi, T.; Silva, I.; Szabo, B.; Takola, E.; Thoré, E.; Verberk, W. C. E. P.; Windecker, S. M.; Winter, G.; Zajkova, Z.; Zeiss, R.; Moran, N. P. From Policy to Practice: Progress towards Data- and Code-Sharing in Ecology and Evolution, 2025 | DOI

[38] Jay, C.; Haines, R.; Katz, D. S. Software Must be Recognised as an Important Output of Scholarly Research, International Journal of Digital Curation, Volume 16 (2021) no. 1, p. 6 | DOI

[39] Jenkins, G. B.; Beckerman, A. P.; Bellard, C.; Benítez‐López, A.; Ellison, A. M.; Foote, C. G.; Hufton, A. L.; Lashley, M. A.; Lortie, C. J.; Ma, Z.; Moore, A. J.; Narum, S. R.; Nilsson, J.; O'Boyle, B.; Provete, D. B.; Razgour, O.; Rieseberg, L.; Riginos, C.; Santini, L.; Sibbett, B.; Peres‐Neto, P. R. Reproducibility in ecology and evolution: Minimum standards for data and code, Ecology and Evolution, Volume 13 (2023) no. 5, p. e9961 | DOI

[40] Kambouris, S.; Wilkinson, D. P.; Smith, E. T.; Fidler, F. Computationally reproducing results from meta-analyses in ecology and evolutionary biology using shared code and data, PLOS ONE, Volume 19 (2024) no. 3, p. e0300333 | DOI

[41] Kellner, K. F.; Doser, J. W.; Belant, J. L. Functional R code is rare in species distribution and abundance papers, Ecology (2024), p. e4475 | DOI

[42] Kim, A. Y.; Herrmann, V.; Barreto, R.; Calkins, B.; Gonzalez‐Akre, E.; Johnson, D. J.; Jordan, J. A.; Magee, L.; McGregor, I. R.; Montero, N.; Novak, K.; Rogers, T.; Shue, J.; Anderson‐Teixeira, K. J. Implementing GitHub Actions continuous integration to reduce error rates in ecological data collection, Methods in Ecology and Evolution, Volume 13 (2022) no. 11, pp. 2572-2585 | DOI

[43] Kohrs, F. E.; Auer, S.; Bannach-Brown, A.; Fiedler, S.; Haven, T. L.; Heise, V.; Holman, C.; Azevedo, F.; Bernard, R.; Bleier, A.; Bössel, N.; Cahill, B. P.; Castro, L. J.; Ehrenhofer, A.; Eichel, K.; Frank, M.; Frick, C.; Friese, M.; Gärtner, A.; Gierend, K.; Grüning, D. J.; Hahn, L.; Hülsemann, M.; Ihle, M.; Illius, S.; König, L.; König, M.; Kulke, L.; Kutlin, A.; Lammers, F.; Mehler, D. M.; Miehl, C.; Müller-Alcazar, A.; Neuendorf, C.; Niemeyer, H.; Pargent, F.; Peikert, A.; Pfeuffer, C. U.; Reinecke, R.; Röer, J. P.; Rohmann, J. L.; Sánchez-Tójar, A.; Scherbaum, S.; Sixtus, E.; Spitzer, L.; Straßburger, V. M.; Weber, M.; Whitmire, C. J.; Zerna, J.; Zorbek, D.; Zumstein, P.; Weissgerber, T. L. Eleven strategies for making reproducible research and open science training the norm at research institutions, eLife, Volume 12 (2023), p. e89736 | DOI

[44] Koivisto, E.; Mäntylä, E. Are Open Science instructions targeted to ecologists and evolutionary biologists sufficient? A literature review of guidelines and journal data policies, Ecology and Evolution, Volume 14 (2024) no. 7, p. e11698 | DOI

[45] Konkol, M.; Kray, C.; Pfeiffer, M. Computational reproducibility in geoscientific papers: Insights from a series of studies with geoscientists and a reproduction study, International Journal of Geographical Information Science, Volume 33 (2019) no. 2, pp. 408-429 | DOI

[46] Krafczyk, M. S.; Shi, A.; Bhaskar, A.; Marinov, D.; Stodden, V. Learning from reproducing computational results: introducing three principles and the Reproduction Package, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, Volume 379 (2021) no. 2197 | DOI

[47] Lai, J.; Lortie, C. J.; Muenchen, R. A.; Yang, J.; Ma, K. Evaluating the popularity of R in ecology, Ecosphere, Volume 10 (2019) no. 1, p. e02567 | DOI

[48] Lasser, J. Creating an executable paper is a journey through Open Science, Communications Physics, Volume 3 (2020) no. 1, p. 143 | DOI

[49] Laurinavichyute, A.; Yadav, H.; Vasishth, S. Share the code, not just the data: A case study of the reproducibility of articles published in the Journal of Memory and Language under the open data policy, Journal of Memory and Language, Volume 125 (2022), p. 104332 | DOI

[50] Lewis, K. P.; Vander Wal, E.; Fifield, D. A. Wildlife biology, big data, and reproducible research, Wildlife Society Bulletin, Volume 42 (2018) no. 1, pp. 172-179 | DOI

[51] Maitner, B.; Santos Andrade, P. E.; Lei, L.; Kass, J.; Owens, H. L.; Barbosa, G. C. G.; Boyle, B.; Castorena, M.; Enquist, B. J.; Feng, X.; Park, D. S.; Paz, A.; Pinilla‐Buitrago, G.; Merow, C.; Wilson, A. Code sharing in ecology and evolution increases citation rates but remains uncommon, Ecology and Evolution, Volume 14 (2024) no. 8, p. e70030 | DOI

[52] McKiernan, E. C. Imagining the “open” university: Sharing scholarship to improve research and education, PLOS Biology, Volume 15 (2017) no. 10, p. e1002614 | DOI

[53] Mislan, K. A. S.; Heer, J. M.; White, E. P. Elevating The Status of Code in Ecology, Trends in Ecology & Evolution, Volume 31 (2016) no. 1, pp. 4-7 | DOI

[54] Nguyen, P.; McKenzie, J. E.; Hamilton, D. G.; Moher, D.; Tugwell, P.; Fidler, F. M.; Haddaway, N. R.; Higgins, J. P. T.; Kanukula, R.; Karunananthan, S.; Maxwell, L. J.; McDonald, S.; Nakagawa, S.; Nunan, D.; Welch, V. A.; Page, M. J. Systematic reviewers' perspectives on sharing review data, analytic code, and other materials: A survey, Cochrane Evidence Synthesis and Methods, Volume 1 (2023) no. 2, p. e12008 | DOI

[55] O'Dea, R. E.; Lagisz, M.; Jennions, M. D.; Koricheva, J.; Noble, D. W.; Parker, T. H.; Gurevitch, J.; Page, M. J.; Stewart, G.; Moher, D.; Nakagawa, S. Preferred reporting items for systematic reviews and meta‐analyses in ecology and evolutionary biology: a PRISMA extension, Biological Reviews, Volume 96 (2021) no. 5, pp. 1695-1722 | DOI

[56] Obels, P.; Lakens, D.; Coles, N. A.; Gottfried, J.; Green, S. A. Analysis of Open Data and Computational Reproducibility in Registered Reports in Psychology, Advances in Methods and Practices in Psychological Science, Volume 3 (2020) no. 2, pp. 229-237 | DOI

[57] Ostermann, F. O.; Granell, C. Advancing Science with VGI: Reproducibility and Replicability of Recent Studies using VGI, Transactions in GIS, Volume 21 (2017) no. 2, pp. 224-237 | DOI

[58] Ouzzani, M.; Hammady, H.; Fedorowicz, Z.; Elmagarmid, A. Rayyan—a web and mobile app for systematic reviews, Systematic Reviews, Volume 5 (2016) no. 1, p. 210 | DOI

[59] Page, M. J.; Nguyen, P.-Y.; Hamilton, D. G.; Haddaway, N. R.; Kanukula, R.; Moher, D.; McKenzie, J. E. Data and code availability statements in systematic reviews of interventions were often missing or inaccurate: a content analysis, Journal of Clinical Epidemiology, Volume 147 (2022), pp. 1-10 | DOI

[60] Peng, R. D. Reproducible Research in Computational Science, Science, Volume 334 (2011) no. 6060, pp. 1226-1227 | DOI

[61] Purgar, M.; Glasziou, P.; Klanjscek, T.; Nakagawa, S.; Culina, A. Supporting study registration to reduce research waste, Nature Ecology & Evolution, Volume 8 (2024) no. 8, pp. 1391-1399 | DOI

[62] Purgar, M.; Klanjscek, T.; Culina, A. Quantifying research waste in ecology, Nature Ecology & Evolution, Volume 6 (2022) no. 9, pp. 1390-1397 | DOI

[63] R Core Team R: A language and environment for statistical computing, https://www.R-project.org/, 2023

[64] Roche, D. G.; Berberi, I.; Dhane, F.; Lauzon, F.; Soeharjono, S.; Dakin, R.; Binning, S. A. Slow improvement to the archiving quality of open datasets shared by researchers in ecology and evolution, Proceedings of the Royal Society B: Biological Sciences, Volume 289 (2022) no. 1975, p. 20212780 | DOI

[65] Roche, D. G.; Kruuk, L. E. B.; Lanfear, R.; Binning, S. A. Public Data Archiving in Ecology and Evolution: How Well Are We Doing?, PLOS Biology, Volume 13 (2015) no. 11, p. e1002295 | DOI

[66] Roche, D. G.; Raby, G. D.; Norin, T.; Ern, R.; Scheuffele, H.; Skeeles, M.; Morgan, R.; Andreassen, A. H.; Clements, J. C.; Louissaint, S.; Jutfelt, F.; Clark, T. D.; Binning, S. A. Paths towards greater consensus building in experimental biology, Journal of Experimental Biology, Volume 225 (2022) no. Suppl_1, p. jeb243559 | DOI

[67] Rodriguez-Sanchez, F.; Jackson, C. P. grateful: Facilitate citation of R packages, https://pakillo.github.io/grateful/, 2024 | DOI

[68] Samuel, S.; Mietchen, D. Computational reproducibility of Jupyter notebooks from biomedical publications, GigaScience, Volume 13 (2024) | DOI

[69] Serghiou, S.; Contopoulos-Ioannidis, D. G.; Boyack, K. W.; Riedel, N.; Wallach, J. D.; Ioannidis, J. P. A. Assessment of transparency indicators across the biomedical literature: How open is open?, PLOS Biology, Volume 19 (2021) no. 3, p. e3001107 | DOI

[70] Sharma, N. K.; Ayyala, R.; Deshpande, D.; Patel, Y.; Munteanu, V.; Ciorba, D.; Bostan, V.; Fiscutean, A.; Vahed, M.; Sarkar, A.; Guo, R.; Moore, A.; Darci-Maher, N.; Nogoy, N.; Abedalthagafi, M.; Mangul, S. Analytical code sharing practices in biomedical research, PeerJ Computer Science, Volume 10 (2024), p. e2066 | DOI

[71] Soeharjono, S.; Roche, D. G. Reported Individual Costs and Benefits of Sharing Open Data among Canadian Academic Faculty in Ecology and Evolution, BioScience, Volume 71 (2021) no. 7, pp. 750-756 | DOI

[72] Sperandii, M. G.; Bazzichetto, M.; Mendieta‐Leiva, G.; Schmidtlein, S.; Bott, M.; de Lima, R. A. F.; Pillar, V. D.; Price, J. N.; Wagner, V.; Chytrý, M. Towards more reproducibility in vegetation research, Journal of Vegetation Science, Volume 35 (2024) no. 1, p. e13224 | DOI

[73] Sánchez-Tójar, A.; Bezine, A.; Purgar, M.; Culina, A. Data and Code for "Code-sharing policies are associated with increased reproducibility potential of ecological findings", Zenodo (2025) | DOI

[74] Tan, A. C.; Webster, A. C.; Libesman, S.; Yang, Z.; Chand, R. R.; Liu, W.; Palacios, T.; Hunter, K. E.; Seidler, A. L. Data sharing policies across health research globally: Cross‐sectional meta‐research study, Research Synthesis Methods, Volume 15 (2024) no. 6, pp. 1060-1071 | DOI

[75] Touchon, J. C.; McCoy, M. W. The mismatch between current statistical practice and doctoral training in ecology, Ecosphere, Volume 7 (2016) no. 8 | DOI

[76] Trisovic, A.; Lau, M. K.; Pasquier, T.; Crosas, M. A large-scale study on research code quality and execution, Scientific Data, Volume 9 (2022) no. 1, p. 60 | DOI

[77] Venkatesh, K.; Santomartino, S. M.; Sulam, J.; Yi, P. H. Code and Data Sharing Practices in the Radiology Artificial Intelligence Literature: A Meta-Research Study, Radiology: Artificial Intelligence, Volume 4 (2022) no. 5 | DOI

[78] Wickham, H. ggplot2: Elegant Graphics for Data Analysis, Springer Cham, 2016 no. 2197-5736 | DOI