
Cross-disciplinary survey of access and assessment criteria of "best paper" awards

CC BY 4.0.

Making "best paper" awards for young researchers more equitable and open.

Background

Research awards can propagate existing biases in academia by rewarding novel results rather than robust and transparent research. They can also contribute to the “Matthew Effect”, whereby already privileged groups accrue further recognition. Revealing which awards do and do not provide equitable access and evaluation can therefore drive systemic change in how publications, especially those by early career researchers, are assessed and recognised.

Aims

This project aims to survey current practices in awarding “best paper” awards by research journals. Such awards are usually given to early career researchers. In this project, we will evaluate assessment criteria and historical gender biases, based on the names of past awardees. The evaluation will be performed on a sample of awards from the most respected journals and societies across disciplines. Our findings will be openly available and disseminated in the research community.

We expect that our findings will contribute to a culture change fostering more equitable and open science.

Contributing

We invite early- and mid-career researchers across disciplines to contribute to the project.

Overall:

  • We welcome researchers from all backgrounds and walks of life to contribute to any Stage (see Roadmap below) of this project.
  • It will be possible to start contributing to this project during an in-person hackathon event at the AIMOS2022 conference, 28-30 November 2022, Melbourne, Australia.
  • There will also be a follow-up virtual hackathon (or two) after that to enable broad global participation.
  • After this, we will work asynchronously online until we complete all stages of the project.
  • If you would like to comment on this project or provide suggestions to improve this project, feel free to open an issue.
  • For more details see our Contribution Guide.

Roadmap

The project has 4 main Stages:

Stage 1: Planning

  • Preparing protocol.
  • Piloting.
  • Registration.
  • Hackathon preparation.

Stage 2: Screening

  • Screening journal lists from 27 Scimago Subject Areas rankings.
  • Creating award shortlist.
  • Checking shortlists.

Stage 3: Data extraction

  • Data extraction for included awards.
  • Data cross-checking.
  • Collecting additional data.
  • Preliminary analyses.

Stage 4: Analyses and writing

  • Final analyses.
  • Draft report.
  • Contributors' feedback.
  • Final report.
  • Sharing data and code in a GitHub repository.
  • Preprint on MetaArXiv.
  • Manuscript.

Code of Conduct

We expect all project contributors to familiarise themselves with and follow our Code of Conduct.

Maintainer(s)

License

This work is licensed under a Creative Commons Attribution 4.0 International License.

CC BY 4.0.
