Section: Organization Studies
Topic: Sociology
A qualitative and multicriteria assessment of scientists: a perspective based on a case study of INRAE, France
Corresponding author(s): Gaymard, Frédéric (frederic.gaymard@inrae.fr)
10.24072/pcjournal.432 - Peer Community Journal, Volume 4 (2024), article no. e77.
Peer reviewed and recommended by PCI Organization Studies.
Abstract: Psychosociology theories indicate that individual evaluation is integral to the recognition of professional activities. According to Christophe Dejours' contributions, this recognition rests on two complementary judgments: the "utility" judgment, made by the hierarchy, and the "beauty" judgment, made by peers. The aim of this paper is to elucidate how the individual assessment of scientists is conducted at INRAE. This process follows a qualitative, multicriteria approach carried out by peers, who provide both appreciation and advice to the evaluated scientists (the "beauty" judgment). Furthermore, we describe how INRAE regularly adapts this process to the evolving landscape of research practices, such as interdisciplinary collaboration and open science, ensuring that assessments remain aligned with current research approaches.
Type: Research article
Tagu, Denis 1; Boudet-Bône, Françoise 1; Brard, Camille 2; Legouy, Edith 3; Gaymard, Frédéric 1
Tagu, Denis; Boudet-Bône, Françoise; Brard, Camille; Legouy, Edith; Gaymard, Frédéric. A qualitative and multicriteria assessment of scientists: a perspective based on a case study of INRAE, France. Peer Community Journal, Volume 4 (2024), article no. e77. doi : 10.24072/pcjournal.432. https://peercommunityjournal.org/articles/10.24072/pcjournal.432/
PCI peer reviews and recommendation, and links to data, scripts, code and supplementary information: 10.24072/pci.orgstudies.100004
Conflict of interest of the recommender and peer reviewers:
The recommender in charge of the evaluation of the article and the reviewers declared that they have no conflict of interest (as defined in the code of conduct of PCI) with the authors or with the content of the article.
[1] La Psychodynamique Du Travail : Objet, Considérations Épistémologiques, Concepts et Prémisses Théoriques, Santé mentale au Québec, Volume 29 (2004) no. 1, pp. 243-260 | DOI
[2] About Subjectivity in Qualitative Data Interpretation, International Conference Knowledge-Based Organization, Volume 22 (2016) no. 2, pp. 419-424 | DOI
[3] The Return of Work in Critical Theory: Self, Society, Politics, Columbia University Press, 2018 | DOI
[4] L'évaluation du travail à l'épreuve du réel: critique des fondements de l'évaluation une conférence-débat, Sciences en questions, Institut national de la recherche agronomique, Paris, 2003
[5] The Centrality of Work, Critical Horizons, Volume 11 (2010) no. 2, pp. 167-180 | DOI
[6] CSS Guide Book INRAE 2020-2024, HAL, 2023 (https://hal.inrae.fr/hal-04097315)
[7] To Rank, Not to Rank, or to Rank Responsibly? (https://sfdora.org/2023/06/07/to-rank-not-to-rank-or-to-rank-responsibly/)
[8] National Research Institute for Agriculture, Food and the Environment (INRAE), France, 2023 (https://sfdora.org/case-study/national-research-institute-for-agriculture-food-and-the-environment-inrae/)
[9] Citation Counts and Journal Impact Factors Do Not Capture Some Indicators of Research Quality in the Behavioural and Brain Sciences, Royal Society Open Science, Volume 9 (2022) no. 8, p. 220334 | DOI
[10] Evaluation des collectifs de recherche : un cadre qui intègre l'ensemble de leurs activités, 2011 (https://hal.inrae.fr/hal-02824345)
[11] Triangulation of Subjectivity, Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, Volume 4 (2003) no. 2 | DOI
[12] A New Take on the Categorical Imperative: Gatekeeping, Boundary Maintenance, and Evaluation Penalties in Science, Organization Science, Volume 34 (2023) no. 3, pp. 1090-1110 | DOI
[13] Bibliometrics and Research Evaluation: Uses and Abuses, History and Foundations of Information Science, The MIT Press, Cambridge, Massachusetts, 2016 | DOI
[14] L'effet SIGAPS: la recherche médicale française sous l'emprise de l'évaluation comptable, Zilsel: Science, technique, société, Volume 8 (2021), pp. 145-174 | DOI
[15] Bibliometrics: The Leiden Manifesto for Research Metrics, Nature, Volume 520 (2015) no. 7548, pp. 429-431 | DOI
[16] Towards a new generation of research impact assessment approaches, The Journal of Technology Transfer, Volume 47 (2017) no. 3, pp. 621-631 | DOI
[17] Research Impact Assessment: from ex post to real-time assessment, fteval Journal for Research and Technology Policy Evaluation, Volume 47, 2019, pp. 35-40 | DOI
[18] Towards a Sociology of Meaningful Work, Work, Employment and Society, Volume 36 (2021) no. 5, pp. 798-815 | DOI
[19] The Research Excellence Framework and the 'Impact Agenda': Are We Creating a Frankenstein Monster?, Research Evaluation, Volume 20 (2011) no. 3, pp. 247-254 | DOI
[20] The Hong Kong Principles for Assessing Researchers: Fostering Research Integrity, PLOS Biology, Volume 18 (2020) no. 7, p. e3000737 | DOI
[21] “Actual” and Perceptual Effects of Category Spanning, Organization Science, Volume 24 (2013) no. 3, pp. 684-696 | DOI
[22] The Craftsman, Yale University Press, New Haven, 2008
[23] Ethics in Qualitative Research and Evaluation, Journal of Social Work, Volume 3 (2003) no. 1, pp. 9-29 | DOI
[24] Centrality of Researchers in Reforming Research Assessment: Routes to Improve Research by Aligning Rewards with Open Science Practices (2022) (https://initiative-se.eu/paper-research-assessment/)
[25] Understanding the Concept of Subjectivity in Performance Evaluation and Its Effects on Perceived Procedural Justice across Contexts, Accounting & Finance, Volume 62 (2022) no. 3, pp. 4079-4108 | DOI
[26] Academic Work as Craft: Towards a Qualitative and Multicriteria Assessment, Peer Community in Organization Studies (2024), p. 100004 | DOI
[27] Christophe Dejours (https://fr.wikipedia.org/wiki/Christophe_Dejours)
[28] Indicator Frameworks for Fostering Open Knowledge Practices in Science and Scholarship, Report, European Commission Directorate-General for Research & Innovation, Directorate G - Research | DOI