Section: Organization Studies
Topic: Sociology

A qualitative and multicriteria assessment of scientists: a perspective based on a case study of INRAE, France

10.24072/pcjournal.432 - Peer Community Journal, Volume 4 (2024), article no. e77.


Psychosociological theories indicate that individual evaluation is integral to the recognition of professional activities. Building on Christophe Dejours’ contributions, this recognition rests on two complementary judgments: the “utility” judgment issued by the hierarchy and the “beauty” judgment issued by peers. The aim of this paper is to elucidate how the individual assessment of scientists is conducted at INRAE. The process follows a qualitative, multicriteria approach carried out by peers, who provide both appreciation and advice to the evaluated scientists (the “beauty” judgment). Furthermore, we describe how INRAE regularly adapts this process to the evolving landscape of research practices, such as interdisciplinary collaboration and open science, ensuring that assessments remain aligned with current research activities.

DOI: 10.24072/pcjournal.432
Type: Research article

Tagu, Denis 1; Boudet-Bône, Françoise 1; Brard, Camille 2; Legouy, Edith 3; Gaymard, Frédéric 1

1 INRAE, Direction de l’Evaluation (DEV), Unité 0837, 147 rue de l’Université, 75338 Paris Cedex 07, France
2 INRAE, Direction pour la Science Ouverte (DipSO), Unité 1479, 147 rue de l’Université, 75338 Paris Cedex 07, France
3 INRAE, Collège de Direction (CODIR), Unité 0233, 147 rue de l’Université, 75338 Paris Cedex 07, France
License: CC-BY 4.0
Copyrights: The authors retain unrestricted copyrights and publishing rights
Web-published in collaboration with: UGA Éditions
@article{10_24072_pcjournal_432,
     author = {Tagu, Denis and Boudet-B\^one, Fran\c{c}oise and Brard, Camille and Legouy, Edith and Gaymard, Fr\'ed\'eric},
     title = {A qualitative and multicriteria assessment of scientists: a perspective based on a case study of {INRAE,} {France}},
     journal = {Peer Community Journal},
     eid = {e77},
     publisher = {Peer Community In},
     volume = {4},
     year = {2024},
     doi = {10.24072/pcjournal.432},
     language = {en},
     url = {https://peercommunityjournal.org/articles/10.24072/pcjournal.432/}
}
TY  - JOUR
AU  - Tagu, Denis
AU  - Boudet-Bône, Françoise
AU  - Brard, Camille
AU  - Legouy, Edith
AU  - Gaymard, Frédéric
TI  - A qualitative and multicriteria assessment of scientists: a perspective based on a case study of INRAE, France
JO  - Peer Community Journal
PY  - 2024
VL  - 4
PB  - Peer Community In
UR  - https://peercommunityjournal.org/articles/10.24072/pcjournal.432/
DO  - 10.24072/pcjournal.432
LA  - en
ID  - 10_24072_pcjournal_432
ER  - 
%0 Journal Article
%A Tagu, Denis
%A Boudet-Bône, Françoise
%A Brard, Camille
%A Legouy, Edith
%A Gaymard, Frédéric
%T A qualitative and multicriteria assessment of scientists: a perspective based on a case study of INRAE, France
%J Peer Community Journal
%D 2024
%V 4
%I Peer Community In
%U https://peercommunityjournal.org/articles/10.24072/pcjournal.432/
%R 10.24072/pcjournal.432
%G en
%F 10_24072_pcjournal_432
Tagu, Denis; Boudet-Bône, Françoise; Brard, Camille; Legouy, Edith; Gaymard, Frédéric. A qualitative and multicriteria assessment of scientists: a perspective based on a case study of INRAE, France. Peer Community Journal, Volume 4 (2024), article no. e77. doi: 10.24072/pcjournal.432. https://peercommunityjournal.org/articles/10.24072/pcjournal.432/

PCI peer reviews and recommendation, and links to data, scripts, code and supplementary information: 10.24072/pci.orgstudies.100004

Conflict of interest of the recommender and peer reviewers:
The recommender in charge of the evaluation of the article and the reviewers declared that they have no conflict of interest (as defined in the code of conduct of PCI) with the authors or with the content of the article.

[1] Alderson, M. La Psychodynamique Du Travail : Objet, Considérations Épistémologiques, Concepts et Prémisses Théoriques 1, Santé mentale au Québec, Volume 29 (2004) no. 1, pp. 243-260 | DOI

[2] Bumbuc, Ş. About Subjectivity in Qualitative Data Interpretation, International conference Knowledge-Based Organization, Volume 22 (2016) no. 2, pp. 419-424 | DOI

[3] Dejours, C.; Deranty, J.-P.; Renault, E.; Smith, N. H. The Return of Work in Critical Theory: Self, Society, Politics, Columbia University Press, 2018 | DOI

[4] Dejours, C. L'évaluation du travail à l'épreuve du réel: critique des fondements de l'évaluation une conférence-débat, Sciences en questions, Institut national de la recherche agronomique, Paris, 2003

[5] Dejours, C.; Deranty, J.-P. The Centrality of Work, Critical Horizons, Volume 11 (2010) no. 2, pp. 167-180 | DOI

[6] Direction de l’Evaluation. National Research Institute for Agriculture, Food and the Environment (INRAE) CSS Guide Book 2020-2024, HAL, 2023 (https://hal.inrae.fr/hal-04097315)

[7] Dogan, G. To Rank, Not to Rank, or to Rank Responsibly? (https://sfdora.org/2023/06/07/to-rank-not-to-rank-or-to-rank-responsibly/)

[8] DORA. Case study: National Research Institute for Agriculture, Food and the Environment (INRAE), France, 2023 (https://sfdora.org/case-study/national-research-institute-for-agriculture-food-and-the-environment-inrae/)

[9] Dougherty, M. R.; Horne, Z. Citation Counts and Journal Impact Factors Do Not Capture Some Indicators of Research Quality in the Behavioural and Brain Sciences, Royal Society Open Science, Volume 9 (2022) no. 8, p. 220334 | DOI

[10] EREFIN Evaluation des collectifs de recherche : un cadre qui intègre l'ensemble de leurs activités, 2011 (https://hal.inrae.fr/hal-02824345)

[11] Fichten, W.; Dreier, B. Triangulation of Subjectivity, Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, Volume 4 (2003) no. 2 | DOI

[12] Fini, R.; Jourdan, J.; Perkmann, M.; Toschi, L. A New Take on the Categorical Imperative: Gatekeeping, Boundary Maintenance, and Evaluation Penalties in Science, Organization Science, Volume 34 (2023) no. 3, pp. 1090-1110 | DOI

[13] Gingras, Y. Bibliometrics and Research Evaluation: Uses and Abuses, History and Foundations of Information Science, The MIT Press, Cambridge, Massachusetts, 2016 | DOI

[14] Gingras, Y.; Khelfaoui, M. L'effet SIGAPS: la recherche médicale française sous l'emprise de l'évaluation comptable, Zilsel: Science, technique, société, Volume 8 (2021), pp. 145-174 | DOI

[15] Hicks, D.; Wouters, P.; Waltman, L.; De Rijcke, S.; Rafols, I. Bibliometrics: The Leiden Manifesto for Research Metrics, Nature, Volume 520 (2015) no. 7548, pp. 429-431 | DOI

[16] Joly, P.-B.; Matt, M. Towards a new generation of research impact assessment approaches, The Journal of Technology Transfer, Volume 47 (2017) no. 3, pp. 621-631 | DOI

[17] Joly, P.-B.; Matt, M.; Robinson, D. K. R. Research Impact Assessment: from ex post to real-time assessment, fteval Journal for Research and Technology Policy Evaluation, Volume 47, 2019, pp. 35-40 | DOI

[18] Laaser, K.; Karlsson, J. C. Towards a Sociology of Meaningful Work, Work, Employment and Society, Volume 36 (2021) no. 5, pp. 798-815 | DOI

[19] Martin, B. R. The Research Excellence Framework and the 'Impact Agenda': Are We Creating a Frankenstein Monster?, Research Evaluation, Volume 20 (2011) no. 3, pp. 247-254 | DOI

[20] Moher, D.; Bouter, L.; Kleinert, S.; Glasziou, P.; Sham, M. H.; Barbour, V.; Coriat, A.-M.; Foeger, N.; Dirnagl, U. The Hong Kong Principles for Assessing Researchers: Fostering Research Integrity, PLOS Biology, Volume 18 (2020) no. 7, p. e3000737 | DOI

[21] Negro, G.; Leung, M. D. “Actual” and Perceptual Effects of Category Spanning, Organization Science, Volume 24 (2013) no. 3, pp. 684-696 | DOI

[22] Sennett, R. The Craftsman, Yale University Press, New Haven, 2008

[23] Shaw, I. F. Ethics in Qualitative Research and Evaluation, Journal of Social Work, Volume 3 (2003) no. 1, pp. 9-29 | DOI

[24] Susi, T.; Heintz, M.; Hnatkova, E.; Koch, W.; Leptin, M.; Andler, M.; Masia, M.; Garfinkel, M. Centrality of Researchers in Reforming Research Assessment: Routes to Improve Research by Aligning Rewards with Open Science Practices (2022) (https://initiative-se.eu/paper-research-assessment/)

[25] Tran, T.-V.; Järvinen, J. Understanding the Concept of Subjectivity in Performance Evaluation and Its Effects on Perceived Procedural Justice across Contexts, Accounting & Finance, Volume 62 (2022) no. 3, pp. 4079-4108 | DOI

[26] Vijay, D.; Berkowitz, H. Academic Work as Craft: Towards a Qualitative and Multicriteria Assessment, Peer Community in Organization Studies (2024), p. 100004 | DOI

[27] Wikipedia Christophe Dejours (https://fr.wikipedia.org/wiki/Christophe_Dejours)

[28] Wouters, P.; Ràfols, I.; Oancea, A.; Kamerlin, S. C. L.; Holbrook, J.; Jacob, M. Indicator Frameworks for Fostering Open Knowledge Practices in Science and Scholarship, Report, European Commission Directorate-General for Research & Innovation, Directorate G — Research | DOI
