Relationship between Sinaes evaluators’ competence and experience in evaluation
DOI: https://doi.org/10.18222/eae.v34.9951
Keywords: Evaluation, Competencies, Experience, Sinaes
Abstract
This study analyzes the relationship between the evaluation competencies and the evaluation experience of evaluators in the Sistema Nacional de Avaliação da Educação Superior (Sinaes) [National System for the Evaluation of Higher Education]. A Delphi study was first carried out to identify the evaluation competencies considered most appropriate to the Sinaes context. A survey was then conducted to capture the evaluators’ experience in evaluation and their self-assessment of how developed their competencies are. The empirical data showed that, in all theoretical dimensions of competencies, the greater the number of evaluations carried out, the higher the self-reported level of competency development. It is recommended that more experienced evaluators work in partnership with less experienced ones in order to foster the development of evaluation competencies.
References
American Evaluation Association (AEA). (2018). The 2018 AEA evaluator competencies. AEA. https://www.eval.org/Portals/0/Docs/AEA%20Evaluator%20Competencies.pdf
Aotearoa New Zealand Evaluation Association (Anzea). (2011). Evaluator competencies. Anzea. https://anzea.org.nz/assets/Key-ANZEA-Files/110801_anzea_evaluator_competencies_final.pdf
Australasian Evaluation Society (AES). (2013). Evaluator’s Professional Learning Competency Framework. AES. https://www.aes.asn.au/images/AES_Evaluators_Competency_Framework.pdf
Bandalos, D. L. (2018). Measurement theory and applications for the social sciences. Guilford.
Brown, T. A. (2006). Confirmatory factor analysis for applied research. Guilford.
Canadian Evaluation Society (CES). (2018). Competencies for Canadian evaluation practice. CES. https://evaluationcanada.ca/files/pdf/2_competencies_cdn_evaluation_practice_2018.pdf
Castro, L. C. da S. (2021). Competências e experiência em avaliação: Uma análise dos avaliadores externos do Sinaes [Dissertação de mestrado profissional, Universidade Federal da Bahia]. Repositório Institucional da UFBA. https://repositorio.ufba.br/handle/ri/35525
Dalkey, N., & Helmer, O. (1962). An experimental application of the Delphi method to the use of experts. The Rand Corporation. https://www.rand.org/content/dam/rand/pubs/research_memoranda/2009/RM727.1.pdf
Daniel, W. W. (2000). Applied nonparametric statistics (2nd ed.). Cengage Learning.
Deng, L., & Chan, W. (2017). Testing the difference between reliability coefficients Alpha and Omega. Educational and Psychological Measurement, 77(2), 185-203. https://doi.org/10.1177/0013164416658325
Diaz, J., Chaudhary, A. K., Jayaratne, K. S. U., & Assan, E. (2020). Expanding evaluator competency research: Exploring competencies for program evaluation using the context of non-formal education. Evaluation and Program Planning, 79, Article e101790. https://doi.org/10.1016/j.evalprogplan.2020.101790
Diretoria de Avaliação da Educação Superior (Daes). (2021). Pedido de informação ao Inep registrado sob o protocolo n. 23546-060999/2021. http://www.consultaesic.cgu.gov.br/busca/SitePages/Principal.aspx
Distefano, C. (2002). The impact of categorization with confirmatory factor analysis. Structural Equation Modeling, 9(3), 327-346. https://doi.org/10.1207/S15328007SEM0903_2
European Evaluation Society (EES). (2011). EES Evaluation Capabilities Framework. EES.
Flora, D. B., & Curran, P. J. (2004). An empirical evaluation of alternative methods of estimation for confirmatory factor analysis with ordinal data. Psychological Methods, 9(4), 466-491. https://doi.org/10.1037/1082-989X.9.4.466
Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2009). Análise multivariada de dados (6a ed.). Bookman.
Haladyna, T. M., & Rodriguez, M. C. (2013). Developing and validating test items. Routledge. https://doi.org/10.4324/9780203850381
Hsu, C.-C., & Sandford, B. A. (2007). The Delphi technique: Making sense of consensus. Practical Assessment, Research and Evaluation, 12, Article 10. https://doi.org/10.7275/pdz9-th90
Kaesbauer, S. A. M. (2012). Teaching evaluator competencies: An examination of doctoral programs [PhD dissertation, University of Tennessee]. Tennessee Research and Creative Exchange. https://trace.tennessee.edu/utk_graddiss/1314/
King, J. A., & Stevahn, L. (2015). Competencies for program evaluators in light of adaptive action: What? So what? Now what? New Directions for Evaluation, (145), 21-37. https://doi.org/10.1002/ev.20109
King, J. A., Stevahn, L., Ghere, G., & Minnema, J. (2001). Toward a taxonomy of essential evaluator competencies. American Journal of Evaluation, 22(2), 229-247. https://doi.org/10.1177/109821400102200206
Kline, R. B. (2016). Principles and practice of structural equation modeling. Guilford.
Portaria Normativa n. 840, de 24 de agosto de 2018. (2018). Dispõe sobre os procedimentos de competência do Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira – Inep referentes à avaliação de instituições de educação superior, de cursos de graduação e de desempenho acadêmico de estudantes. https://download.inep.gov.br/educacao_superior/avaliacao_institucional/legislacao_normas/2018/portaria_normativa_GM-MEC_n840_de_24082018.pdf
Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2). https://doi.org/10.18637/jss.v048.i02
Stevahn, L., King, J., Ghere, G., & Minnema, J. (2005a). Evaluator competencies in university-based evaluation training programs. The Canadian Journal of Program Evaluation, 20(2), 101-123. https://doi.org/10.3138/cjpe.20.006
Stevahn, L., King, J., Ghere, G., & Minnema, J. (2005b). Establishing essential competencies for program evaluators. American Journal of Evaluation, 26(1), 43-59. https://doi.org/10.1177/1098214004273180
Trevisan, M. S. (2002). Enhancing practical evaluation training through long-term evaluation projects. American Journal of Evaluation, 23(1), 81-92. https://doi.org/10.1016/S1098-2140(01)00163-1
Trevisan, M. S. (2004). Practical training in evaluation: A review of the literature. American Journal of Evaluation, 25(2), 255-272. https://doi.org/10.1016/j.ameval.2004.03.002
UK Evaluation Society (UKES). (2012). UK Evaluation Society Framework of Evaluation Capabilities. UKES. https://www.evaluation.org.uk/app/uploads/2019/04/UK-Evaluation-Society-Framework-of-Evaluation-Capabilities.pdf
Wehipeihana, N., Bailey, R., Davidson, E. J., & McKegg, K. (2014). Evaluator competencies: The Aotearoa New Zealand experience. The Canadian Journal of Program Evaluation, 28(3), 49-69. https://doi.org/10.3138/cjpe.0028.007
License
Copyright (c) 2023 Luis Carlos da Silva Castro, Roberto Brazileiro Paixão

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
a. Authors retain the copyright and grant the journal the right to first publication.
b. All works are licensed under a Creative Commons Attribution license, which allows sharing of the paper with acknowledgment of authorship.
Until 2024, Estudos em Avaliação Educacional adopted the Creative Commons Attribution-NonCommercial (CC BY-NC) license for its publications. For texts published from 2025 onwards, the journal will adopt the Creative Commons Attribution (CC BY) license, in line with the principles of Open Science.