Evaluation of Educational Programs in the Context of Remote Education and COVID-19

Authors

F. J. C. dos Reis, A. M. Navarro

DOI:

https://doi.org/10.11606/issn.2176-7262.rmrp.2021.184768

Keywords:

Evaluation of educational programs, Distance education, Covid-19

Abstract

In early 2020, the world was surprised by the emergence and rapid spread of COVID-19, and the impact on educational programs was enormous. Since social distancing is one of the few ways to mitigate the advance of the pandemic, emergency remote education (ERE) was imposed on institutions. In this context, it is highly relevant to evaluate these initiatives so that the necessary changes can be promoted quickly and the objectives of the educational programs can, at least, be fulfilled.

In this article, we discuss the fundamentals of evaluating educational programs in the context of ERE and COVID-19. We present basic principles of evaluation applicable to anything from a single class to a course as a whole, and we discuss the application of the following models: experimental, Kirkpatrick, logic model, and CIPP (Context, Input, Process, and Product).

The evaluation of the educational activities that relied on ERE, implemented during the pandemic in 2020, is an opportunity to review practices and induce substantial changes in the training of the next generations of health professionals and academic teachers.

References

Zhou F, Yu T, Du R, Fan G, Liu Y, Liu Z, et al. Clinical course and risk factors for mortality of adult inpatients with COVID-19 in Wuhan, China: a retrospective cohort study. Lancet Lond Engl. 2020 Mar 28;395(10229):1054–62.

Bedford J, Enria D, Giesecke J, Heymann DL, Ihekweazu C, Kobinger G, et al. COVID-19: towards controlling of a pandemic. Lancet Lond Engl. 2020 Mar 28;395(10229):1015–8.

Koo JR, Cook AR, Park M, Sun Y, Sun H, Lim JT, et al. Interventions to mitigate early spread of SARS-CoV-2 in Singapore: a modelling study. Lancet Infect Dis. 2020 Jun 1;20(6):678–88.

Goldie J. AMEE Education Guide no. 29: Evaluating educational programmes. Med Teach. 2006 Jan 1;28(3):210–24.

Frye AW, Hemmer PA. Program evaluation models and related theories: AMEE guide no. 67. Med Teach. 2012;34(5):e288-299.

Tavano PT, Almeida MI. Curso Experimental de Medicina na FMUSP e suas conjecturas de implementação. Khronos. 2017;(4):84–101.

Troncon LE de A, Figueiredo JFC, Rodrigues M de LV, Piccinato CE, Peres LC, Cianflone ARL, et al. Avaliação de uma Reestruturação Curricular na Faculdade de Medicina de Ribeirão Preto: Influência sobre o Desempenho dos Graduandos. Rev Bras Educ Médica. 2004 Aug;28(2):145–55.

Kirkpatrick D. Great Ideas Revisited. Techniques for Evaluating Training Programs. Revisiting Kirkpatrick’s Four-Level Model. Train Dev. 1996;50(1):54–9.

Frechtling JA. Logic Modeling Methods in Program Evaluation. 1st edition. San Francisco: Jossey-Bass; 2007.

Stufflebeam DL, Shinkfield AJ. Evaluation Theory, Models, and Applications. 1st edition. San Francisco: Jossey-Bass; 2007. 768 p.

American Evaluation Association. Guiding Principles for Evaluators [Internet]. [cited 2021 Feb 16]. Available from: https://www.eval.org/About/Guiding-Principles

Hardavella G, Aamli-Gaagnat A, Saad N, Rousalova I, Sreter KB. How to give and receive feedback effectively. Breathe. 2017 Dec;13(4):327–33.

Published

2021-08-20

How to Cite

Reis FJC dos, Navarro AM. Evaluation of Educational Programs in the Context of Remote Education and COVID-19. Medicina (Ribeirão Preto) [Internet]. 2021 Aug. 20 [cited 2024 Jun. 13];54(Supl 1):e-184768. Available from: https://www.periodicos.usp.br/rmrp/article/view/184768