Feb 15, 2023
DOI: 10.1615/IntJInnovOnlineEdu.2023045059

USING DIGCOMPEDU TO CHART CHANGES IN TEACHER ONLINE ASSESSMENT DIGITAL PRACTICES AND COMPETENCIES AFTER THE PANDEMIC PERIOD

Angelica Risquez, 1,* Maura Adshead, 2 Olha Stepanenko, 3 Mary Fitzpatrick, 1 & David Moloney 1


1 Centre for Transformative Learning, University of Limerick, Limerick, Ireland
2 Department of Politics and Public Administration, University of Limerick, Limerick, Ireland
3 University of Limerick, Limerick, Ireland


*Address all correspondence to: Angelica Risquez, Centre for Transformative Learning, University of Limerick, Limerick, Ireland, E-mail: angelica.risquez@ul.ie


The European Framework for the Digital Competence of Educators (DigCompEdu) is a robust, evidence-based framework that can be used both as a policy guide and an implementation aid for regional and national tools and training programs. It provides a general frame of reference to support the development and enhancement of educator-specific digital competencies. This article explores the use of the DigCompEdu framework in Ireland, looking at its deployment as a tool for analysis in one regional Irish university. We explore the use of the DigCompEdu framework as a mechanism to identify changes in pedagogic practice and competencies around online assessment practice as a consequence of teacher experiences of remote teaching, learning, and assessment during the COVID-19 global pandemic. Our survey questionnaire, which was administered to the teaching staff in an Irish university, categorized respondents according to their alignment with the Area 4 Assessment DigCompEdu proficiency statements. Additionally, a series of open-ended questions enabled respondents to give further details on their online experiences. These were interrogated using thematic analysis, which revealed a consensus on the advantages and disadvantages of the move to the online space. However, most significantly, our analysis points not only to shifting digital competencies but also to shifting teacher roles. Remote teaching in the online format required changes to assessment and engendered a shift in the role of the teacher from a more traditional didactic model to one based on moderation and facilitation.

KEY WORDS: online assessment, digital competence, digital competence of educators (DigCompEdu), pandemic


1. INTRODUCTION

1.1 The Sudden Shift toward Online Assessment during the Pandemic Period

Higher education institutions' resistance to evolving their assessment practices to meet an increasingly diverse student population has often been reported; however, the necessary pivot to online assessment due to COVID-19 motivated, most notably, a shift from in-person timed exams to online, open-book, or take-home exams (O'Neill & Padden, 2022). Given the prevalence of summative, end-of-semester examinations as a form of assessment in many higher education contexts, it would be expected that the pandemic period saw the emergence of the online exam format as the automatic translation of that assessment approach into the online environment. Indeed, both students and staff see online exams as a way to save time and add flexibility (Butler-Henderson & Crawford, 2020). However, this solution is not without its challenges. For one, anxiety is likely to be higher for students with less experience taking online exams (Ambikairajah & Tisdell, 2019; Walker & Handley, 2016). In addition, the sudden and remote nature of the pivot to online assessment during the pandemic did not allow students to prepare for computer-based, online exams in any significant way, especially when these exams required dealing with unfamiliar technology under time pressure. Furthermore, much research evidence indicates that students care about academic honesty and fear that it is harder to uphold in online assessments (Sutherland et al., 2018). To this end, the literature review by Butler-Henderson and Crawford (2020) revealed that a majority of students think it is easier to cheat in online exams. Relatedly, Jaap et al. (2021) reported that students prefer an invigilated computer laboratory setting to a remote setting, where identity is not authenticated, as a way to guarantee fairness and transparency in exams. While a wide range of technical solutions to authentication were identified in the Butler-Henderson and Crawford (2020) review, no clear conclusions were reached with respect to the advantages and disadvantages of different approaches or the ethical and practical implications of exam proctoring for privacy and equity. There were also issues of potential technical disruption, barriers to accessing suitable devices, lack of Internet connectivity across rural areas, and the challenges posed by the digital divide in general (Goin Kono & Taylor, 2021).

The shift to alternative assessment approaches, such as open-book assessments, for online exams and assessments has brought opportunities for cheating and contract cheating to light. Open-book assessments can make it easier for students to cheat by collaborating with others or by consulting materials that lecturers have not prescribed or authorized. Contract cheating, where students pay others to complete their assignments or exams, is also a growing concern, since it can be difficult to detect and prevent. In their examination of how science, technology, engineering, and mathematics students used Chegg, a popular file-sharing website, to engage in contract cheating during the COVID-19 pandemic, Lancaster and Cotarlan (2021) reported that the number of student requests for help with exams and assessments posted to the site between April and August 2020 increased by 196% compared with the same period in 2019. Of particular concern is their assertion that students undertaking an exam can post their exam questions live, requesting responses that provide answers within the short duration of time allotted for some exams. This highlights the potential for cheating and contract cheating during online exams, as well as the need for institutions to develop strategies to detect and prevent these activities.

Using web-based proctoring software to monitor the Internet search behavior of students during an online chemistry examination, Schultz et al. (2022) found that a significant number of students searched the Internet during the exam and that the number and duration of searches were significantly related to exam performance. The authors identified specific characteristics of students who were more likely to search the Internet during the exam, such as lower prior grades, lower self-reported confidence in the subject, and higher levels of exam anxiety. Additionally, the rise of artificial intelligence (AI) presents significant challenges for higher education with regard to academic integrity. AI-powered technologies, such as language models and machine learning algorithms, can be used to generate high-quality written content, such as essays and reports, making it easier for students to cheat on some written assignments. Upholding academic integrity in the face of these challenges requires consistent and proactive vigilance on the part of higher education institutions. This will require the development, implementation, and regular review and updating of effective policies and procedures around academic integrity and cheating. Educational initiatives for both staff and students on the use of appropriate technologies; professional development supports and resources on effective assessment design, academic integrity, and the avoidance of unethical behavior; and clearly articulated consequences will all be fundamental.

Given the challenges posed by online exams, many institutions (as was the case in the Irish university where this study took place) adopted assessment practices that moved away from high-stakes, time-bound online exams (Pitt & Quinlan, 2022). Rather, recommendations focused on alternative modes of online assessment that were new to many teachers and therefore often required the development of complex pedagogical and technical knowledge. The study presented here investigates the development of these changes in online assessment and the associated competencies.

1.2 Moving Away from Software Skills: Digital Competence and DigCompEdu

Mishra and Koehler (2006) proposed the now widely accepted technological pedagogical content knowledge (TPCK) framework, more recently known as TPACK (Herring et al., 2016). TPACK is an emergent form of knowledge at the basis of good teaching with technology, which requires a thoughtful interweaving of three key sources of knowledge: pedagogical, subject-specific content, and technological. As Mishra and Koehler (2006, p. 1029) proposed in the original publication, “this knowledge would not typically be held by technologically proficient subject matter experts, or by technologists who know little of the subject or of pedagogy, or by teachers who know little of that subject or about technology.” The logical consequence of TPACK is that there is no single technological solution that applies to every teacher, every educational setting, every cultural context, or every teaching philosophy. The sudden incorporation of technology in response to the emergency situation educators faced during the pandemic forced them all to confront basic educational issues because, as noted by Mishra and Koehler (2006), the technology or medium reconstructs the dynamic equilibrium among all three elements (i.e., pedagogy, content, and technology). For many, the relative newness of the online assessment technical solutions, proposed and supported in each institution as alternatives to traditional assessment formats, forced teachers to develop new digital competencies in order to re-establish that dynamic equilibrium. Therefore, the following question emerged: how will teachers acquire and understand TPACK? For a long time, the standard approach relied heavily on the acquisition of basic software skills. This approach assumes that knowing a technology automatically leads to good teaching with technology, which is inherently problematic for a range of reasons highlighted by Mishra and Koehler (2006), such as the rapid rate of technological change, inappropriate software design, the situated nature of learning, and the lack of emphasis on the practical application of technological knowledge.

The rich, complex, and situated perspective implied in the TPACK model requires the development of very different strategies for developing teachers, oriented toward digital competence. Digital competence can be broadly defined as the confident, critical, and creative use of information and communication technologies to achieve goals related to work, employability, learning, leisure, inclusion, and/or participation in society (Ferrari, 2013). In the European context, the European Framework for the Digital Competence of Educators, referred to as DigCompEdu (Redecker, 2017), which is linked to the previously published European Framework for Digital Competence (DigComp; Ferrari, 2013), was proposed as a response to the growing awareness among many European member states that educators need a set of digital competencies specific to their profession in order to be able to seize the potential of digital technologies for enhancing and innovating education (Redecker, 2017, p. 8). DigCompEdu is a robust, evidence-based framework that can be used both as a policy guide and as an implementation aid for regional and national tools and training programs. It provides a common language and approach that can support the dialogue and exchange of best practices across borders. The framework aims to capture and describe these educator-specific digital competencies by proposing 22 elementary competencies organized in six areas. It also proposes a progression model to help educators reflect on, assess, and develop their digital competence through six stages of digital competence development. This allows educators to identify the specific steps they can take to boost their competence in a staged process. During the first two stages, Newcomer (A1) and Explorer (A2), educators assimilate new information and develop basic digital practices. During the next two stages, Integrator (B1) and Expert (B2), they apply, further expand, and structure their digital practices. At the highest stages, Leader (C1) and Pioneer (C2), they pass on their knowledge, critique existing practice, and develop new practices (Redecker, 2017, p. 9).

1.3 DigCompEdu and the Period of Emergency Remote Teaching during the COVID-19 Pandemic

During the period of emergency remote teaching, learning, and assessment, there was an unprecedented demand for the development of educators' digital competencies, particularly across the higher education sector. Thus, given its influence at a European level (McGarr et al., 2021), DigCompEdu might have been expected to be widely adopted across higher education institutions as an instrument to diagnose digital capacity needs and to provide training and supports accordingly. However, DigCompEdu is surprisingly underrepresented in the academic literature. A search of the Web of Science platform for DigCompEdu and higher education across all fields returned just 26 results, 21 of which corresponded to the pandemic period. Of these, only a few studies go beyond theoretical and statistical explorations of the tool, and only two used DigCompEdu as an instrument to understand a particular population of educators in a particular context with respect to intervention (Cabero-Almenara et al., 2021; Pérez-Calderón et al., 2021). Considering the international breadth of the emergency remote teaching situation, it is clear that DigCompEdu was not used as intended across the higher education sector during the emergency period, when an unparalleled pace of development of technological and pedagogical content knowledge was required (Herring et al., 2016). A limitation of both the generic DigComp framework and the educator-specific DigCompEdu framework is that they do not account for individual differences or external, contextual factors, and there is certainly a need to adapt the competencies to the particular needs of a specific target group (Alarcón et al., 2020; Ferrari, 2013). This, combined with the reactive provision of online teaching and learning during the pandemic period, could explain this underutilization at a time of crisis, and it indicates that the framework's use in national and institutional contexts remains underdeveloped.

1.4 An Application of DigCompEdu in the Irish Context

Irish policies relating to all levels of education are peppered with very positive, enthusiastic views of technology in education, reflecting the broader techno-positive discourses of the past 20 years (McGarr et al., 2021). However, teacher professional development was not emphasized until recently (McGarr et al., 2021, p. 490). McGarr et al. (2021) suggested that early policies tended to focus on developing teachers' skills to use technology but did not necessarily situate these skills in the wider context of digital competence and literacy (McGarr et al., 2021, p. 492). In the Irish higher education sector, the National Forum for the Enhancement of Teaching and Learning in Higher Education (2019) raised the profile of digital competence by including it as a key component of a teacher's wider competence in the National Professional Development Framework for All Staff Who Teach in Higher Education (Ramsey, 2019). In this framework, Domain 5 (i.e., personal and professional digital capacity) explicitly recognizes the following:

[the] importance of personal and professional digital capacity and the application of digital skills and knowledge to professional practice. The domain focuses on the development of personal confidence in digital skills to develop professional competence and the identification of opportunities for technology to support and enhance student learning. (Ramsey, 2019, p. 7)

Further developments are underpinned by the DigCompEdu framework through the Irish Universities Association (IUA) national project, Enhancing Digital Teaching and Learning (EDTL) in Irish Universities. The framework had previously been used locally in the context of accredited programs for academic development, teaching and learning projects, and one-to-one consultations, where it served as a tool for prioritizing future professional development opportunities (Munro, 2020). Through EDTL, seven of Ireland's universities have collaborated to address a common goal of enhancing the digital learning experiences and attributes of Irish university students. A key conduit to achieving this goal was the creation of sectoral and institutional digital capacity–building professional development opportunities for staff members who teach or support learning. Project activities in each member university were initially framed by local strategic visions and contexts, but they were unified across the project through a pedagogy-first philosophy and the general adoption of the DigCompEdu framework as a key reference point to help map and develop educators' digital competencies (Flynn et al., 2021). Through its use as part of the EDTL-developed accredited open course, Getting Started with Personal and Professional Digital Capacity, the DigCompEdu framework and associated check-in self-assessment tool gained more widespread exposure and recognition within the Irish higher and further education sector (Flynn et al., 2022).

This paper explores the application of the DigCompEdu framework in one Irish university, the University of Limerick. Using the competencies progression provided by the DigCompEdu framework, we sought to examine the impact of the pandemic on the development of digital competencies in relation to online assessment approaches. Specifically, we were keen to use the DigCompEdu framework to assess the following:

  • Teachers' current level of engagement with tracking student progress, analyzing data, and providing feedback through electronic means, and how it differs from their pre-pandemic practice;

  • The extent to which the level of digital competence in assessment and related teaching practices changed between the pre- and post-pandemic periods; and

  • The perceived advantages and disadvantages of changes in digital competencies because of remote teaching and learning during the pandemic.

2. METHODOLOGY

A survey comprising both closed and open questions (see Appendix A) was administered to the teaching staff at the University of Limerick during the summer of 2022, once the pandemic emergency had concluded. Using the competency statements from DigCompEdu, teaching staff members were asked to rate their competency levels before and after the experience of remote teaching created by the pandemic. This enabled us to assess the self-identified levels of digital competencies before and after the pandemic. We used further open-ended questions to explore the experiences of the teaching staff with respect to remote teaching and online assessment, noting the ways in which these changed during the pandemic and the perceived advantages and disadvantages of the changed pedagogic methods. Thematic analysis was used to analyze the responses. The survey instrument was submitted to and approved for use by the research ethics committee of the Faculty of Arts, Humanities, and Social Sciences, University of Limerick.

A total of 66 respondents from all areas of the institution completed the survey. Of these, 60 were the lead teachers on their courses, normally responsible for designing and implementing all assessments for their course, with the rest being tutors or guest teachers. At the time of the survey, 41 of these respondents were teaching a total of four or more modules (courses) that semester. The average class size was 69, with a standard deviation of 82.10 (minimum class size = 9; maximum class size = 580).


TABLE 1: Phases of the thematic analysis (source: Braun & Clarke, 2013)

Phase | Thematic Analysis
Familiarizing with data | Transcribing data; reading and re-reading the data; noting initial ideas
Generating initial codes | Coding interesting features of the data systematically across the entire data set; collating data relevant to each code
Searching for themes | Collating codes into potential themes; gathering all data relevant to each potential theme
Reviewing themes | Checking whether the themes work in relation to the coded extracts and the entire data set; generating a thematic map
Defining and naming themes | Ongoing analysis to refine the specifics of each theme and the overall story that the analysis tells; generating clear definitions and names for each theme
Producing the report | The final opportunity for analysis; selecting vivid, compelling extract examples; final analysis of selected extracts; relating the analysis back to the research question and literature; producing a report of the analysis

2.1 Design of the Survey Questionnaire

The DigCompEdu framework has a three-tier structure: areas, competencies, and proficiency levels with associated statements. Each of the six areas in the framework comprises a subset of associated competencies, and each competence in turn comprises its own subset of proficiency levels and related proficiency statements (see https://joint-research-centre.ec.europa.eu/digcompedu/digcompedu-framework/digcompedu-proficiency-levels_en). Using the proficiency statements as a barometer against which educators can map their current practices and experiences, the framework provides a self-assessment mechanism for each competence by proposing a general continuum of upward progression from the lowest proficiency level [Newcomer (A1)], through higher proficiency levels [Explorer (A2), Integrator (B1), Expert (B2), and Leader (C1)], to the highest proficiency level [Pioneer (C2)]. This self-assessment is based on how accurately educators believe their existing practices and experiences are captured by, and most closely align with, the proficiency statements presented to them. In this way, educators can use the proficiency statements to self-assess and identify where they think they sit on the continuum for each competence, and cumulatively for each area in the framework, at a particular point in time.
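To make this structure concrete, the following minimal Python sketch models the hierarchy for Area 4. The area, competence, and level names come from the framework itself; the class name and the abbreviated statement texts are our own illustrative placeholders, not the framework's official data model.

```python
# A minimal sketch of the DigCompEdu hierarchy described above (Python).
# Level codes/names are from the framework; statement texts are placeholders.
from dataclasses import dataclass, field

LEVELS = ["A1", "A2", "B1", "B2", "C1", "C2"]  # ordered, lowest to highest
LEVEL_NAMES = {"A1": "Newcomer", "A2": "Explorer", "B1": "Integrator",
               "B2": "Expert", "C1": "Leader", "C2": "Pioneer"}

@dataclass
class Competence:
    code: str    # e.g., "4.1"
    title: str   # e.g., "Assessment Strategies"
    # proficiency statements keyed by level code
    statements: dict[str, list[str]] = field(default_factory=dict)

area4 = [
    Competence("4.1", "Assessment Strategies", {
        "A1": ["I do not or only very rarely use digital assessment formats."],
        "C2": ["I develop new digital formats for assessment ..."],
    }),
    Competence("4.2", "Analyzing Evidence"),
    Competence("4.3", "Feedback and Planning"),
]
```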

Our survey instrument focused on Area 4: assessment. This framework area prompts educators to consider how digital technologies can enhance their existing assessments, while also inviting consideration of how digital technologies could be employed to create or facilitate new, innovative approaches to assessment. Area 4 comprises three distinct, yet interrelated, competencies:

  • Area 4.1 Assessment Strategies: use of digital technologies for formative and summative assessment in order to enhance diversity and suitability of assessment formats and approaches;

  • Area 4.2 Analyzing Evidence: use of digital technologies to access, analyze, and interpret available data in order to inform learning, teaching, and assessment strategies;

  • Area 4.3 Feedback and Planning: use of digital technologies to provide targeted and timely feedback to learners in order to pivot and adapt teaching strategies if needed as a result of the evidence generated by the data.

Our survey exploited the self-assessment mechanism in order to identify changes in assessment approaches over time. For closed questions, survey respondents were presented with the related proficiency statements for all three competencies within Area 4 (assessment) and asked to self-report on their perceived proficiency before and after their remote emergency teaching experience.

Academic colleagues were reminded that the working tool in DigCompEdu is cumulative; therefore, when self-assessing, they should choose the highest statement if in doubt. We categorized respondents according to their agreement with the DigCompEdu Area 4 assessment proficiency statements. Each respondent was considered on a case-by-case basis in order to classify answers into the proficiency levels, which implied some level of interpretation, since the statements were not strictly incremental. In general terms, we allocated the maximum level of proficiency at which an individual consistently replied yes to all of the items in the level.
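As a concrete illustration of this allocation rule, consider the short Python sketch below. The function name and the example data are hypothetical, not drawn from the study's materials, but the logic mirrors the rule just described: the highest level at which a respondent answered yes to every item is allocated.

```python
# A minimal sketch (Python) of the allocation rule described above: assign the
# highest proficiency level at which a respondent answered yes to every item.
LEVELS = ["A1", "A2", "B1", "B2", "C1", "C2"]  # lowest to highest

def allocate_level(responses: dict[str, list[bool]]) -> str:
    """Map each level's yes/no answers to the allocated proficiency level."""
    allocated = "A1"  # default to the lowest level
    for level in LEVELS:
        answers = responses.get(level, [])
        if answers and all(answers):
            # Respondent consistently replied yes at this level; keep climbing
            # even if a lower level had a "no" (statements are not strictly
            # incremental, as noted above, so this requires interpretation).
            allocated = level
    return allocated

# Example: all yes up to B1, mixed answers at B2 -> allocated level is "B1"
example = {
    "A1": [True], "A2": [True, True], "B1": [True, True],
    "B2": [True, False], "C1": [False], "C2": [False],
}
assert allocate_level(example) == "B1"
```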

All answers were then subjected to thematic analysis in order to examine the impact of the COVID-19 pandemic on the acquisition of digital competencies and subsequent changes in teaching and assessment techniques. Thematic analysis refers to the search for recurrent themes that emerge as important to the description of the phenomenon under study. Since thematic analysis was first recognized as an approach in the 1970s (Merton, 1975), a burgeoning literature has emerged on its uses in qualitative research (Braun & Clarke, 2013; Guest et al., 2012), much of which has focused on its use in psychology and the health sciences (Joffe, 2011; Tuckett, 2005). The process involves the identification of themes through careful reading and re-reading of qualitative data. It is a form of pattern recognition within the data, where the emerging themes become the categories for analysis (Boyatzis, 1998). This approach was systematized into a six-stage process by Braun and Clarke (2013), as detailed in Table 1.

It is, of course, worth noting that themes do not simply emerge: they are identified by those who search the data, giving rise to all kinds of issues concerning the biases of those who identify the themes and what they select, edit, and choose to deploy in their final arguments. In order to counter this weakness, the data were individually read and transcribed by three researchers, who each carried out Steps 1, 2, and 3 independently before jointly developing a thematic map and reaching inter-coder agreement when reviewing the themes (Step 4). The data were subsequently re-read and reviewed during several group meetings to define and name the agreed themes (Step 5). A hybrid approach to qualitative methods was used to develop the final report (Step 6), combining a deductive a priori template incorporating the definitions provided by the DigCompEdu framework with the data-driven inductive approach of Boyatzis (1998) to explore the themes emerging from the data.

3. RESULTS AND DISCUSSION: PANDEMIC PROGRESSION ACROSS THE DIGITAL COMPETENCY LEVELS

In addition to the survey questionnaire, 41 respondents answered the open-ended questions, which invited further comments on their experiences of using digital assessment in the pre- and post-pandemic periods. The analysis of the data showed a progression from lower to higher levels of digital competence in relation to online assessment practices. In the pre-pandemic period, 32% of the respondents had self-evaluated as Pioneers; this percentage increased to 56% in the post-pandemic period, and the proportion of those who fell into the Leader category also increased, from 10% to 17%. At the bottom end of the spectrum, the proportion of those who self-evaluated as Newcomers or Explorers fell from 39% to 17%. The following subsections explore these results in a more nuanced way.
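For illustration, distributions of this kind can be computed from the allocated levels with a few lines of Python. The sample data below are invented for illustration only; the study's actual respondent-level data are not reproduced here.

```python
# A hedged sketch (Python) of the aggregation behind the percentages above.
from collections import Counter

LEVELS = ["A1", "A2", "B1", "B2", "C1", "C2"]

def level_distribution(levels: list[str]) -> dict[str, float]:
    """Percentage of respondents allocated to each proficiency level."""
    counts = Counter(levels)
    return {lv: 100 * counts[lv] / len(levels) for lv in LEVELS}

pre_levels = ["A1", "A2", "B1", "C2"]   # one invented entry per respondent
post_levels = ["B1", "C1", "C2", "C2"]
print(level_distribution(pre_levels))   # e.g., {"A1": 25.0, ...}
print(level_distribution(post_levels))
```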

3.1 Shift of Assessment Approaches

For most respondents, the main assessment focus during the pre-COVID period was on written examinations in the form of mid-term and final exams (well over 50% of respondents), and to a lesser extent on assessments based on laboratory reports, essays, and case studies. However, during the period of remote teaching and learning, assessment techniques shifted suddenly to the online environment and became more diverse, including online tests and quizzes, online open-book exams, weekly blogs, online presentations, discussion forums, video assignments, e-portfolio assignments, and online debates (see the responses to open-ended questions in Appendix B).

Beyond the shift to the online environment and the diversification of assessment methods, the thematic analysis of the open-ended responses also showed a shift toward a greater proportion of continuous and formative assessment during the semester, compared to the pre-pandemic status quo, where the emphasis was instead placed on end-of-semester examinations, as demonstrated by the following respondent quotes:

  • Respondent 19: “Inclusion of more continuous assessment through reading groups, class discussions.”

  • Respondent 31: “Got rid of the exam, brought in more interim assessments and discussions.”

  • Respondent 43: “Midterm and final exam go online or replaced by mini quizzes/tests.”

  • Respondent 8: “I dropped the project work and added weekly blogs to the overall assessment.”

  • Respondent 10: “Adapted assessment type to online, open book exam, public events organised by students went online etc.”

The motivations for moving away from summative assessment were often not stated; they could have been related to the difficulties involved in delivering online exams or to fears of academic integrity breaches. In several cases, the emergency online teaching situation motivated teachers to remove elements of assessment altogether (final exam, oral presentation, written reports, laboratory reports, etc.) or forced them to extend the deadlines for preparation and submission, contributing again to the shift to a more continuous and formative type of assessment, as stated by Respondent 4, “Allowed for longer times and transferred oral presentation to recordings.” The experience of these adjustments encouraged teachers to reduce the number and complexity of tasks required of students, which could signal a shift toward an approach more focused on assessment as/for learning, as opposed to the more traditional assessment of learning (Earl & Katz, 2006), as stated by Respondent 27, “Less assessments, easier assessment, less strict on deadlines,” and Respondent 17, “Reduced number of assessments and introduced online quizzes.”

Finally, it was interesting to observe a shift not only in the type of assessment and its timing, but also in the role of the teacher as facilitator or moderator of the assessment process. The data indicated that teachers were devoting more time to facilitation and moderation than was previously the case, leaving open the possibility of a shift in role as they moved toward increased use of formative assessment methods:

Much more coordinated and collaborative approach to moderation—e.g., during online assessments, colleague acted as moderator and we worked synchronously on a shared doc, managed to mark and write up most of the feedback comments jointly rather than having to return to recordings (as I did previously) and write up feedback post event. (Respondent 7)

The implications of this enhanced focus on moderation of the assessment process are discussed in more detail subsequently.

3.2 Advantages of Using New Assessment Approaches

Several respondents noted advantages, for both students and teachers, of the changes in assessment engendered by remote online teaching and learning; these advantages appear to be mutually supportive.

Respondents felt that, for students, the new assessment methods were likely to promote not only the acquisition of new digital skills but also deeper learning, since students were able to digest curriculum content at their own pace, returning to difficult concepts as needed. There was a perception that this helped to reduce students' anxiety and promoted self-monitoring and self-control of progress. Certainly, the increased time available for students' assessments (compared to timed, in-person examinations) gave teachers the scope to modify assessment approaches in response to identified study challenges. Over time, the number of assessments tended to decrease as teachers calibrated, in relation to one another, the feasible number of remote assessments that students were able to complete across a range of subjects. Overall, it was suggested that during the remote learning period, students had developed higher-order skills and independent learning strengths.

From a teacher perspective, many respondents noted that grading became more routinized and efficient, since work submitted online was easier to read, making grading quicker, as evidenced by the following quotes:

  • Respondent 22: “I had less grading.”

  • Respondent 20: “Increased efficiency—faster marking—much easier, faster marking typed scripts vs handwritten, no sorting of scripts, quick return of marks and results all entered in online system for ease of review and post-hoc analysis.”

  • Respondent 24: “The exams were easier to read as they were typed.”

  • Respondent 27: “Legible answers, better quality, 99% of the class submitted.”

Where teachers noted more efficiency in time spent grading, they saw a benefit in the time this allowed for more formative assessment and different forms of student engagement with curriculum content, as noted by Respondent 14, “Students were required to engage with the material more quickly but were eased into it.”

Analysis of the respondents' answers showed that well-chosen assessment methods and approaches led to better attainment of learning outcomes, owing to students' deeper engagement with the material and a degree of metacognition and independent learning, as explained by Respondent 11, “Weekly mini-quizzes meant students could check their progress—e.g., understanding of lecture material/readings. Focused essay helped build confidence to analyse/criticise. Peer-feedback helped them to better understand what makes for a ‘good’ piece of work.” However, the extent of these changes varied between disciplines, which differ in their scope for variation and accommodation in assessment and assignments.

3.3 Disadvantages of Using New Assessment Approaches

Notwithstanding the advantages, teachers noted that the downsides of online assessments were the increased risks of cheating and plagiarism at different stages of assessment. Moreover, just as some students benefitted from alternative assessment approaches and formats, others were less keen, resulting in uneven participation and engagement in the studying and assessment processes across student cohorts. For some students, the change in assessment approaches and the more routinized tests meant that their attention was focused more on assignment completion than on engagement with the curriculum and deeper learning.

Teachers also noted the time necessary to set up and monitor new assessment approaches; over the course of remote teaching and learning, the number of assessments typically tended to decrease:

  • Respondent 11: “The lack of proctoring resulted in significant cheating throughout in particular in the online Final Exam.”

  • Respondent 24: “More focus on the assessment towards the end of the module than on the curriculum.”

  • Respondent 23: “In class work is more engaging, presents more effective clinical reasoning and enhances learning among students…”

3.4 New Skills and Competences in Relation to Assessment

The shift to online learning was, of course, accompanied by increased use of the virtual learning environment (VLE) and online platforms for assessment, and by the development of the digital skills associated with creating online assessments (e.g., production of online question pools, use of interactive software, etc.). In addition, teachers had to develop other digital soft skills related to managing feedback via online platforms and moderating online discussion forums, as evidenced by the following respondent quotes:

  • Respondent 14: “Became more efficient at designing/grading ‘mass’ online assessments; better at managing collective/individual feedback.”

  • Respondent 34: “I learned how to maximise the use of a VLE.”

  • Respondent 5: “Setting up online assessments using [the institutional VLE].”

The shift to online learning also required teachers to upskill more generally in other aspects of blended learning, such as the flipped classroom model, audio lectures, podcasting, video conferencing and recording, design of online presentations, virtual presentations, use of interactive tools in synchronous teaching, etc., as observed by some respondents:

  • Respondent 29: “Learned how to use Panopto and create captions, use of e-tivities, and use of template for [the institutional VLE].”

  • Respondent 41: “I got better at recording lectures and virtual presentations.”

  • Respondent 17: “Online teaching and learning, delivering blended learning.”

In summary, our results showed that the sudden and forced shift to online assessment, provoked by the pandemic situation, brought a diversification of assessment methods that placed greater emphasis on the formative elements of assessment for/as learning (Earl & Katz, 2006), combined with teachers' increased digital competence in all aspects of online and multimedia content generation. In turn, this helped meet the principles of multiple means of engagement and multiple means of representation advocated by the universal design for learning (UDL) approach (Novak & Couros, 2022).

4. CONCLUSIONS

This study has rendered some interesting insights into shifting competencies and roles as a result of the sudden pivot to online teaching, learning, and assessment, mapped against the assessment area of the DigCompEdu framework. Of special relevance to our findings are the competencies defined as assessment strategies (Area 4.1) and feedback and planning (Area 4.3), while little evidence emerged to support a significant shift in relation to analyzing evidence (Area 4.2). Therefore, it seems that much scope remains to explore and incentivize the use of digital technologies to access, analyze, and interpret available data to inform learning, teaching, and assessment strategies.

However, it is important to note that our findings are based on teachers' individual perceptions and that the survey asked about pre-pandemic competencies after the pandemic, when it is quite possible that memory and perception had been altered. Nevertheless, it is important to reflect on what has occurred since early 2020 and to bring that learning into the future. Ní Uigín and Cofaigh (2021) recommended that we should not disregard what we have learned over the pandemic period; rather, we should use DigCompEdu as a useful framework for educators, in the Irish context, to assess their own ability to further enhance their pedagogy using digital tools. The widespread adoption of DigCompEdu as a common framework for the digital capacity of educators at the European level can help to ensure a level of uniformity within the education community, thus facilitating the sharing of research, practice, educational resources, professional development materials, etc. (McGarr et al., 2021). We hope that our study contributes to this work by demonstrating how the DigCompEdu framework can be used to explore changes in pedagogic practice, as occurred during the period of emergency remote teaching, learning, and assessment required by the pandemic in a specific national context. Further studies of this nature would enable us to gain a better sense of these changes in a wider European context.

Still, concerns have been expressed about the potential of teacher competence frameworks to reduce teacher autonomy and stifle innovation (Adoniou & Gallagher, 2017). McGarr et al. (2021) claimed that digital competence frameworks may result in a narrowing of possible digital practices, such that teachers comply with the expected practices as laid out in the frameworks, potentially reducing digital competence to a linear and deterministic set of competencies used primarily for teacher accountability measures. These authors also pointed to the danger that the hierarchical categorization of teachers' technology use from novice to expert suggests a desired set of practices that places greater emphasis on some skills over others:

These categorisations and rankings, and the ideologies that underpin them, often go unchallenged when such frameworks are adopted. Mindful of these potential pitfalls, while such frameworks can help map the complex terrain of teachers' digital competence, their adoption should perhaps act as a guide as opposed to a blueprint. (McGarr et al., 2021, p. 493)

It must also be acknowledged that applying the framework directly, without changes, is not always straightforward. For example, classifying respondents according to their responses to the statements in Area 4 was sometimes problematic, since there were cases where respondents identified with statements at higher proficiency levels while having responded no at more basic levels. This is consistent with Munro's (2020) insight that the proficiency statements of DigCompEdu can be difficult to interpret, since there is not always a clear progression between levels. Redecker (2017) also warned that the DigCompEdu framework should not be seen as a normative framework or a tool for performance appraisal. As recommended by Munro (2020), it is important to take a developmental perspective in the use of DigCompEdu, where the focus is not on the levels but rather on improvement against individual objectives.

Mindful of these concerns, we note that in our study the application of DigCompEdu raised issues we had not previously considered. While we began by investigating shifting digital competencies, the results of our analysis indicate that shifting roles were probably equally significant. This shift was caused by the range of changes to learning and teaching practices engendered by the move to remote online teaching and assessment. Applying the DigCompEdu framework enabled us to identify a shift in teacher roles from leading assessment to facilitating learning. The online format and the changes to assessment required by remote teaching, mapped against the assessment strategies (Area 4.1) competence, required a shift in role, with the lecturer moving from delivering information to creating more engaging forms of learner interaction. Our results have also demonstrated the intimate relationship between these adjustments in pedagogical and assessment design and the premises of multiple means of engagement and representation posed by the UDL approach (Novak & Couros, 2022). This is also linked to Area 5, empowering learners, in the DigCompEdu framework: Area 5.1, accessibility and inclusion; Area 5.2, differentiation and personalization; and Area 5.3, actively engaging learners.

Ultimately, our survey identified that the primary pedagogic changes were not in relation to curriculum content but rather to online engagement, a finding that was equally true for students and teachers. These changes led to a rationalization of the amount of work that teachers were asking students to do. Teachers' quick realization that they should not over-assess online created a need to find alternative strategies to keep students engaged. This provided the opportunity for more meaningful engagement of students, rather than just keeping them busy, and highlighted the importance of more careful alignment between learning outcomes and assessment strategy. These findings strongly resonate with the conclusions of the extensive literature review by Pitt and Quinlan (2022), in which the use of technology for assessment and feedback was methodically examined according to the extent to which it can be used to challenge students, sustain student effort, promote meaningful interaction, facilitate working with diverse others, provide feedback, increase application to the real world, support reflection on and integration of learning, and develop evaluative judgment. Our results confirm these authors' conclusion that the use of e-portfolios, simulations, blogs, peer review tools, automated assessments, or gamified quizzes, to name a few, really makes a difference when trying to meet the learning principles in their checklist. Crucially, their use must be aligned with intended learning outcomes, and the costs and benefits, including staff time, must be weighed.

REFERENCES

Adoniou, M., & Gallagher, M. (2017). Professional standards for teachers—what are they good for? Oxford Review of Education, 43(1), 109–126. https://doi.org/10.1080/03054985.2016.1243522

Alarcón, R., del Pilar Jiménez, E., & de Vicente-Yagüe, M. I. (2020). Development and validation of the DIGIGLO, a tool for assessing the digital competence of educators. British Journal of Educational Technology, 51(6), 2407–2421. https://doi.org/10.1111/bjet.12919

Ambikairajah, A., & Tisdell, C. C. (2019). E-examinations and the student experience regarding appropriateness of assessment and course quality in science and medical science. Journal of Educational Technology Systems, 47(4), 460–478. https://doi.org/10.1177/0047239518822016

Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Sage Publications, Inc.

Braun, V., & Clarke, V. (2013). Successful qualitative research: A practical guide for beginners. Sage.

Butler-Henderson, K., & Crawford, J. (2020). A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers & Education, 159, 104024. https://doi.org/10.1016/j.compedu.2020.104024

Cabero-Almenara, J., Barroso-Osuna, J., Gutiérrez-Castillo, J.-J., & Palacios-Rodríguez, A. (2021). The teaching digital competence of health sciences teachers. A study at Andalusian universities (Spain). International Journal of Environmental Research and Public Health, 18(5), 1–13. https://doi.org/10.3390/ijerph18052552

Earl, L. M., & Katz, S. (2006). Rethinking classroom assessment with purpose in mind: Assessment for, as and of learning. Winnipeg: Manitoba Education, Citizenship and Youth. Retrieved October 15, 2022, from https://www.edu.gov.mb.ca/k12/assess/wncp/full_doc.pdf

Ferrari, A. (2013). DIGCOMP: A framework for developing and understanding digital competence in Europe. European Commission, Joint Research Centre, Institute for Prospective Technological Studies.

Flynn, S., Lowney, R., Molloy, K., Munro, M., & Stone, S. (2022). Getting started with personal & professional digital capacity: An open course for educators in Irish higher education [Conference presentation]. Irish Learning Technology Association (ILTA) EdTech Winter Conference. https://youtu.be/V0oDHv7UPqE

Flynn, S., Munro, M., Byrne, J., Hamill, D., Molloy, K., Moloney, D., O'Callaghan, C., O'Connor, M., O'Reilly, M., Scrochi, C., & Stone, S. (2021). Digital learning and teaching post COVID-19: Learning from the enhancing digital teaching and learning (EDTL) approach. In M. Keane, C. McAvinia, & Í. O'Sullivan (Eds.), Emerging issues IV: Changing times, changing context (pp. 92–111). Educational Developers in Ireland Network (EDIN).

Goin Kono, K., & Taylor, S. (2021). Using an ethos of care to bridge the digital divide: Exploring faculty narratives during a global pandemic. Online Learning, 25(1), 151–165. http://dx.doi.org/10.24059/olj.v25i1.2484

Guest, G., MacQueen, K. M., & Namey, E. E. (2012). Applied thematic analysis. Sage.

Herring, M., Koehler, M., & Mishra, P. (2016). Handbook of technological pedagogical content knowledge (TPACK) for educators (2nd ed.). Routledge.

Jaap, A., Dewar, A., Duncan, C., Fairhurst, K., Hope, D., & Kluth, D. (2021). Effect of remote online exam delivery on student experience and performance in applied knowledge tests. BMC Medical Education, 21(1), 86. https://doi.org/10.1186/s12909-021-02521-1

Joffe, H. (2011). Thematic analysis. In D. Harper & A. R. Thompson (Eds.), Qualitative methods in mental health and psychotherapy: A guide for students and practitioners (pp. 209–224). Wiley.

Lancaster, T., & Cotarlan, C. (2021). Contract cheating by STEM students through a file sharing website: A COVID-19 pandemic perspective. International Journal for Educational Integrity, 17, 3. https://doi.org/10.1007/s40979-021-00070-0

McGarr, O., Mifsud, L., & Colomer Rubio, J. C. (2021). Digital competence in teacher education: Comparing national policies in Norway, Ireland and Spain. Learning, Media and Technology, 46(4), 483–497. https://doi.org/10.1080/17439884.2021.1913182

Merton, R. K. (1975). Thematic analysis in science: Notes on Holton's concept. Science, 188(4186), 335–338. https://doi.org/10.1126/science.188.4186.335

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x

Munro, M. (2020). Embedding DigCompEdu in professional development. IUADigEd Webinar. Retrieved January 17, 2023, from https://edtl.blog/webinar-series/embedding-digcompedu-in-professional-development/

National Forum for the Enhancement of Teaching and Learning in Higher Education. (2019). National Professional Development Framework: Domains. Retrieved January 17, 2023, from https://www.teachingandlearning.ie/resource/national-professional-development-framework-domains/

Ní Uigín, D., & Cofaigh, É. Ó. (2021). Blending learning—from niche to norm. Irish Educational Studies, 40(2), 227–233. https://doi.org/10.1080/03323315.2021.1933566

Novak, K., & Couros, G. (2022). UDL Now!: A teacher's guide to applying universal design for learning (3rd ed.). CAST, Inc.

O'Neill, G., & Padden, L. (2022). Diversifying assessment methods: Barriers, benefits and enablers. Innovations in Education and Teaching International, 59(4), 398–409. https://doi.org/10.1080/14703297.2021.1880462

Pérez-Calderón, E., Prieto-Ballester, J.-M., & Miguel-Barrado, V. (2021). Analysis of digital competence for Spanish teachers at pre-university educational key stages during COVID-19. International Journal of Environmental Research and Public Health, 18(15), 8093. https://doi.org/10.3390/ijerph18158093

Pitt, E., & Quinlan, M. (2022). Impacts of higher education assessment and feedback policy and practice on students: A review of the literature 2016–2021. Centre for the Study of Higher Education, University of Kent. Retrieved December 15, 2022, from https://www.advance-he.ac.uk/knowledge-hub/impacts-higher-education-assessment-and-feedback-policy-and-practice-students-review

Ramsey, L. (2019). National professional development framework: Domains. National Forum for the Enhancement of Teaching and Learning in Higher Education.

Redecker, C. (2017). European framework for the digital competence of educators: DigCompEdu. Retrieved December 15, 2022, from https://joint-research-centre.ec.europa.eu/digcompedu_en

Schultz, M., Lim, K. F., Goh, Y. K., & Callahan, D. L. (2022). OK Google: What's the answer? Characteristics of students who searched the Internet during an online chemistry examination. Assessment & Evaluation in Higher Education, 47(8), 1458–1474. https://doi.org/10.1080/02602938.2022.2048356

Sutherland, D., Warwick, P., Anderson, J., & Learmonth, M. (2018). How do quality of teaching, assessment, and feedback drive undergraduate course satisfaction in U.K. business schools? A comparative analysis with nonbusiness school courses using the U.K. national student survey. Journal of Management Education, 42(5), 618–649. https://doi.org/10.1177/1052562918787849

Tuckett, A. G. (2005). Applying thematic analysis theory to practice: A researcher's experience. Contemporary Nurse, 19(1-2), 75–87. https://doi.org/10.5172/conu.19.1-2.75

Walker, R., & Handley, Z. (2016). Designing for learner engagement with computer-based testing. Research in Learning Technology, 24. https://doi.org/10.3402/rlt.v24.30083


APPENDIX A: SURVEY QUESTIONS

Appendix A1. Introduction

The purpose of the project was to co-create, with our Erasmus+ partners, teaching and learning resources that would enhance the digital pedagogical competencies of teaching staff and promote high-quality and inclusive digital education, with special attention to online assessment practice. In order to do this, we aimed to assess the evenness (or otherwise) of the changes in practice that occurred during the pandemic. The survey included a range of basic demographic questions regarding teaching background and discipline, some open-ended questions to explore subjective and personal views around teaching and assessment practice, and validated items measuring assessment-related digital competency from Area 4 (Assessment) of the DigCompEdu framework (https://digital-competence.eu/digcompedu/). In doing so, we aimed to explore the following research questions:

  • How did teaching staff members adjust their assessment approach during the period of emergency remote teaching?

  • What is their current (post-pandemic) assessment approach?

  • What is teachers' current level of engagement with tracking of student progress, analyzing data, and providing feedback through electronic means, and how is it different from their pre-pandemic practice?

Appendix A2. Questions

The following questions and formats were employed in the survey:

  • Faculty (multiple choice);

  • Department (multiple choice);

  • Role (module leader/tutor/guest teacher);

  • Number of modules you are teaching (numeric answer);

  • Average class size (numeric answer);

  • Did you adjust your assessment approach during the period of emergency remote teaching due to COVID during 2020 and 2021 (yes/no); if yes, what did you change (open ended);

  • Did you find any advantages to your new assessment approaches (yes/no); if yes, can you please explain (open ended);

  • Did you find any disadvantages to your new assessment approaches (yes/no); if yes, can you please explain (open ended);

  • Did you develop any new skills and competences in relation to assessment (yes/no); can you please explain (open ended);

  • Did you develop any new teaching and learning skills and competences more generally (yes/no); can you please explain (open ended);

  • Are you reverting to your pre-pandemic assessment approach (yes/no); can you please explain (open ended).

The participants were asked to respond to questions in relation to three areas (i.e., assessment strategies, evidence analysis, and feedback and planning) involving their assessment practice before and after the pandemic situation (see Tables A1–A3).


TABLE A1: Assessment strategies

Proficiency Statement | Used Pre-Pandemic | Used Currently
I do not or only very rarely use digital assessment formats. | Yes/no | Yes/no
I use digital technologies to create assessment tasks, which are then administered in paper format. | Yes/no | Yes/no
I plan for students' use of digital technologies in assessment tasks, e.g., in support of assignments. | Yes/no | Yes/no
I use some existing digital technologies for formative or summative assessment, e.g., digital quizzes, e-portfolios, and games. | Yes/no | Yes/no
I adapt digital assessment tools to support my specific assessment goal, e.g., creating tests using a digital test system. | Yes/no | Yes/no
I use a range of e-assessment software programs, tools, and approaches for formative assessment, both in the classroom and for learners to use after school. | Yes/no | Yes/no
I select different assessment formats based on the one that most adequately captures the nature of the learning outcome to be assessed. | Yes/no | Yes/no
I design digital assessments that are valid and reliable. | Yes/no | Yes/no
I use a variety of digital and non-digital assessment formats, aligned with content and technology standards, and am aware of their benefits and drawbacks. | Yes/no | Yes/no
I critically reflect on my use of digital technologies for assessment and adapt my strategies accordingly. | Yes/no | Yes/no
I develop new digital formats for assessment, which reflect innovative pedagogic approaches and allow for the assessment of transversal skills. | Yes/no | Yes/no

TABLE A2: Analyzing the evidence on the assessment strategies

Proficiency Statement | Used Pre-Pandemic | Used Currently
I do not or only very rarely refer to digitally recorded data to understand where my students stand. | Yes/no | Yes/no
I evaluate administrative data (e.g., attendance) and data on student performance (e.g., grades) for individual feedback and targeted interventions. | Yes/no | Yes/no
I am aware that digital assessment tools (e.g., quizzes and voting systems) can be used within the teaching process to provide me with timely feedback on learners' progress. | Yes/no | Yes/no
I evaluate the data resulting from digital assessments to inform learning and teaching. | Yes/no | Yes/no
I am aware that the data on my learners' activity, as it is recorded in the digital environments that I use with them, can help me monitor their progress and provide them with timely feedback and assistance. | Yes/no | Yes/no
I use digital technologies (e.g., quizzes, voting systems, and games) within the teaching process to provide me with timely feedback on learners' progress. | Yes/no | Yes/no
I use the data analysis tools provided by the digital environments I use to monitor and visualize activity. | Yes/no | Yes/no
I interpret the data and evidence available in order to better understand individual learners' needs for support. | Yes/no | Yes/no
I continuously monitor digital activity and regularly reflect on digitally recorded learner data to timely identify and react upon critical behavior and individual problems. | Yes/no | Yes/no
I evaluate and synthesize the data generated by the various digital technologies I use to reflect on the effectiveness and suitability of different teaching strategies and learning activities (in general and for certain learner groups). | Yes/no | Yes/no
I implement advanced data generation and visualization methods into the digital activities I employ, e.g., based on learning analytics. | Yes/no | Yes/no
I critically assess and discuss the value and validity of different data sources as well as the appropriateness of established methods for data analysis. | Yes/no | Yes/no

TABLE A3: Feedback and planning with respect to the assessment strategies

Statement | Used Pre-Pandemic | Used Currently
I am not aware how digital technologies can help me in providing feedback to learners or adapting my teaching strategies. | Yes/no | Yes/no
I use digital technologies to compile an overview on learners' progress, which I use as a basis for offering feedback and advice. | Yes/no | Yes/no
I use digital technology to grade and give feedback on electronically submitted assignments. | Yes/no | Yes/no
I help students and/or parents to access information on learners' performance using digital technologies. | Yes/no | Yes/no
I adapt my teaching and assessment practices based on the data generated by the digital technologies I use. | Yes/no | Yes/no
I provide personal feedback and offer differentiated support to learners based on the data generated by the digital technologies used. | Yes/no | Yes/no
I use digital technologies to enable learners and parents to remain updated on progress and make informed choices on future learning priorities, optional subjects, or future studies. | Yes/no | Yes/no
I assist learners in identifying areas for improvement and jointly develop learning plans to address these areas based on the evidence available. | Yes/no | Yes/no
I use the data generated by digital technologies to reflect on which teaching strategies work well for which kind of learners and adapt my teaching strategies accordingly. | Yes/no | Yes/no
I reflect on, discuss, re-design, and innovate teaching strategies in response to the digital evidence I find as it concerns learners' preferences and needs as well as the effectiveness of different teaching interventions and learning formats. | Yes/no | Yes/no

APPENDIX B. POST-COVID ASSESSMENT METHODS (RESPONSES)

Table B1 presents the participants' responses on their post-COVID assessment methods, reproduced verbatim; an illustrative keyword tally follows the table.


TABLE B1: Post-COVID assessment methods responses

What Did You Change?
More alternative assessments, e.g., videos
More options of representation and engagement in line with principles of the UDL approach
Introduced alternative to present final assignment and an online blog post instead of the traditional written report; also enhanced evaluation rubrics
Allowed for longer times and transferred oral presentation to recordings
Full change to online assessment
The final exam went online or was replaced by a number of online tests
Removed end of term exam, workshop-based assessment, and laboratory reports and replaced them with online quizzes
I dropped the project work and added weekly blogs and more essays to the overall assessment
I adapted assessment type to online and open-book exam; public events organized by students went online
Continuous assessment, four online quizzes following curriculum, group project, final exam (worth less), and final essay as part of the final based on a case study that we looked at throughout the semester
Open-book exams over 24 hours or longer; online forums
Online test/submitted videos/more journals
Created online tests, quizzes, and final exam
No significant changes for advanced students: for first year students I switched to weekly formative quizzes, more substantial online exams, and a more focused essay (shorter with specific instructions); for second year students I used peer feedback on drafts and gave more targeted writing support
Recorded performances, video submissions, and lesson plans; assignments were better aligned to learning outcomes
Reduced number of assessments and introduced online quizzes
The number and weighting of assessments
Inclusion of more continuous assessment through reading groups and class discussions
The class assessment stayed the same: the exam was in the same format but it was submitted online with much more time to work on it, and of course the opportunity to collaborate informally
Replaced final exam with online tests
Some assignments moved online (e.g., presentations).
Increased class assessment
Went back to individual work and essay writing
Written exams moved to online format
Fewer assessments, easier assessments, and more lenient deadlines
Final exam moved online
The end of semester exams moved online
Online group presentations and portfolios
Got rid of the exam, brought in more interim assessments and discussions
Final timed assessment on Sulis in lieu of the traditional exam
Move to 100% coursework on some modules with Sulis quizzes as appropriate to pedagogy
Adapted the final exam and mid-term to an online version
Used more online assessments, including as exam replacement
Discontinued traditional exam
Most/all of the above elements moved to online mode
During COVID all of the teaching was done online
I reduced the number of assessments
Switched to all in-term essays
Added weekly quizzes as well to encourage/monitor engagement
For the module with 130 students, in which they originally had a paired essay and a final exam, I changed it to a longer individual essay, in which they had to demonstrate their understanding of three topics from the module in their essay (I dropped the exam and there were no changes for the other module)
Midterm and final exams went online or were replaced by mini-quizzes/tests
More variation and choice; more creative
Changed the focus of the need to link in or work with SMES and also changed the percentages allocated to team work
Changed assessment percentages to 40% for the final exam online, 30% for project work, and 30% for tests and quizzes
Project tasks had to be re-designed to work online
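
Purely as an illustration of a first computational pass over free-text responses such as those in Table B1, the sketch below tallies a few assumed keywords across responses. This is not the qualitative thematic analysis reported in the study; the keyword list and function names are our assumptions.

```python
# Purely illustrative sketch: a crude keyword tally over free-text
# responses like those in Table B1. This is NOT the study's thematic
# analysis (which was qualitative); it only shows how recurring terms
# might be counted as a first pass. The keyword list is assumed.

KEYWORDS = ["quiz", "open-book", "online", "exam", "essay"]

def keyword_tally(responses: list[str]) -> dict[str, int]:
    """Count how many responses mention each keyword (case-insensitive)."""
    return {kw: sum(kw in r.lower() for r in responses) for kw in KEYWORDS}

sample = ["Replaced final exam with online tests",
          "Open-book exams over 24 hours or longer; online forums"]
print(keyword_tally(sample))
# {'quiz': 0, 'open-book': 1, 'online': 2, 'exam': 2, 'essay': 0}
```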



† Casting a wider net, a search including multiple databases (e.g., Scopus, Social Science Premium Collection, and IngentaConnect Journals, among others) rendered a still minimal total of 147 results.
