FACULTY DEVELOPMENT FOR ONLINE LEARNING USING A COGNITIVE APPRENTICESHIP MODEL
Apr 02 2018 Authors: Andrew Wiss, Julie A. DeLoia, Laurie Posey, Noemi Waight, Leonard Friedman
DOI: 10.1615/IntJInnovOnlineEdu.2018025705

Andrew C. Wiss

The George Washington University
Address all correspondence to: Andrew C. Wiss, Milken Institute School of Public Health, The George Washington University, 950 New Hampshire Ave. Suite 500 Washington, DC 20037;
awiss@gwu.edu


Julie A. DeLoia

Jefferson College of Health Sciences


Laurie Posey

The George Washington University

Noemi Waight

State University of New York at Buffalo


Leonard Friedman

The George Washington University


Abstract

This study examined the experiences and perceptions of a group of sixteen faculty members who participated in a comprehensive faculty development process for online teaching and learning, developed using a cognitive apprenticeship theoretical framework and implemented at the outset of a new online graduate certificate program. The study documented faculty membersʼ self-reported levels of professional development; changes in instructional approaches; observations of student learning and competency attainment; and impressions of the overall translation of teaching expertise from the classroom to the online format. Faculty data regarding their experiences in the program were captured using a 26-question web-based instrument. Data were analyzed using a combination of descriptive statistics and a two-phase qualitative process involving cluster and thematic analysis.

The results of this study revealed that the majority of these faculty members credit program infrastructure and their ongoing consulting relationship with the programʼs instructional designers as having substantial, positive, and ongoing effects on their online teaching practices. Additionally, this study found that embedding experienced instructional design support at all levels of this academic programʼs operations led to the creation of a number of pedagogical and programmatic policies, services, and informal supports that helped to clarify faculty roles, allowing faculty to focus on student learning and their own ongoing development of online teaching expertise. The findings from this study are discussed in the context of a cognitive apprenticeship framework and have implications for faculty development efforts to support online teaching and learning, especially in the development of new online graduate programs.

KEY WORDS: CBE, hybrid, non-term, administration, higher education, competency-based education


1. INTRODUCTION

Online course and degree programs have become commonplace throughout the higher education landscape, with notable recent growth in the area of graduate-level programs that prepare professionals for career advancement (WCET, 2016). These programs seek to provide what many institutions believe are innovative new graduate course and program offerings (Allen and Seaman, 2013). As these types of online professional graduate degree programs and courses have continued to proliferate, the literature related to faculty development models to support online learning has remained largely stagnant, with most models focused on instructional design support and training in the use of instructional technologies (Shah et al., 2014; Shattuck, 2009). Both the research and practitioner literature support the value of the instructional design and technology support components of faculty development for effective and productive online teaching.

This mixed-methods study examined faculty development within one online, competency-based professional program guided by a cognitive apprenticeship framework. That framework integrated a comprehensive set of program-wide standards for online teaching and learning with a corresponding set of support mechanisms, services, and instructional design personnel to coach faculty at each phase of course design and teaching in this new medium. Central to this comprehensive bundle of program supports was the ongoing collaborative relationship between the instructional design team, the programʼs leadership, and each individual faculty member.

2. PURPOSE AND RESEARCH QUESTIONS

This study examined the experiences and feedback of a group of sixteen faculty members who participated in the aforementioned online learning development process, which was implemented at the outset of a new online graduate certificate program. More specifically, the study documented the faculty membersʼ self-reported: a) levels of professional development; b) changes in instructional approaches, such as assessment and facilitating online interaction; c) observations of the teaching and learning process in terms of student learning and competency attainment; and d) impressions of the overall translation of teaching expertise from the classroom to the online format using this newly developed set of course delivery standards and program delivery format. The study addressed four research questions:

  1. What was the overall faculty experience in terms of confidence, satisfaction, and personal development within the process of design and development given this programʼs newly developed delivery standards and delivery format?
  2. In what ways did the role of the programʼs instructional design team (who were embedded in both program leadership and the teaching and learning process) influence the faculty experiences in design, development, and delivery of their courses?
  3. What features of this ongoing course design, delivery, and improvement process were found to be most essential in supporting faculty development?
  4. What aspects of cognitive apprenticeship emerged as the faculty members translated existing teaching and learning skills to the online format and this programʼs structure and culture?

This study provides a window into one academic institutionʼs efforts to develop and improve a competency-based online graduate program. It informs program development, the role instructional design teams can play in the design of instruction, and how instructional designers can support faculty with course technology. These traditional components of online faculty development have been regularly examined across the practitioner and research literature (Moore, 2005) and are generally regarded as important for program success (Allen and Seaman, 2013). This study also documents faculty reflections on their personal and professional development as online educators, and how this programʼs design, the role of administrators, the programʼs culture, one-on-one mentoring, and coaching influenced the culture of online teaching and individual development of faculty expertise. These topics closely relate to recent attempts in the literature to tie socioculturally-oriented management and organizational learning theory to faculty development approaches. This includes the ideas of shaping the job role of online faculty to support their development and practice (Friedman et al., 2017), the value of developing a culture of knowledge sharing and faculty learning communities relating to online teaching (Eib and Miller, 2006) and the value of the coaching relationship between instructional designer and faculty member (Barker, 2003).

3. BACKGROUND

The delivery of graduate-level distance education programs has become increasingly common amongst institutions of higher education (WCET, 2016). The vast majority of institutions delivering these programs have recognized the need to offer support services for faculty who are in the process of developing and teaching online courses. While some type of support is commonplace (Meyer, 2013), the types of support and services provided may vary based on the emphasis an institution places on the different phases of the course design, delivery, and improvement process and the various “soft” and technical skills needed to teach online and successfully interact with students (Clinefelter, 2012; Sammons and Ruth, 2007; Shattuck, 2009; Shah et al., 2014; Yang and Cornelious, 2005). Barker (2003) divides the primary requirements to support online faculty into two categories: 1) instructional design and 2) technology support. Technology support may include hardware training, software training, ongoing technical support, and authoring software.

Instructional design support that focuses on the alignment of learning objectives with course structure, content, instructional methods and outcomes, while deemphasizing technology in the early stages of course design and development, has been found to be critical for faculty and tied directly to student learning outcomes (DʼAgustino, 2012; Palloff and Pratt, 2001; Torrisi and Davis, 2000). This design support is often conducted one-on-one with a faculty member using a consulting model. Barker (2003) emphasized the value of extending formal instructional design models to include faculty peer-to-peer mentoring and mentoring by instructional design staff.

Why is instructional design support important in supporting faculty in their transition to online learning? As synthesized by Clay (1999), there are several factors that can negatively influence and inhibit faculty participation in distance education, including: additional time required for course preparation, unclear faculty responsibilities, lack of institutional support, questions surrounding curricular quality and negative faculty perceptions. After 20 years of evolving distance education practices, these same factors are still found to negatively influence the faculty experience, as confirmed by a meta-analysis of faculty perceptions of teaching online from 1995–2015 (Wingo et al., 2017).

While the challenges for online faculty have remained largely static, understanding of how those challenges can be addressed by institutions has evolved. In terms of teaching and learning support, Lancaster and colleagues (2014) found that faculty development programs that feature consultation models with pedagogical experts or faculty peers had the power to “rekindle their motivation and enthusiasm, and improve their knowledge, behaviors, and dissemination of skills” (p. 1). Beyond support for course design and delivery as well as use of technology, there has also been a shift over the past decade to an emphasis on supporting faculty in fostering online interaction with and between students and student-centered design of online experiences, especially as it relates to the achievement of stated learning objectives. Faculty have been shown to be more satisfied with their courses when collaborative support, such as one-on-one consultation with an instructional designer or with peer instructors, is provided to achieve these ends (Puzziferro-Schnitzer, 2005; Wingo et al., 2017). For example, in a study of engineering faculty, Finelli and colleagues (2008) found that instructional consultants could play an important role in helping faculty interpret data regarding the student population and teaching environment to appropriately design learning to meet the needs of students.

The rationale for this study dovetails with evolving evidence that faculty are seeking collaboration with and can benefit greatly from coaching and mentoring from capable instructional designers who can help them to achieve what many would consider progressive changes in the design of their instruction and delivery using new technologies. The value of more comprehensive coaching and mentorship models for instructional support is a recurring theme within the faculty development literature. Kebaetse and Sims (2016) conducted a meta-analysis and noted that coaching, scaffolding, exploration, modeling and reflection were recurring themes and features of different faculty support models. They also noted the similarities between these themes and those that are typically associated with cognitive apprenticeship (Brown et al., 1989; Collins et al., 1991) but that an apprenticeship model, cognitive or otherwise, might not fully depict the successful coaching and mentorship approaches that can be effectively used when working with a faculty member. This, in large part, is because the faculty member is not necessarily training for a pre-defined role as “apprentice” but rather self-identifying areas he or she would like to develop further in collaboration with the instructional designer in a consulting capacity (Brinko, 2012). Issues of power distance and academic hierarchy (Hofstede et al., 2010) between the faculty member and instructional designer may also play a role as the multiple dimensions of the consulting relationship are established and maintained.

Understandings of cognitive apprenticeship are undergirded by theoretical foundations in situated learning and cognition theory (Lave and Wenger, 1991), which emphasize the importance of authentic contexts for learning and the value of observing more experienced others performing both physical and cognitive tasks in those contexts. Within those authentic contexts, learners can benefit greatly from legitimate peripheral participation (Brown et al., 1989), learning complex skills through guided practice and, eventually, improvisation with the assistance of expert coaching and mentorship.

This study builds upon previously discussed research, including the value of consultation between faculty and instructional designer, the benefits of utilizing a cognitive apprenticeship type approach and the importance of mentorship and leadership that supports individual development. Along those lines, many of the cultural and operational dimensions of online programs have the potential to be positively influenced by the inclusion of experienced instructional designers, as argued by a faculty working group at MIT (Willcox et al., 2016), who recommend redefining and expanding the role of these types of technological and curricular experts into one of a “learning engineer” who lends guidance to all aspects of an academic programʼs activities.

4. A COGNITIVE APPRENTICESHIP-BASED FACULTY DEVELOPMENT PROGRAM

The Health Information Technology Graduate Certificate Program (HIT Program) was funded in large part by a federal grant and developed primarily to train existing healthcare and technology professionals in the complementary skills needed to enter the interdisciplinary Health Information Technology (HIT) workplace. At the programʼs outset, program administrators recognized that the program would need to address the needs of students with backgrounds in the healthcare industry and technology systems management. Interdisciplinary skills building would also be required for the HIT programʼs curriculum to be effective. The program administrators recognized early on that achieving these dual aims in an online delivery format, within a school where little online learning infrastructure or experience existed, would be a challenge.

In response to this challenge, program administrators took the novel approach of embedding two experienced instructional designers within the programʼs administrative leadership team. Each of these instructional designers could be described as expert or senior, with each possessing at least 10 years of experience designing and developing graduate and professional education programs. Both instructional designersʼ previous work had largely focused on delivering computer-based and internet-delivered learning experiences within the health sciences, engineering, technology, and public interest areas. The instructional designersʼ role included membership on the programʼs advisory committee and participation in regularly occurring planning meetings to ensure teaching and learning concerns were considered in all facets of decision making.

At the outset, the instructional design team was tasked with the creation and implementation of a course design, quality improvement and faculty development process to rapidly transition the faculty to the online format. Program leadership encouraged the instructional design team to think broadly about all of the points in the teaching and learning process where faculty might need pedagogical or technical coaching and encouraged the development of structures and systems that could support them in all aspects of their online faculty roles. In collaboration with program leadership, the team created overarching, documented program standards for online teaching and learning, with emphasis on providing consistent high quality in course delivery. The team also established support mechanisms and services to mentor and guide faculty in integrating the standards into their online course designs.

The team took the view that this would be an ongoing learning and performance improvement process for each faculty participant. The goal was for all participants to master a complex set of interrelated soft and hard skills to support their unique disciplinary expertise, beliefs about the instructional process, and preferred methods of interacting with and assessing students. Based on this understanding, the instructional design team used theory-based, informal adult learning and performance support approaches, making use of fundamental principles of andragogy (Knowles, 1984), informal workplace learning (Raybould, 1995), and reflective practice (Schon, 1984; Johns, 2006). In addition, given the call to design and implement a comprehensive process and infrastructure, the team looked to sociocultural models for developing expert performance, specifically situated learning, communities of practice and cognitive apprenticeship (Brown et al., 1989; Wenger et al., 2002). The team developed a set of program practices for the delivery of online courses and considered the types of coaching faculty might need throughout the academic term. The team also developed a set of asynchronous and synchronous course delivery approaches and developed a plan for teaching check-ins during initial offerings of each course and for quality improvement meetings at the end of each course offering. To build trust and rapport and keep faculty engaged and motivated at each point of the process, the instructional design team worked to make each phase of this cycle transparent and seamless for faculty and to be continuously responsive to their individual needs.

This innovative faculty development model was unique in considering the online teaching and learning experience in total, beginning with developing program policies that aligned with the curriculum, course technology, individual course design, course delivery and ongoing course quality improvement. Inviting instructional designers to participate in the administrative planning of this program and expanding their role into the development of informal systems for faculty development enabled them to engage outside their traditional roles as designers of compartmentalized curriculum. This holistic, collaborative model helped to shape the teaching and learning culture and contributed to a high-quality, successful program. Positive outcomes were seen in the 250 students who completed and graduated from the program in a three-year period, many of whom made the transition to the Health IT career space successfully (Preston et al., 2013).

In conjunction with the programʼs Principal Investigator, Academic Program Director, and an Advisory Committee composed of senior faculty and senior university administrators, the instructional design team crafted a program model that aimed to support the comprehensive development of faculty members as online instructors. This began by putting program policies in place that described the methods and duration of content and interaction with students, selecting a standard set of technologies that would support those asynchronous and synchronous interaction methods, and developing a standardized online course template and a rapid paper-prototyping process for course design/re-design that mapped directly to the template using standards derived from the Quality Matters rubric (Quality Matters, 2014).

Securing initial buy-in from faculty was a high priority for this process. At the beginning of each new course build, the Academic Program Director and the instructional design team would sit down with each faculty member for a face-to-face meeting where the program standards and the course design and development process were introduced. Clear expectations and methods for how these online courses should be delivered and taught also appeared to influence faculty buy-in. It is noteworthy that the programʼs leadership made efforts to meet with each faculty member before they began work with the instructional design team, which likely alleviated a good deal of uncertainty about the process, its value, and its importance.

The course design process consisted of several collaborative meetings with the instructional design team, during which faculty sketched out the weeks of their course in a Microsoft Word planning document that utilized a backwards planning and design model. The plan for each week of the course, including how each weekʼs learning outcomes (observable measures of learning) aligned with content, activities, and assessments, was developed in consultation with a member of the instructional design team. Following this initial course design phase, the instructional designer further collaborated with faculty on how best to translate existing teaching methods and assignments to the online environment. It was the instructional design teamʼs belief that the redesign of courses for online delivery is also an opportunity to revisit many aspects of the teaching and learning process and make them more active and more effective, regardless of the course delivery approach. The majority of the faculty seemed to embrace this opportunity to revisit the design of their existing assignments with a focus on student achievement of stated learning objectives.

The previously described planning document closely mirrored a program-wide online course template that was developed for use in the Universityʼs Learning Management System (LMS). This afforded two potential benefits: 1) faculty focused solely on the design of instruction before working with course technologies, and 2) all planning work could easily be copied and pasted from the planning document into corresponding placeholders in the online template. Only after the course template was populated were faculty trained in the use of course technologies, including creating narrated lecture videos, running synchronous sessions with students using the LMSʼs integrated meeting platform, managing tests and other assessments, inputting feedback and grades, and managing the course itself.
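The one-to-one correspondence between the planning document and the LMS template can be illustrated with a short sketch. This is a hypothetical illustration only; the WeekPlan structure, field names, and placeholder naming are assumptions rather than artifacts of the actual program, but they show how a backwards-design week plan could map directly onto template placeholders.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class WeekPlan:
    """One week of the backwards-design planning document (field names are hypothetical)."""
    week: int
    learning_outcomes: List[str]   # observable measures of learning for the week
    content: List[str]             # readings, narrated lectures, and other materials
    activities: List[str]          # asynchronous and synchronous learning activities
    assessments: List[str]         # quizzes, assignments, and rubrics

def to_template_placeholders(plan: WeekPlan) -> Dict[str, List[str]]:
    """Map a week's plan onto placeholder sections of a course template, so completed
    planning text can be transferred directly into the corresponding LMS shell."""
    return {
        f"week_{plan.week}_outcomes": plan.learning_outcomes,
        f"week_{plan.week}_content": plan.content,
        f"week_{plan.week}_activities": plan.activities,
        f"week_{plan.week}_assessments": plan.assessments,
    }

# Example: a single week sketched during a design meeting (illustrative content)
week3 = WeekPlan(
    week=3,
    learning_outcomes=["Compare common health information exchange standards"],
    content=["Narrated lecture: overview of interoperability standards"],
    activities=["Synchronous case discussion", "Threaded discussion post"],
    assessments=["Standards comparison memo (graded with rubric)"],
)
print(to_template_placeholders(week3))
```

In this sketch, the design work lives entirely in the plan before any technology is touched, and the mapping step is the "copy and paste" from planning document to template that the program relied on.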

As the program matured over the course of the first year, a standard set of instructor coaching check-ins and steps were established, including regular check-in meetings to discuss course progress with the instructional design team. The team began holding regular program faculty meetings focused almost solely on the sharing of teaching and learning experiences. These check-ins also served the purpose of coaching faculty through unfamiliar aspects of the course technologies, especially as they related to facilitating interaction with students and the mechanics of assessments.

At the conclusion of either the first or second offering of a course, the instructional designers reviewed the online course design using the Quality Matters rubric (2014). This review also incorporated an analysis of student course evaluations and evidence of learning captured in the learning management system. The result was a written report on course quality, areas of success, and areas of suggested improvement. The instructional designers then met with each faculty member to review the course design and its delivery in light of the review findings. From this 60–90 minute dialogue, a quality improvement plan for the course was developed. These plans ranged from a comprehensive full re-design to suggestions on how to increase interactivity between students or how to enhance live sessions using methods such as case discussions.

This study was conducted two years after the launch of the program. As a result, all faculty participants had substantial exposure to the process described above before responding to the questionnaire. It should be noted that several of the authors of this paper served in the role of either instructional designer or program administrator. While that personal experience with the process has the potential to introduce bias, it also provided the authors with insight that a) the results of this research may be helpful and generalizable to other institutions, and b) the comprehensive nature of the faculty development program, and its intent to help faculty translate existing teaching and learning expertise to an online format, did in fact establish elements of what Brown, Collins, and Duguid (1989) refer to as a cognitive apprenticeship, as intended by the programʼs instructional designers.

5. METHODS

This study focused on the relationship between the instructional designers and faculty members, those faculty membersʼ perceptions of their own development, and the online teaching and learning process as it unfolded over a two-year period. The research team developed a detailed questionnaire to capture faculty perceptions of that process, the changes they may have experienced, the perceived effects on their own teaching practices, and their observations of student satisfaction and learning in their courses.

6. THE QUESTIONNAIRE

Faculty data regarding their experiences in the program were captured using a 26-question web-based instrument developed to capture feedback from participants (attached here as Appendix A). A web-based questionnaire format was selected because eight of the twenty-four faculty who taught for this program were distributed across the United States or resided internationally. In addition, many of the faculty were professionally employed in management roles with Health IT and technology firms, and their participation needed to be accommodated asynchronously via the internet where possible. For these two reasons, face-to-face interview methods were not feasible.

The questionnaire was designed to gather both quantitative and qualitative data. Multiple choice or Likert-type questions relating to some level of observed or personally experienced change were followed by an open-ended question intended to capture personal reflections on that change. This questionnaire development strategy aligns well with recommendations from Fowler (1995), who notes that direct questions are the preferred method for capturing most data, but that open-ended questions have their place when a question is “virtually impossible to answer in a few words” (p. 178) and when the number and type of responses vary greatly.

The questionnaire was distributed after the initial three-year period of the programʼs operation. At this stage, the programʼs policies, design, and delivery model had been implemented across its 16 courses and utilized by the 24 instructors who taught those courses. The questionnaire began by gathering demographic data and historical data on each participantʼs face-to-face teaching experience before beginning this program. The questionnaire then asked about experiences with distance education prior to engaging with this program. The questionnaire contained several comparative question pairs using Likert scales asking the faculty participants to self-report their personal development before and after engaging with the program. Open-ended qualitative responses allowed participants to describe what areas (if any) of this faculty development program had value, where they saw some personal growth as instructors, and whether they had observed any change in student outcomes. The questionnaire was reviewed by a group of four faculty members with experience teaching online for face validity, clarity, and general feedback. That groupʼs recommendations were integrated into the questionnaire prior to distribution.

Sixteen of the 24 faculty who were involved with teaching in this online program responded to the questionnaire. Of that group of 16 instructors, 13 had been teaching graduate-level courses for five or more years before getting involved with this program. The respondents fell into three distinct groups: tenure track faculty (35%), full-time/part-time non-tenure track instructional faculty (35%), and adjunct/practitioner faculty (30%). It should be noted that full-time/part-time non-tenure track instructional faculty in this particular school focus primarily on teaching and have a prominent role alongside their tenure track counterparts in terms of participation in school and departmental meetings, activities, and service.

7. ANALYSIS

Data analysis for this study was conducted in three phases. In the initial phase all categorical and numerical data captured through the questionnaire were analyzed using descriptive statistics to identify demographic and experiential properties of the faculty population. Through that initial analysis, the research team was able to identify a number of commonalities and trends within the study population and their experiences, both prior to engaging with this program and afterwards. These trends were used in conjunction with the original organization of the questionnaire to form question clusters, with the intent of performing a cluster analysis of the qualitative comments collected. Those question clusters contained information regarding: demographics, experience, confidence, relationship with instructional designer (ID), value of ID, observed changes in student outcomes and changes in instructional methods.

Qualitative comments captured in the questionnaire were coded based on the research questions being examined in the second phase of analysis. Through that coding analysis, several themes emerged which provided insight into faculty membersʼ personal development while designing and preparing to teach their online courses within program standards, their relationship with the instructional design team that coached them through each stage of the process, and changes (if any) in their teaching practices that resulted from participation in this program. In the final phase of data analysis, thematic data were analyzed alongside the question cluster data to identify relationships between cluster responses and the emerging themes. Of particular interest to the team was the relationship between question-cluster responses and individually reported changes in teaching and learning approaches.
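The general shape of this three-phase analysis can be sketched in a few lines of code. The sketch below is illustrative only, assuming a hypothetical export of the questionnaire with made-up column names and analyst-assigned theme codes; it is not the analysis code used in the study, but it shows how descriptive statistics, question clusters, and emergent themes could be related.

```python
import pandas as pd

# Hypothetical export of the questionnaire: one row per respondent, Likert items
# coded 1-5, plus an analyst-assigned theme code from the phase-two qualitative coding.
responses = pd.read_csv("faculty_questionnaire.csv")

# Phase 1: descriptive statistics on categorical and numerical items
likert_items = ["confidence_before", "confidence_after", "collaboration_value"]
print(responses[likert_items].describe())
print(responses["faculty_type"].value_counts(normalize=True))

# Phase 2: question clusters group related items (e.g., demographics, experience,
# confidence, relationship with the ID, value of the ID, student outcomes, methods)
clusters = {
    "confidence": ["confidence_before", "confidence_after"],
    "id_relationship": ["collaboration_value", "weekly_meeting_hours"],
}

# Phase 3: relate cluster responses to the themes that emerged from open-ended comments
for cluster_name, items in clusters.items():
    means_by_theme = responses.groupby("theme_code")[items].mean()
    print(f"\nCluster '{cluster_name}' mean responses by emergent theme:\n{means_by_theme}")
```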

8. RESULTS

Overall, faculty feedback on their experience with the course development process and the ongoing consulting relationship they maintained with the instructional design team was positive and encouraging. Faculty participants self-reported increases in confidence and satisfaction with their online teaching, which were also reflected in their positive impressions of the student experience. Faculty also reported improvements in student interactions in their courses and in student performance as their work progressed with the instructional design team. While the majority of the faculty indicated some type of positive change in their teaching and learning process, the types of self-reported changes were unique to each instructorʼs pedagogical approach, personal goals in bringing their course online, and content area. In general, faculty valued their work with, and the support offered by, the instructional designer; this resulted in self-reported positive outcomes in a number of different areas specific to their needs and personal development.

9. FACULTY CONFIDENCE

All but one of the participants came into the program feeling confident in the regular face-to-face classroom, with 62% of the participants expressing that they were “very confident” and 31% expressing that they were “reasonably confident” teaching in that environment. A substantial portion of the group had some online teaching experience, with 50% reporting that they had taught online previously. That online teaching experience came mostly in the form of fully asynchronous courses that relied heavily on self-study and threaded discussion methods for course delivery. Among instructors who had taught online previously, 25% reported being “somewhat comfortable,” 37.5% reported being “reasonably comfortable,” and 37.5% reported being “very comfortable” teaching online. These self-reported data strongly suggest that the faculty who were selected to participate as instructors in this program were an experienced group who possessed mid-to-high levels of confidence in their teaching abilities.

In terms of previous formal training, this group of instructors fell into one of two groups. One group (46.7%) reported that they had received some formal training, including mentorship or participation in university workshops. The other group (46.7%) reported being “self-taught,” which appears largely to have taken place through trial and error in the classroom, personal research into learning theory, and preparation for individual conference presentations, suggesting an emphasis on didactic instruction for these instructors.

10. CHANGES IN FACULTY CONFIDENCE IN ONLINE TEACHING

As shown in Table 1, among the 50% of faculty participants who had some online teaching experience, 75% reported that they felt “very confident” teaching in the online environment after participating in the program. The remaining 25% reported feeling “reasonably confident.” This represents a one-point increase on the Likert scale, from “somewhat confident” to “reasonably confident” or from “reasonably confident” to “very confident,” for those faculty who indicated that they did not begin in the “very confident” self-reported state. This change in faculty confidence took place after engaging in the consulting relationship with the instructional designer for two or more academic terms and teaching within the larger learning ecosystem.


TABLE 1: Faculty Online Confidence Before and After Working with ID.

Confidence Teaching Online     Before Working with ID     After Working with ID
Not very                       0.00% (0)                  0.00% (0)
Somewhat comfortable           25.00% (2)                 0.00% (0)
Reasonably confident           37.50% (3)                 25.00% (2)
Very confident                 37.50% (3)                 75.00% (6)
Total                          8                          8
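As a quick arithmetic check on Table 1, the short sketch below tabulates the underlying counts and reproduces the reported percentages. It is an illustration of the tabulation only, not the analysis code used in the study.

```python
import pandas as pd

# Counts reproduced from Table 1 (the 8 faculty with prior online teaching experience)
levels = ["Not very", "Somewhat comfortable", "Reasonably confident", "Very confident"]
before = pd.Series([0, 2, 3, 3], index=levels, name="Before working with ID (n)")
after = pd.Series([0, 0, 2, 6], index=levels, name="After working with ID (n)")

table = pd.concat([before, after], axis=1)
table["Before %"] = (before / before.sum() * 100).round(2)
table["After %"] = (after / after.sum() * 100).round(2)
print(table)  # reproduces the 37.50% -> 75.00% shift toward "Very confident"
```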

11. THE CONSULTING RELATIONSHIP AND ITS STRUCTURE

The majority of the programʼs faculty participants had not previously worked with an instructional designer or other curriculum/pedagogy expert, with 86.67% of faculty reporting that this was their first experience of this kind. The amount of time faculty invested in the consulting relationship with the instructional designer and the process varied greatly, depending on need, the maturity of the course, and interest. All faculty teaching in the program were strongly encouraged to work with the instructional designer, but no expectations were set on the time commitment. The majority of faculty who led the initial design of an online course (73.3% of respondents) spent a two-to-three-month period of intense work with the instructional designer, planning the course, building out multimedia assets, and configuring the course in Blackboard. During this period, faculty met weekly with the instructional design team, with many (53.33%) opting for a 30-to-60 minute meeting. A smaller subset committed deeply to the process and identified components of their course on which they wanted to collaborate more intensively; these faculty met several times per week with the instructional designer for 2–3 hours (33.3%) or 4–5 hours (13.33%). Those faculty who were later hired to teach those same courses used the online course package designed at the outset by the original faculty member, complete with a full syllabus; a fully developed course in Blackboard containing weekly course modules and pre-recorded lecture content; asynchronous course activities that utilized various course technologies; and assignments with full rubrics. For these later instructors, ongoing quality improvement and online teaching coaching during the academic term were the primary activities they engaged in with the instructional designer. It is important to note that the questionnaire did not differentiate between the original faculty who designed a course in collaboration with the instructional designer and the faculty who later inherited and taught an existing online course.

12. VALUE OF THE COLLABORATION

When asked to evaluate the overall value of the collaboration with the instructional designer, 92.31% of faculty reported that they found substantial value in the collaboration, with 69.23% selecting the highest Likert value of “valuable.” One faculty respondent selected “less valuable,” yielding an average for that item of 4.54 out of 5 for the value of the collaboration. One faculty member noted, “I cannot understate the value of the quick response and trouble-shooting from local design help; without it I would not have been so ambitious about my own design. And I think the real value was I was able to increase student engagement and interest for better learning outcomes.” This comment underscored the importance of instructional technology support in the broader context of teaching and learning (although this component of the larger faculty development process did not emerge frequently in the questionnaire results).

13. IMMEDIATE CHANGES IN ONLINE TEACHING APPROACH

As shown in Table 2, faculty reported that as the relationship with the instructional designer progressed from term to term, a number of components of their online courses were influenced. A majority of faculty indicated that they became more focused on course and content flow, on course learning objectives, and on aligning course content and assignments with those objectives; they also reported changes in their assessment of student learning. For some faculty, the instructional design relationship also appears to have influenced their approach to collaborative activities, assignments, and applied work.


TABLE 2: Online Teaching Components Influenced by Collaboration with ID.

Component                                                        Responses
Course flow and content                                          67% (10)
Increased focus on learning objectives and design/alignment      60% (9)
Assessment of student learning                                   53% (8)
Collaborative activities                                         47% (7)
Assignments and applied work                                     33% (5)
Multimedia and instructional technologies                        20% (3)
Total respondents                                                15

14. LONG TERM PERSONAL DEVELOPMENT

Faculty reported that the consulting relationship with the instructional designers had a substantial impact on their teaching practices, with 93.3% of faculty reporting that their online instructional practices had changed after working with the instructional designer for one year. More holistically, 66.67% found that their on-campus teaching practices had also changed based on their experiences with the program and the instructional designer.

15. INTENTIONAL PLANNING AND DESIGN

Half of all respondents noted personal changes in intentional planning of course components, course design and learning objectives. Three of the faculty noted the continued usefulness of the programʼs backwards planning tool, with one of the faculty emphasizing the importance of, “using course design template to outline the entire course design BEFORE I started recording lectures and creating assignments (without the designer I would have just jumped in and started creating products without in-depth design).” Faculty also noted newly found attentiveness to “development of weekly learning objectives (beyond the course objectives)” and thinking “about my courses in a much more systematic way than I did prior to working with the Instructional Designer.” One faculty member in particular made an effort to describe their personal process and development during the relationship with the instructional designer, noting the importance of “reviewing examples and a number of discussions about content and course layout.”

This group of faculty appear to have been influenced by the backward design based method of thinking about curriculum design, as 67% of faculty reported that this process had also changed the ways in which they teach their on-campus courses. Eighty percent of faculty respondents noted that this process had changed the ways in which they go about modifying and improving their existing courses.

16. ASSESSMENT STRATEGIES

Closely tied to course design, a second area that appeared to have undergone substantial change was the faculty participantsʼ use of assessment strategies. Eighty percent of the faculty participants reported changing their assessment strategies in some way based on their work in this program with the instructional design team. Of the 11 respondents who chose to elaborate in writing on the changes to their own assessment methods, 7 mentioned a shift away from a single end-of-term high-stakes exam or project towards more varied and frequent assessments, with one faculty member noting a shift towards more “meaningful formative assessment (also increased engagement) and assess more diverse activities.” For those faculty who employ testing as an assessment method, four noted that they now employ “more frequent quizzes,” with two of the faculty noting that they now allow Blackboard to automatically grade those quizzes. Another faculty member noted a change in overall assessment philosophy, stating, “I was more focused on student learning goals, rather than a positive assessment.”

17. STUDENT ENGAGEMENT

The third area of reported change was improving or establishing new methods of student engagement. Sixty percent of those providing written comments indicated long-term changes in this area. Three of the respondents indicated a shift toward synchronous class sessions that were more interactive and discussion- or skills-focused, with one faculty member noting a “build up (of) student work and engagement especially around (using) Harvard case method online.” Four of the faculty respondents noted that they now utilize an integrated “combination of learning methods,” asynchronous and synchronous engagement strategies, including an “in-depth discussion topic for each weekʼs session,” “weekly checklists,” and a shift towards required participation that was integrated into the grading scheme. Student engagement appeared to be an area that benefited a good deal from the collaborative design work, with 87% of faculty participants self-reporting some level of improvement in engagement, and 53% of those faculty characterizing those gains as “moderate” or “major.” One of the faculty noted of their synchronous class sessions that, “As student participation increased, the sessions became much more engaging and student learning increased significantly… and they reported high satisfaction with these sessions.”

The majority of faculty (86.7%) indicated that some improvement in student engagement occurred after engaging with the instructional designer in the ongoing quality improvement process that took place at the conclusion of each term: 33.3% noted a major improvement; 20% noted moderate improvement; 33.3% noted minor improvement and 13.3% reported no change. Improved engagement was a recurring theme throughout the questionnaire results, with commentary from several faculty on the consulting relationship such as “I think the real value was I was able to increase student engagement and interest for better learning outcomes” and “As student participation increased, the sessions became much more engaging and student learning increased significantly.” One faculty member noted ongoing challenges adapting to the online teaching environment, citing, “It is still difficult to get some students engaged with an online format but overall the level of engagement has improved”.

18. USE OF INSTRUCTIONAL TECHNOLOGIES

The fourth area where faculty reported influence was the use of instructional technologies. Three of the faculty noted strategies for using or creating multimedia content. Two others reported insight gained into using the Blackboard LMS for testing, with one noting the “functionality in Blackboard/LMS for creation and automatic grading of assignments and tests – requires additional upfront work, but really improves workflow and timeliness of feedback to students.”

19. FACULTY OBSERVED CHANGES IN STUDENT BEHAVIORS AND COURSE EVALUATIONS

Faculty who developed and taught an online course in this program were a part of an ongoing course and teaching quality improvement process, and two of the primary feedback sources were observed student performance and formal feedback from course evaluations. Faculty noted a number of changes in student outcomes as they progressed through this process in consultation with the instructional designer. Notable within these results are the self-reported gains in improved student work product after the collaboration between faculty and the instructional designer. Eighty percent of faculty noted some level of change in observable student output, with 60% of faculty reporting those improvements as “moderate” or “major”. While it is not possible to demonstrate that the relationship is causal, it should be noted that similar levels of improvement were also seen in terms of student engagement in these same courses.

This online program did not follow the regular university calendar, and because of this non-standard scheduling, regular course evaluations did not take place every academic term as desired. Of those faculty who did receive course evaluations both before and after the first quality improvement cycle was complete (9 of the 16 faculty participants), 5 reported an observed increase in student satisfaction in the evaluation data, with 3 of those instructors noting an improvement of a full point on the 5-point scale for overall student satisfaction with their course. Four reported no substantive change in student evaluation results. Given that many of these online courses were rated highly by students upon their initial release, it may well be that the lack of change reported here is the result of already high ratings (4.0 or higher on a 5-point scale) and that finer-grained improvements to the course, and their subsequent effects on student satisfaction, could not be captured from term to term in course evaluations.

20. DISCUSSION

This study contributes a new approach in the form of an expanded role for instructional designers, extending their sphere of influence beyond course design support to that of a learning ecosystem consultant at all levels of an academic program. This expansion is very much in line with recent recommendations from faculty working groups (Willcox et al., 2016) and researchers alike (Kebaetse and Sims, 2016), who have indicated that pedagogical consultants can have a positive impact on the decision making of program administration, provide online teaching coaching for faculty during the course offering, provide quality improvement consultation at various points in the teaching and learning process, and recommend shifts in program policies and services based on their close working relationship with faculty. This expanded role of the instructional designer, and the formal and informal learning and development supports they provide for faculty, are also in line with many of the recommendations for the development of expertise in authentic contexts via cognitive apprenticeship (Brown et al., 1989; Collins et al., 1991). The positive results self-reported by faculty participants here are the end result of building and maintaining this learning infrastructure and of faculty experiences with the facilitators of that process, the instructional designers.

Faculty credited improvements in their own instructional confidence and satisfaction with their online courses based on their work with the programʼs instructional design team and related support services. Addressing these affective dimensions of the online faculty experience was considered a priority by the programʼs leadership, who during its inception had some concerns about their facultyʼs acceptance of online learning as a comparable and legitimate alternative to residential course delivery – a challenge facing many distance programs (Clay, 1999; Wingo et al., 2017) that appears to have been adequately met here. Underpinning these affective increases were two very notable overall impacts on teaching and learning: 1) faculty reported an improvement in the student experience and their performance, and 2) faculty underwent personal changes in their teaching and learning practices.

21. CHANGE IN TEACHING AND LEARNING PRACTICES

With the exception of one participant, all faculty in the program indicated that their approach to online course development had evolved through their work in this program and with the instructional design team. The majority of faculty noted that their approach to assessment, course revisions, design, and facilitation of course activities had changed as a result of that same relationship. Additionally, 67% of the faculty noted that their on-campus teaching had changed in some way based on their involvement in this program. These self-reported data on personal development as an instructor in both the online and residential formats provide a good deal of affirmation that meaningful faculty development occurred as a result of the integrated set of tools and processes put in place to develop this program. These again appear to map to other best practices for online programs described by Coburn and Collins (2014) and Clinefelter (2012), including an “orientation to the institution (program),” “guidance on culture and practice” of teaching and learning, “adequate training in fundamental teaching and classroom… skills” (Coburn and Collins, pp. 1–2), and providing clear structures and administrative guidance. Further, providing job role clarification for faculty within complex systems has been noted as beneficial (Friedman et al., 2017).

Beyond these structures, administrative leadership that encouraged full participation in faculty development activities and firmly supported the teaching and learning approach also provided a great deal of encouragement for faculty to stay engaged and develop not only new online courses, but corresponding pedagogical skills as well. This type of leadership engagement can be motivating for faculty (Arenas, 2009) and appeared to validate the ongoing involvement of the instructional design team within each instructorʼs course. This finding is supported by a number of works on the influence of servant leadership and administration on faculty engagement, performance and organizational learning within academic programs (Arenas, 2009; Kezar and Lester, 2009; Russel, 2012).

22. VALUING COLLABORATION

The vast majority (93%) of faculty reported finding substantive value in their continued work with the instructional design team. Faculty indicated that they benefited most from coaching and support related to engagement strategies, course design, educational technology selection, assessment strategies, ongoing quality improvement, and student-centered teaching approaches. This type of individualized and ongoing consultation, inclusive of instructorsʼ needs throughout the life-cycle of a course (design, development, teaching and learning, and quality improvement), has been shown to be an ideal approach to achieve the outcomes described above (Puzziferro-Schnitzer, 2005; Wingo et al., 2017).

Two common themes emphasized during consultation with the instructional design team were putting learning goals first and student-centered approaches to online learning. For some faculty participants who were trained to employ more traditional pedagogies, these closely paired teaching and learning approaches may have been somewhat foreign or initially perceived as “spoon feeding” students. Again, with support from administrative leadership to explore these alternative approaches to teaching their courses, the majority of faculty implemented new or enhanced teaching practices that utilized accepted instructional design principles and an enhanced focus on building up students toward achieving stated learning objectives. This provides a concrete example of the Lancaster et al. (2008) findings that consultative support can motivate, challenge, and expand faculty teaching skills and practices.

23. COGNITIVE APPRENTICESHIP

In using a cognitive apprenticeship model to develop this programʼs pedagogical and programmatic supports, the instructional design team drew on various adult learning approaches that speak to the development of expertise in authentic contexts, informally and in the midst of practice, with the support of experienced consultants and peers (Johns, 2006; Knowles, 1984; Lave and Wenger, 1991; Raybould, 1995; Schon, 1984). Cognitive apprenticeship seemed an appropriate guiding theory for this programʼs faculty because the majority of instructors would be adopting and utilizing a complex set of skills that were new but related to their existing face-to-face teaching expertise, learning by doing while developing and teaching their courses. The intention was to provide each faculty member with a set of formal standards and processes for course development at the outset and, as expertise evolved, to provide individualized guidance and coaching dependent on needs and level of competency.

Faculty questionnaire responses indicate that elements of a cognitive apprenticeship were established with many of the faculty, especially those willing to engage fully in both the process and their consultative relationship with the instructional design team.

Collins et al. (1989) describe six methods of learning support used to establish a cognitive apprenticeship: modeling, coaching, scaffolding, articulation, reflection, and exploration. The following mapping of these elements to our study findings demonstrates the establishment of a cognitive apprenticeship.

23.1 Modeling

Although the instructional design team was not in a position to model excellence in online teaching and learning in the traditional sense, modeling took place by establishing standards and delivery models based on accepted standards of quality that were grounded in research evidence and developed through a rigorous, collaborative process that engaged cross-disciplinary experts in online learning (Shattuck et al., 2014).

23.2 Coaching

Faculty placed great value on their ongoing consulting relationship, coaching and collaborative problem solving with the instructional design team. In terms of coaching, faculty regularly worked with the instructional designer to design course activities, select technologies capable of supporting learning objectives, design assessment and engagement strategies and work through difficult teaching and learning issues. It is fair to say that this relationship was more focused on providing translational guidance. In many ways faculty already possessed a great deal of related face-to-face teaching expertise and for the most part a clear vision of what they hoped to achieve.

23.3 Scaffolding and Articulation

Evidence of scaffolding and articulation was clearest during the initial stages of course development. First, a good deal of scaffolding was provided to faculty through program policies documented in the program delivery guidelines, which described an operational framework that each course should utilize. Scaffolding and the resulting articulation could be seen as each new online course was built using the programʼs paper prototyping model, which asked faculty to carefully articulate and align all course content, activities, and assessments with stated learning objectives. This in turn led to further articulation and collaboration with the instructional design team as each assignment and activity was planned.

23.4 Reflection and Exploration

Once faculty had completed the course design and development phase, they immediately set to utilizing that work with students in the online classroom. This led directly to independent exploration and reflection as faculty observed student reactions, engagement, interactions, and performance from week to week in their courses. The feedback loop between instructor and student within these courses, which is primarily meant to benefit student performance, naturally lends itself to reflective practice on the facultyʼs part. It was here that a similar, but higher-level, form of problem solving took place between the instructional design team and faculty to resolve individualized teaching and learning issues as they arose. Example issues addressed in this way included: adjusting student workload based on negative feedback while remaining conscious of institutional contact requirements; implementing novel interactive technologies to support student learning or enhance motivation in courses where there was a perceived problem; and enhancing existing strategies for synchronous class meetings when live discussion was not as productive as the instructor had hoped.

The exploration process and the identification of new areas for professional development were largely faculty driven. Instructors identified knowledge and skill gaps they wanted to address to further their own development, a common milestone of expertise and the improvisation that follows from it (Collins et al., 1991). One program feature that encouraged deep and rigorous reflection was the modified Quality Matters-based course review process, in which faculty were presented with a thorough evaluation of their courseʼs design and initial delivery to students. This feedback was used to stimulate reflection on earlier teaching practice and to generate discussion of the aspects of their courses that instructors wanted to improve.

Based on the above mapping of the faculty experience in this program, it appears that many faculty did experience aspects of cognitive apprenticeship in their own development. Yet, because of their senior role in the relationship with the instructional designer and their existing face-to-face teaching expertise, the relationship and the types of learning that took place were more consultative than a traditional “apprenticeship.” While this distinction is in large part one of semantics rather than substance, the instructional design team in its consulting role did need to be mindful of that existing expertise and the implied power distance, and to factor both into all discussions and recommendations made to faculty. For this reason, faculty development using this approach might also be thought of as a type of “cognitive consultancy” that emphasizes the coaching, reflective, and guided exploratory elements of a cognitive apprenticeship.

24. LIMITATIONS

The primary limitations of this study were its relatively small sample size and the fact that it was conducted by the same members of the instructional design and administrative team who led the creation of the online academic program and its faculty development process. Faculty who responded to the questionnaire were aware that their responses would become part of a study conducted by the programʼs leadership. While this had the potential to influence responses, no direct evidence of such influence was observed during data analysis.

An additional limitation was the use of faculty self-report data for elements such as student engagement and performance and for facultyʼs own professional development. While faculty impressions of these elements are valuable, they are less accurate than an analysis of actual student data, such as grades, or than administering a faculty development instrument focused on cognitive apprenticeship in a pre-post fashion: once at the start of the faculty development process and again at the two-year mark, when this studyʼs questionnaire was disseminated. That said, because the research team had been embedded within the program throughout its development, the team is confident that the questionnaire data were representative of the faculty experience in this program.

25. RECOMMENDATIONS

Based on the overall findings, the following recommendations are offered to support the development of faculty expertise related to teaching and learning, grounded in a cognitive apprenticeship model.

Program Culture and Environment: Cognitive apprenticeships are most effectively formed when a clear picture of successful performance is presented and when leadership and policy support individual learning, development, and performance.

  • Actively seek out program administration and leadershipʼs support and involvement in the online teaching and learning process. If possible, have senior administrators act as online instructors and early participants in supporting faculty development efforts.
  • Establish a clear set of course design and delivery guidelines for all program courses. In this particular program, these were shared with faculty in the form of a rubric that outlined requirements. By standardizing the operational components of all courses (contact time, communications, general student expectations, etc.), both faculty and students will be free to focus on their roles in the teaching and learning process.
  • Involve instructional designers and other pedagogical experts at all levels of program development, ensuring that the faculty experience and teaching and learning considerations are fully accounted for in decision making.
  • Provide faculty with consultative support from pedagogical experts, such as experienced instructional designers, in the early stages of course development and as an ongoing collaborative resource to build faculty capabilities and expertise over time.

Course Development: Course development models should allow faculty to use their existing teaching expertise and develop that expertise for this new format accordingly.

  • Develop a course design process that emphasizes student learning outcomes and faculty articulation of those outcomes. This approach allows faculty to actively plan how they will work with and support online students in achieving those outcomes.
  • Allow faculty to draw on their existing content and teaching expertise by implementing paper prototyping methods to plan out course components before involving any course technologies. Use that prototype as a jumping-off point for developing new improvisations of their existing practices that can be effective in the online format.
  • At all points, consider student-centered approaches to interaction and engagement in the course and visualize how the instructor and the courseʼs design will play a role in facilitating those activities.

Course Delivery and Ongoing Improvement: Encourage reflective teaching practices and support instructorsʼ independent practice and learning while doing.

  • Encourage instructional design staff to check in frequently with faculty to discuss how the academic term is progressing and what challenges are being encountered, and to offer collaboration in addressing those challenges.
  • Encourage knowledge sharing and collaboration between program faculty to solve thorny instructional and programmatic issues.
  • Revisit the course after its conclusion and engage in a collaborative process to identify portions of the course that could benefit from revision. Using best practices and established course design standards can help ensure this process is thorough and valid.

26. CONCLUSION

This study demonstrates that online programs can benefit greatly from building an infrastructure and internal culture based on a cognitive apprenticeship model: one that supports a collaborative process focused on desired program outcomes and is guided by a set of standards, established as a policy anchor for the programʼs development, for what teaching and learning should look like during course delivery. Based on that established policy, adequate support can be provided to faculty who are new to the challenges of online teaching, with that support informed and provided by experienced instructional designers focused on learning at all levels of the program, including the ongoing learning and development of faculty as online instructors.

Key program supports observed to have contributed to the positive outcomes reported here include: the establishment of course delivery standards; the selection of course technologies that could support engagement; consultation with faculty to translate existing pedagogies into an online course format that fit those program standards and technologies; provision of a model for successful interaction with students; and ongoing consultative support for facultyʼs individual teaching and learning needs. Furthermore, it was observed that establishing mechanisms for ongoing quality improvement of course design and delivery can lead to rapid improvements in online course quality and student satisfaction, while simultaneously informing program standards as issues are addressed, learned from, and integrated into evolving program policy. With these components in place, a more fruitful professional development environment and consulting relationship can be formed between instructional designers and faculty members.

REFERENCES

Allen, I.E. and Seaman, J. (2013), Changing Course: Ten Years of Online Education in the United States, Babson Park MA: Babson Survey Research Group and Quahog Research Group, LLC.

Arenas, J., Bleau, T., Eckvahl, S., Gray, H., Hamner, P., and Powell, K. (2009), Empowering Faculty to Facilitate Distance Education, Academic Leadership: The Online Journal, 7(1), p. 18.

Barker, A. (2003), Faculty Development for Teaching Online: Educational and Technological Issues, J. Contin. Educ. Nurs., 34(6).

Brinko, K.T. (2012), The Interactions of Teaching Improvement, in K.T. Brinko, Practically Speaking: A Sourcebook for Instructional Consultants in Higher Education, Stillwater, OK: New Forum Press, pp. 3–7.

Brown, J.S., Collins, A., and Duguid, P. (1989), Situated Cognition and the Culture of Learning, Educational Researcher, 18(1), pp. 32–42.

Clay, M. (1999), Development of Training and Support Programs for Distance Education Instructors, Online J. Distance Learning Admin., 2, p. 3.

Clinefelter, D. (2012), Best Practices in Online Faculty Development, The Learning House, Inc.

Coburn-Collins, A. (2014), Best Practices for Supporting Adjunct Faculty, Retrieved from the Higher Learning Commission Collection of Papers, Available at: http://cop.hlcommission.org/Learning-Environments/coburn-collins.html [accessed: 2/13/2018].

Collins, A., Brown, J.S., and Holum, A. (1991), Cognitive Apprenticeship: Making Thinking Visible, American Educator, 15(3), Winter, pp. 6–11, 38–46.

Collins, A., Brown, J.S., and Newman, S.E. (1989), Cognitive Apprenticeship: Teaching the Crafts of Reading, Writing, and Mathematics, in Knowing, Learning, and Instruction: Essays in Honor of Robert Glaser, Hillsdale, NJ: Lawrence Erlbaum, pp. 453–494.

DʼAgustino, S. (2012), Toward a Course Conversion Model for Distance Learning: A Review of Best Practices, J. Int. Educ. Business, 5(2), pp. 145–162.

Eib, B. and Miller, P. (2006), Faculty Development as Community Building, Int. Rev. Res. Open Distance Learning, 3(2), pp. 1–15.

Finelli, C.J., Ott, M., Gottfried, A.C., Hershock, C., OʼNeal, C., and Kaplan, M. (2008), Utilizing Instructional Consultations to Enhance the Teaching Performance of Engineering Faculty, J. Eng. Educ., 97(4), pp. 397–411.

Fowler, F.J. Jr. (1995), Improving Survey Questions: Design and Evaluation, Applied Social Research Methods Series, Thousand Oaks, CA: SAGE Publications, vol. 38.

Friedman, B.A., Bonzo, S., and Ketcham, G. (2017), Instructor Satisfaction and Motivation in Online Teaching Environments: A Job Design Framework, BRC Academy J. Educ., 7(1), pp. 41–56.

Hofstede, G.H., Hofstede, G.J., and Minkov, M. (2010), Cultures and Organizations: Software of the Mind, Maidenhead: McGraw-Hill.

Johns, C. (2006), Engaging Reflection in Practice: A Narrative Approach, Oxford: Blackwell Pub.

Kebaetse, M.B. and Sims, R. (2016), Using Instructional Consultation to Support Faculty in Learner-Centered Teaching, J. Faculty Dev., 30(3), pp. 31–40.

Kezar, A. and Lester, J. (2009), Supporting Faculty Grassroots Leadership, Res. Higher Educ., 50(7), pp. 715–740.

Knowles, M.S. (1984), Andragogy in Action, Applying Modern Principles of Adult Education, San Francisco: Jossey Bass.

Lancaster, J.W., Stein, S.M., MacLean, L.G., Van Amburgh, J., and Persky, A.M. (2014), Faculty Development Program Models to Advance Teaching and Learning Within Health Science Programs, Am. J. Pharmac. Educ., 78(5), p. 99.

Lave, J. and Wenger, E. (1991), Situated Learning: Legitimate Peripheral Participation, Cambridge, England: Cambridge University Press.

Meyer, K.A. (2014), An Analysis of the Research on Faculty Development for Online Teaching and Identification of New Directions, J. Asynchronous Learning Networks, 17(4), pp. 93–112.

Moore, J.C. (2005), A Synthesis of Sloan-C Effective Practices, J. Asynchronous Learning Networks, 9(3), pp. 5–73.

Palloff, R.M. and Pratt, K. (2001), Lessons from the Cyberspace Classroom: The Realities of Online Teaching, San Francisco, CA: Jossey-Bass.

Preston, S., Burke, R., Friedman, L.H., MacTaggart, P., Hamilton, A., Wiss, A., Weider, K. (2013), Bridging the Health IT Workforce Gap: Development and Outcomes of GWʼs Interdisciplinary Health IT Graduate Certificate Program, Proceedings of 141st APHA Annual Meeting and Exposition, Available at: https://apha.confex.com/apha/141am/webprogramadapt/Paper284503.html.

Puzziferro-Schnitzer, M. (2005), Managing Virtual Adjunct Faculty: Applying the Seven Principles of Good Practice, Online J. Distance Learning Admin., 8(2). Available at: http://www.westga.edu/~distance/ojdla/summer82/schnitzer82.pdf [Accessed: 2/13/2018].

Quality Matters (2014), Quality Matters Rubric Standards Fifth Edition, 2014 with Assigned Point Values, Maryland Online. Available at: https://www.qualitymatters.org/node/2305/download/QM%20Standards%20with%20Point%20Values%20Fifth%20Edition.pdf [Accessed: 2/13/2018].

Raybould, R. (1995), Performance Support Engineering: An Emerging Development Methodology for Enabling Organizational Learning, Perform. Improve. Quarterly, 8(1), pp. 7–22.

Russell, E.J. (2012), The Role of Servant Leadership in Faculty Development Programs: A Review of the Literature, Turkish Online J. Distance Educ., 13(1), pp. 15–19.

Sammons, M.C., and Ruth, S. (2007), The Invisible Professor and the Future of Virtual Faculty, Int. J. Instruct. Technol. Distance Learning, 4(1). Available at: http://www.itdl.org/Journal/Jan_07/article01.htm [Accessed: 2/13/2018].

Schon, D.A. (1984), The Reflective Practitioner: How Professionals Think in Action, Aldershot: Ashgate.

Shah, B., Morgan, K.C., Stone, D.E., and Sterling, D. (2014), Try TADL: The Award Winning Faculty Development Program, Distance Educ. Report, 18(6), pp. 5–8.

Shattuck, K. (2009), Faculty Development: More Best Practices, Distance Educ. Report, 13(18), pp. 3–6.

Shattuck, K., Zimmerman, W.A., and Adair, D. (2014), Continuous Improvement of the QM Rubric and Review Processes: Scholarship of Integration and Application, Internet Learning, 3(1), Available at: http://digitalcommons.apus.edu/internetlearning/vol3/iss1/5 [Accessed: 2/13/2018].

Torrisi, G.A. and Davis, G.A. (2000), Online Learning as a Catalyst for Reshaping Practice: The Experience of Some Academics Developing Online Learning Materials, Int. J. Academic Dev. 5, pp. 166–76.

WCET (2016), WCET Distance Education Enrollment Report 2016. Available at: http://wcet.wiche.edu/sites/default/files/WCETDistanceEducationEnrollmentReport2016.pdf [Accessed: 2/13/2018].

Wenger, E., McDermott, R.A., and Snyder, W. (2002), Cultivating Communities of Practice: A Guide to Managing Knowledge, Boston, MA: Harvard Business School Press.

Wingo, N.P., Ivankova, N.V., and Moss, J.A. (2017), Faculty Perceptions about Teaching Online: Exploring the Literature Using the Technology Acceptance Model as an Organizing Framework, Online Learning, 21(1), pp. 15–35.

Willcox, K., Sarma, S., and Lippel, P. (2016), Online Education: A Catalyst for Higher Education Reform, Cambridge: MIT.

Yang, Y. and Cornelious, L. F. (2005), Preparing Instructors for Quality Online Instruction, Online J. Distance Learning Admin., 8(1).


Appendix A – Faculty Questionnaire: Experiences with the Course Development and Continuous Improvement Process


Personal Background

How long have you been teaching at the graduate level?

  • New to teaching
  • 0–2 yrs
  • 3–5 yrs
  • 5 or more yrs

How would you best describe your job responsibilities (select one)?

  • Primarily educator
  • Primarily researcher
  • Primarily administrator
  • Primarily practitioner
  • Mix of educator/researcher
  • Mix of educator/administrator
  • Mix of educator/practitioner
  • Other

Prior to working with an instructional designer, how confident were you in the regular classroom?

  • Not very
  • Somewhat comfortable
  • I felt reasonably confident
  • Very confident

Prior to working with an instructional designer for your online course, had you taught online?

  • Yes
  • No

If so, how confident were you, teaching in the online environment?

  • Not very
  • Somewhat comfortable
  • I felt reasonably confident
  • Very confident

How did you learn how to teach (select one or more)?

  • Trial and error
  • Attended workshops through the university
  • I was mentored by a more experienced faculty mentor
  • I had formal education
  • Other (describe)

Work with an Instructional Designer:

Was this your first experience working with an ID or other curriculum or pedagogy expert/consultant?

  • Yes
  • No

Approximately how many hours per week did you work with the ID?

Approximately how many weeks did you work with the ID?

Overall, how would you rate the value of working with an ID?

  • Valuable
  • Somewhat Valuable
  • Not Valuable

Impact of working with an ID

If you had course evaluations available both before and after working with an ID, how did the evaluations change?

  • Evaluations went up slightly
  • Evaluations went up significantly (one Likert scale category improvement or greater)
  • Evaluations went down slightly
  • Evaluations went down significantly (one Likert scale category decrease or greater)
  • No change in evaluations
  • Did not have before and after

How did working with an ID change how you approached teaching and learning?

  • Include learning objectives
  • Course content/construction
  • Outside assignments
  • Assessment of student learning
  • Other – please describe (text)

Did you see an overall change in student engagement after your work with the ID?

  • Yes
  • No

How would you rate that improvement?

  • Major Improvement
  • Moderate Improvement
  • Minor Improvement
  • Minor Decrease
  • Moderate Decrease
  • Major Decrease

Did you see an overall change in student work product after your work with the ID?

  • Yes
  • No

How would you rate that change?

  • Major Improvement
  • Moderate Improvement
  • Minor Improvement
  • Minor Decrease
  • Moderate Decrease
  • Major Decrease

Process:

You were introduced to a course and weekly session design process based on accepted best practices. Overall, did it change the way you plan your instruction?

  • Yes
  • No

If so, which activities that you engaged in with the ID influenced your teaching most?

Have you shifted your assessment strategy in any way based on your work with an ID? Some examples might include: more/fewer tests, more/fewer writing or reflection exercises, more/fewer applied activities.

  • Yes
  • No

If so, how?

If so, did you see an improvement in student learning, based on these changes?

  • Yes
  • No

Please describe this change in student learning and your perception of what students are gaining through this change in your assessment strategy.

Evaluation and QI process:

Was the QI meeting helpful (where you were provided with written feedback on your courseʼs design and delivery based on the QM rubric)?

  • Yes
  • No

If so, in what ways?

Did you perceive a positive uptick in student satisfaction with your course, based on the changes you made in the QI process?

  • Yes
  • No

To what effect?

  • Students appeared much more satisfied
  • Students appeared somewhat more satisfied
  • Students appeared slightly more satisfied
  • No change in student satisfaction

Did working with an ID change your approach to continuous improvement?

  • Yes
  • No

In your online teaching?

  • Yes
  • No