Apr 01, 2020
DOI: 10.1615/IntJInnovOnlineEdu.2020032355

STUDENT USE OF SCAFFOLDING RESOURCES IN A HYBRID COURSE: EVIDENCE FROM EYE-TRACKING

Anna Slavina, 1 Aliye Karabulut-Ilgu, 2* & Charles Jahren 2



1 Department of Psychology, Iowa State University, Ames, Iowa, USA

2 Civil, Construction, and Environmental Engineering, Iowa State University, Ames, Iowa, USA


*Address all correspondence to: Aliye Karabulut-Ilgu, Civil, Construction, and Environmental Engineering, Iowa State University, 813 Bissell Road, 394 Town Engineering Building, Ames, IA, USA, Tel.: (+1) 515-294-5403; Fax: (+1) 515-294-8216, E-mail: aliye@iastate.edu



The inclusion of technology in education has opened the door for more innovative methods of teaching. By integrating technology, instructors can leverage the individualized nature of self-guided learning while maintaining a classroom structure. A commonly used approach to this mix of technology and pedagogy is hybrid learning, which includes a student-driven component in which students interact with online materials outside of the classroom and then engage in collaborative work or problem-solving activities during class time. The purpose of this study is to examine how the students interact with online learning materials as they learn how to solve problems in a hybrid construction engineering course. The students’ online behaviors and gaze movements were recorded using eye-tracking technology, and the findings were analyzed to understand how students used various scaffolding resources embedded in the online environment. The results indicated that the students each developed their unique strategies for accessing resources and completing the online module; however, a single strategy did not emerge as more effective than the others. Rather, the online component of the hybrid classroom provided flexibility and opportunities for self-paced learning.

KEY WORDS: hybrid learning, engineering education, eye-tracking, cognitive scaffolding


1. INTRODUCTION

The inclusion of technology in education has opened the door for more innovative methods of teaching. Traditional methods of teaching require students to passively receive information through lectures, which are not tailored to individuals and may not be suitable for all learning outcomes (Chou and Chou, 2011). By integrating technology, instructors can leverage the individualized nature of self-guided learning while maintaining a classroom structure. A commonly used approach to this mix of technology and pedagogy is hybrid learning, which includes a student-driven component. In hybrid learning, students interact with online materials outside of the classroom and then engage in collaborative work or problem-solving activities during class time (Bliuc et al., 2007; Humbert, 2007; Porter and Graham, 2015).

Hybrid classrooms tend to rely on self-regulated learning on the part of the students, who are expected to come to class prepared by having engaged in learning activities outside of the classroom. One difficulty that arises with self-regulated learning is that students often fail to take an effective approach toward their own learning, particularly when it comes to problem solving (Foster et al., 2018). Previous research concludes that when learning how to solve problems, students may not be able to adopt best practices and take advantage of all of the available tools (Clarebout and Elen, 2004; de Bruin and van Merriënboer, 2017; Jeong and Hmelo-Silver, 2010). The purpose of this study is to examine how students interact with online modules when learning how to solve problems in a hybrid construction engineering course.

2. BACKGROUND

2.1 Self-Regulated Learning and Scaffolding as a Conceptual Framework

In order to successfully engage in self-regulated learning, a student should go through all four phases described in the framework of self-regulated learning (Devolder et al., 2012). The phases include (1) determining the nature of the problem and engaging with prior knowledge of the task and context; (2) monitoring one’s progress on the task; (3) controlling or modulating one’s actions in accordance with what has already been done and what still needs to be completed; and, upon completing the task, (4) reflecting upon what was done. While this seems like a tall order for any student to do independently, computer-based learning environments are meant to provide an external structure to support students as they go through this process.

The help provided within a computer-based learning environment often includes what is known as scaffolding. Scaffolding is a method of guiding students through problem solving, which entails decreasing the amount of guidance as the student progresses such that at the beginning stages the student may be receiving some assistance, but by the end the student is able to work independently (Devolder et al., 2012; Kim and Hannafin, 2011; Raes et al., 2011). The purpose of scaffolding is to remove some of the unnecessary strain on mental resources, which frees up mental resources to be recruited for task-relevant activities (de Bruin and van Merriënboer, 2017). This idea comes from cognitive load theory, which posits that mental resources are limited and that learning depends on the optimization of how these resources are spent (Paas et al., 2003). There are three types of cognitive load: intrinsic, extraneous, and germane, all of which exist and must be addressed in any learning context.

Intrinsic load is inherent in the task itself and has to do with how the task is completed. When solving a complex engineering problem with multiple steps, the intrinsic load includes the number of steps and the complexity of each one. Extraneous load deals with the environment in which the learning is occurring. If there are too many possible options for what to do or if it is challenging to navigate the learning environment, the extraneous load is considered to be high. Because extraneous load does not contribute directly to learning and takes mental resources away from other tasks, it is generally considered beneficial to minimize the amount of extraneous load in a learning environment. Scaffolding is meant to help reduce the mental burden of being overwhelmed by irrelevant factors by creating a narrow path through the learning environment (Kim and Hannafin, 2011). By decreasing the extraneous cognitive load, scaffolding should help guide the student toward the last type of cognitive load—germane. Germane load is anything that is directly related to the task at hand or the learning outcome. In self-regulated learning, germane load includes the students’ understanding not only of the problem but of their knowledge and progress toward solving the problem.

Scaffolding in learning environments can take on four different forms depending on the information that students are expected to have and the focus of their learning: conceptual, metacognitive, procedural, and strategic (Hannafin et al., 1999). Conceptual scaffolding provides guidance during the initial stages of problem solving, when the learner might be overwhelmed by all of the potential starting points for addressing the problem. The function of conceptual scaffolding is to narrow the scope of the problem so that the learner can start thinking about what information might be relevant and which knowledge base to investigate. Metacognitive scaffolding addresses learners' understanding of how they think about the problem and prompts them to consider a variety of approaches rather than settling on the one with which they are most comfortable. Procedural scaffolding provides hints about the learning environment itself and what tools are available to the learner within the environment. Finally, strategic scaffolding can guide learners through the task or problem by breaking the problem into steps.

2.2 Description of the Course

Construction Equipment and Heavy Construction Methods is a junior-level course offered in a hybrid format at a large Midwestern university; students are required to complete online activities (i.e., lectures and modules) before the face-to-face meetings. The course has been taught in this hybrid format since 2012, and the course development process has been described elsewhere (Karabulut-Ilgu and Jahren, 2016). The assessments for the course fall into three major categories: (1) homework, labs, projects, online modules, and presentations (30%); (2) quizzes and class participation (10%); and (3) two midterm exams and one comprehensive final exam (60%).

For the online component, students are required to watch relevant lecture videos and finish online problem-solving modules before class so that they are better prepared to work on complex problem-solving exercises in class. There are 11 lecture videos and 12 modules in total. The lecture videos provide conceptual knowledge: content is presented, and students are asked to answer multiple-choice questions inserted into the videos. The online modules, the focus of this study, are developed as example problems, and students are guided through each problem to help them understand the solution process. The modules were developed using a content authoring tool, Lectora, and include a problem statement, subquestions, overview videos, and how-to videos (Figure 1). Overview videos further explain the problem statement and outline a solution path. How-to videos describe, step by step, what needs to be done to solve the problem, using examples with different numbers. These two types of videos were considered the primary scaffolding resources available in the modules. Students submit their answers on the platform (three attempts allowed) and receive feedback. Scores are automatically saved into the course management system and counted toward the final grade. The online modules are intended to provide the basis for more complex homework and lab problems.

FIG. 1: Screenshot of the online module included in the study


2.3 Description of the Scaffolding Resources

The online modules provide students with the conceptual knowledge and structured problem-solving practice that they need to navigate less structured problems successfully during face-to-face classes. Students are tasked with completing the online module prior to engaging in classroom activities because the module itself functions as scaffolding for the less structured activities which are done during class.

The scaffolding in the online learning module described in this study is procedural and strategic. Procedural scaffolding is provided through the inherent structure of the modules, which break a problem down into smaller, manageable steps. On-demand how-to videos, presented as links within the online environment, provide strategic scaffolding: they describe how to approach the problem and suggest factors that need to be considered, which corresponds to the first phase of self-regulated learning. These scaffolding resources are removed during complex problem solving in class, encouraging students to use the strategies they learned during the online tasks. Conceptual scaffolding, in contrast, directs students to first consider the concepts related to the problem. It works by narrowing the scope of a problem so that learners can focus on the concepts and problem structures that are relevant rather than having to sift through a much larger pool of topical knowledge to determine a solution path. This kind of scaffolding is provided in the online lectures, which present the relevant conceptual information.

2.4 Research Questions

The overarching goal of the present study is to examine how students make use of the online component of a hybrid course to enhance their learning. In particular, the following research questions are addressed: (1) How do students interact with an online module that utilizes scaffolding in structured problem solving in a hybrid learning environment? (2) Is there a relationship between the use or nonuse of online scaffolding resources and student assessment scores? and (3) What are student perspectives regarding the online modules of the hybrid course?

2.5 Eye-Tracking as a Research Tool in Educational Research

Eye-tracking allows researchers to record and analyze users' eye movements, which provide information about the cognitive activities that occur during the learning process in a computer-based environment. Eye-trackers come in three varieties: static eye-trackers, head-mounted eye-trackers, and head-mounted eye-trackers with head-tracking. The most common type of eye-tracker uses the reflections of the pupil and the cornea. Because the pupil is the darkest part of the eye, its location can be tracked and is informative of what information is likely being processed by the viewer (Holmqvist et al., 2011). Generally, eye movements consist of a series of fixations and saccades while reading information or viewing scenes. A fixation refers to a relatively stable state with little eye movement, meaning the eyes are locked onto an object. Fixations are identified when there is little to no variance in gaze direction over time, and they indicate deeper processing of the stimuli under investigation (Holmqvist et al., 2011). A saccade is the rapid eye movement between two consecutive fixations, and scan paths display multiple successive fixations and saccades (Lai et al., 2013). These measures may help researchers answer questions related to the time, place, and length of cognitive processing (Liversedge et al., 1998). Eye-tracking also provides count or frequency information (e.g., fixation count, total viewing time). The various measures that eye-tracking technology provides add valuable information about learning processes that cannot be satisfactorily captured by other educational research methods.
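To make the fixation and saccade definitions above concrete, the following minimal sketch shows a dispersion-threshold fixation detector of the kind commonly applied to raw gaze samples. It is an illustration only, not the processing pipeline of the Tobii software used in this study; the sample format, thresholds, and function name are assumptions.

```python
# Illustrative dispersion-threshold (I-DT style) fixation detection.
# The (time, x, y) sample format and the thresholds below are assumptions,
# not the settings used by the Tobii software in this study.

def detect_fixations(samples, max_dispersion=0.03, min_duration=0.10):
    """samples: list of (t_sec, x, y) gaze points in normalized screen coordinates.
    Returns (start_t, end_t, mean_x, mean_y) tuples, one per detected fixation."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i
        # Grow the window while gaze stays within the dispersion limit.
        while j + 1 < n:
            xs = [p[1] for p in samples[i:j + 2]]
            ys = [p[2] for p in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if samples[j][0] - samples[i][0] >= min_duration:
            xs = [p[1] for p in samples[i:j + 1]]
            ys = [p[2] for p in samples[i:j + 1]]
            fixations.append((samples[i][0], samples[j][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1  # the jump to the next fixation is a saccade
        else:
            i += 1
    return fixations
```

Under this view, saccades correspond to the jumps between consecutive fixations, and a scan path is simply the ordered sequence of detected fixations.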

As a research tool, eye-tracking has been increasingly used in many disciplines, from gaming to marketing (Horsley et al., 2014). It has also attracted attention from educational researchers in reading and writing (Anson and Schwegler, 2012), problem solving in STEM (Susac et al., 2014), and second language learning (Suvorov, 2014), to name a few. A comprehensive review conducted by Lai et al. (2013) indicated that eye-tracking has been adopted in educational research to examine patterns of information processing, student learning states while interacting with multimedia learning environments, effects of instructional strategies, and individual differences among learners and how such differences influence conceptual development. In a related study on problem solving, researchers used eye-tracking to investigate how learners solve problems during complex tasks and found that students spent more time inspecting relevant factors and paid more attention to chosen options on a multiple-choice test (Tsai et al., 2012). However, its use in engineering education has been somewhat limited.

Task engagement in the online environment is essential to success in hybrid or flipped courses, which require students to use the knowledge gained online to solve complex problems during class. To understand student preparation, previous research has relied on surveys, interviews, or click data from course management systems. However, these data sources fail to provide a complete picture of online student behavior because of the typical flaws of self-reported data (surveys and interviews) and of click data. This study aims to triangulate self-report data with eye-tracking data to explore how students use available scaffolding resources to solve problems in a construction engineering context.

3. METHODOLOGY

3.1 Data Collection and Analysis

This study adopted a quantitative methodology with three data sources: eye-tracking data, survey data, and student grades. For eye-tracking, students were asked to complete one of the online modules in a laboratory setting in order to explore how students use the online modules, how useful they find them, and how using them might impact learning outcomes. The module topic was "Crane Load Charts," and it required students to read and locate relevant information on various crane load charts to solve the problem. This module was chosen for two main reasons. First, it was due toward the middle of the semester, by which time students were expected to be familiar with the module structure and to use the resources more purposefully. Second, it is a relatively short module that could be finished in a reasonable amount of time in a laboratory setting.

The two scaffolding resources for this specific module were “overview” and “how-to” videos. Two overview videos were provided for students as part of the online module. The first overview video pertained to the first problem, which encompassed five separate questions. The second video pertained to the second problem, which encompassed four separate questions. Nine “how-to” videos were provided throughout the module, each of which corresponded to a specific question within the problem and provided an example of how a similar question could be answered using the charts and graphs that accompanied the problem.

Eye-tracking data were collected using a Tobii Eye Tracker. Participants calibrated the tracker before starting but were free to move their heads and use resources aside from what was available on the computer, such as their notes or calculators. Gaze data were extracted manually by watching videos of participants completing the online module and creating segments out of stationary screens, which occurred whenever a participant was not opening or closing new windows. The resources that were available to the participants in the form of videos and charts/diagrams always opened in new pop-up windows, and participants were free to move and resize the windows as they saw fit. The analyses reported here provide a general overview of participant behaviors rather than specific counts of fixations or saccades because of the participants’ freedom to manipulate the windows they were viewing.

The second data source was an end-of-semester survey that was given to all students in the class. The survey included questions about perspectives on online lectures, online modules, face-to-face sessions, and the overall course. Only the parts on online modules and overall course satisfaction were analyzed for this particular study. Finally, grades on relevant homework and lab assignments were recorded in order to examine whether there was a relationship between online resource use and student performance on assessments. Descriptive statistics were used to calculate overall counts, percentages, and mean scores. The study was approved by the research ethics committee of the university, and informed consent was obtained from all individual participants included in the study.

3.2 Participants

Out of 23 students who were enrolled in the course, 18 (16 males and 2 females) students volunteered to participate in the eye-tracking data collection. The average age of the students was 21.15 (SD = 0.88). The gender distribution of the participant sample reflected the distribution of the class, with only two females enrolled in the class. Complete eye-tracking data were available for only 14 participants; data for the remaining four were not retained due to technical issues. All 23 students enrolled in the class completed the end-of-course survey.

4. RESULTS

The following is a detailed report of the results framed around the research questions. Note that the survey results are for the class in general rather than for the specific module examined via the eye-tracking method.

4.1 RQ1. Use of Scaffolding Resources in the Online Module

4.1.1 “Overview” and “How-to” Videos

Eye-tracking data indicated that only one participant watched the first overview video, and no participants watched the second overview video. The fact that students did not use this scaffolding might indicate that they were able to engage in self-regulated behavior and solve the problem successfully without it. At least one participant watched each how-to video, but few participants watched any how-to video in its entirety (Table 1).


TABLE 1: Number of students who watched part or entirety of how-to videos (P = Problem, Q = Question)

How-to Video | Participants Who Watched Part of the Video (N = 14) | Participants Who Watched the Whole Video (N = 14)
P1, Q1 | 5 | 3
P1, Q2 | 8 | 4
P1, Q3 | 7 | 0
P1, Q4 | 1 | 0
P1, Q5 | 1 | 0
P2, Q6 | 8 | 6
P2, Q7 | 1 | 0
P2, Q8 | 6 | 1
P2, Q9 | 4 | 0

According to the survey results, 26% of the students reported watching all of the how-to videos, while 35% reported watching more than two-thirds of them. This indicates that even though students completed the majority of the modules, they often did so without watching all of the how-to videos. In the module described in the current study, only 3 of the 14 eye-tracking participants actually watched two-thirds or more of the how-to videos. Recall that the survey spanned the entire semester, so it may not accurately describe students' behaviors within a particular module.

4.1.2 Time Spent on the Module

Participants were quite varied in the amount of time that they spent on each question as well as on completing the entire module. The shortest amount of time a participant spent completing the entire module was around 9 minutes, while the longest amount of time spent was around 22.5 minutes. The median time spent was 15.5 minutes. The time spent on each question also varied greatly (Table 2).


TABLE 2: Time spent on each question in the module (P = Problem, Q = Question)

Question | Median Time Spent | Average Time Spent | Range of Time Spent
P1, Q1 | 1 min, 55 sec | 2 min, 47 sec | 35 sec–9 min, 55 sec
P1, Q2 | 3 min, 11 sec | 3 min, 22 sec | 26 sec–8 min, 32 sec
P1, Q3 | 3 min, 46 sec | 4 min, 18 sec | 1 min, 53 sec–9 min, 36 sec
P1, Q4 | 23 sec | 31 sec | 6 sec–1 min, 19 sec
P1, Q5 | 53 sec | 1 min, 13 sec | 25 sec–4 min, 16 sec
P2, Q6 | 2 min, 31 sec | 2 min, 43 sec | 56 sec–5 min, 33 sec
P2, Q7 | 52 sec | 1 min, 11 sec | 16 sec–4 min, 56 sec
P2, Q8 | 1 min, 12 sec | 1 min, 11 sec | 26 sec–2 min, 21 sec
P2, Q9 | 1 min, 19 sec | 1 min, 36 sec | 49 sec–3 min, 17 sec
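As an illustration of how the per-question summaries in Table 2 can be derived from segmented recordings, the sketch below aggregates hypothetical (question, start, end) segments into median, mean, and range statistics. The segment format and labels are our own simplification, not the authors' coding scheme.

```python
# Hedged sketch: aggregate per-question viewing times (as in Table 2) from
# manually segmented recordings. The (question, start_sec, end_sec) format
# is an assumed simplification of the eye-tracking segments.
from collections import defaultdict
from statistics import mean, median

def summarize_times(segments):
    """segments: iterable of (question_label, start_sec, end_sec) across participants."""
    durations = defaultdict(list)
    for question, start, end in segments:
        durations[question].append(end - start)
    return {
        q: {"median": median(d), "mean": mean(d), "range": (min(d), max(d))}
        for q, d in durations.items()
    }

# Example with made-up durations (in seconds) for question P1, Q4.
print(summarize_times([("P1, Q4", 0, 23), ("P1, Q4", 0, 45), ("P1, Q4", 0, 6)]))
```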

4.1.3 Charts and Diagrams in the Module

The primary goal of this particular online module was to teach students how to use relevant charts and diagrams to solve problems. In order to successfully arrive at the correct solution, participants were expected to consult the appropriate charts, which they generally did, frequently revisiting the same diagram in an attempt to answer a question.

Various charts and diagrams were available depending on which problem the students were answering. Problem 1, which spanned questions 1–5, included two charts and one diagram. Problem 2, which spanned questions 6–9, included one chart and one diagram. For question 9, the combination capacities load chart was replaced with an on rubber crane load diagram. Table 3 shows how many participants viewed each chart and diagram for each question in problems 1 and 2.


TABLE 3: Number of participants viewing charts and diagrams in problems 1 and 2 (N = 14)

Question | Over Front Crane Load Chart | 360 Crane Load Chart | Crane Range Diagram
P1, Q1 | 6 | 7 | 7
P1, Q2 | 6 | 6 | 14
P1, Q3 | 14 | 9 | 6
P1, Q4 | 7 | 0 | 0
P1, Q5 | 5 | 14 | 2

Question | Combination Capacities Load Chart | Crane Range Diagram
P2, Q6 | 14 | 14
P2, Q7 | 3 | 12
P2, Q8 | 14 | 6

Question | On Rubber Crane Load Diagram | Crane Range Diagram
P2, Q9 | 14 | 4

As can be seen from Tables 1–3, participants varied in their use of videos and their consultations of charts and diagrams to solve the given problem.

4.1.4 Integration of Scaffolding and Other Resources

Eye-tracking was useful for examining the types of information that students might have integrated while completing this module. Ponce and Mayer (2014) referred to eye movements that went between different sources of information (e.g., different parts of the same text) as integrative saccades and argued that, during such movements, participants were integrating information. One of the goals of the online module used in this eye-tracking study was for students to learn how to read the charts and diagrams and extract the information relevant to the question at hand. To extract such information, students would need to integrate the information provided in the problem statement with the graphs they were using. Eye-tracking data can show when students were looking between the charts, diagrams, or videos and the problem statement information. Three distinct behaviors were noticed in this regard: no integration, integration with the problem, and integration between multiple resources. No integration refers to times when a chart/diagram or video was opened and gaze stayed exclusively there [Figure 2(a)]. Integration with the problem refers to times when a resource was opened and gaze moved between that resource and the problem statement screen [Figure 2(b)]. Integration between multiple resources refers to times when the problem statement screen and two resources were open at the same time and gaze moved between them [Figure 2(c)].

FIG. 2: Scan path of integration of scaffolding and other resources
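As a rough illustration of how the three behaviors could be coded from segmented gaze data, the sketch below classifies a single viewing segment by the windows that attracted gaze. The window labels and segment representation are hypothetical; the actual coding in this study was done manually from the recordings.

```python
# Hedged sketch: code one viewing segment into the three integration behaviors
# described above. A segment is represented as the ordered list of window
# labels that gaze landed on, which is a hypothetical simplification of the
# manually coded recordings.

def classify_integration(gaze_windows):
    """gaze_windows: e.g., ['resource_howto_q2', 'problem', 'resource_howto_q2']."""
    resources = {w for w in gaze_windows if w.startswith("resource")}
    looked_at_problem = "problem" in gaze_windows
    if looked_at_problem and len(resources) >= 2:
        return "integration between multiple resources"
    if looked_at_problem and resources:
        return "integration with problem"
    return "no integration"

# Gaze alternating between one how-to video window and the problem statement
# screen would be coded as integration with the problem.
print(classify_integration(["resource_howto_q2", "problem", "resource_howto_q2"]))
```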


Analysis of the eye-tracking data indicated that all of the participants used at least some of the resources that were available to them to solve the problems in the module. The number of integration opportunities and the number of times participants integrated information across resources as determined by their gaze patterns are displayed in Table 4.


TABLE 4: Number of integration opportunities and number of times participants integrated information across resources

Participant | Total Opportunities for Integration | No Integration (% of total) | Integration with Problem (% of total) | Integration between Resources (% of total)
1 | 15 | 10 (66.7%) | 5 (33.3%) | 0 (0%)
2 | 25 | 16 (64%) | 9 (36%) | 0 (0%)
3 | 22 | 11 (50%) | 11 (50%) | 0 (0%)
4 | 18 | 14 (77.8%) | 4 (22.2%) | 0 (0%)
5 | 18 | 18 (100%) | 0 (0%) | 0 (0%)
6 | 29 | 19 (65.5%) | 7 (24.5%) | 0 (0%)
7 | 35 | 32 (91.4%) | 3 (8.6%) | 0 (0%)
8 | 39 | 32 (82.1%) | 7 (17.9%) | 0 (0%)
9 | 16 | 9 (56.3%) | 7 (43.7%) | 0 (0%)
10 | 21 | 12 (57.1%) | 8 (37.9%) | 1 (5%)
11 | 27 | 22 (81.5%) | 5 (18.5%) | 0 (0%)
12 | 28 | 23 (82.1%) | 5 (17.9%) | 0 (0%)
13 | 28 | 20 (71.4%) | 8 (28.6%) | 0 (0%)
14 | 22 | 16 (72.7%) | 6 (27.3%) | 0 (0%)

Out of the 14 participants, 13 viewed the majority of the resources without shifting their gaze outside of the window in which the resource (chart, diagram, or video) popped up. Only one participant (participant 5) viewed all resources without visibly integrating information between the chart/diagram/video and the other available resources. The second most common behavior was integrating information between the problem statement or question and the one chart/diagram/video being used: 13 of the 14 participants spent at least some time shifting their gaze between the window displaying the resource they were using and the main window where the problem statement, question, and relevant formulas were displayed. Only two participants integrated two resources and the problem statement at the same time by shifting their gaze among all of them.

4.2 RQ2. Relationship between Resource Use and Assessment Scores

Table 5 presents the number of how-to videos watched by each participant along with their grades for the module and the related homework assignments. Note that three participants did not watch any of the how-to videos, and the most videos watched by any participant was six of the nine available videos. A majority of the participants (10 out of 14) watched fewer than half of the available videos. However, there was not a statistically significant correlation between the number of how-to videos watched and how well students performed on the overall module and the related homework assignments.


TABLE 5: Number of how-to videos watched by each participant and related assessment grades

Participant | Number of How-To Videos Watched | Module Grade | HW1 | HW2 | HW3 | HW4
1 | 2 | 89 | 100 | 67 | 93 | 60
2 | 6 | 100 | 77 | 100 | 87 | 100
3 | 6 | 78 | 0 | 100 | 40 | 95
4 | 4 | 100 | 93 | 100 | 93 | 100
5 | 0 | 78 | 100 | 100 | 90 | 98
6 | 6 | 100 | 77 | 100 | 87 | 100
7 | 0 | 89 | 100 | 87 | 100 | 100
8 | 3 | 89 | 93 | 100 | 93 | 100
9 | 2 | 100 | 100 | 87 | 93 | 100
10 | 3 | 100 | 87 | 87 | 80 | 0
11 | 0 | 67 | 83 | 87 | 80 | 95
12 | 3 | 100 | 100 | 73 | 100 | 100
13 | 1 | 100 | 87 | 87 | 73 | 95
14 | 5 | 67 | 100 | 93 | 100 | 100
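To illustrate how the relationship reported for RQ2 can be checked, the sketch below computes a rank correlation between the video counts and module grades in Table 5. Spearman's rho is our choice for illustration; the paper does not state which correlation test the authors used.

```python
# Hedged sketch: rank correlation between how-to videos watched and module
# grade, using the values from Table 5. The choice of Spearman's rho is ours;
# the original analysis may have used a different test.
from scipy.stats import spearmanr

videos_watched = [2, 6, 6, 4, 0, 6, 0, 3, 2, 3, 0, 3, 1, 5]
module_grade = [89, 100, 78, 100, 78, 100, 89, 89, 100, 100, 67, 100, 100, 67]

rho, p_value = spearmanr(videos_watched, module_grade)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```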

4.3 RQ3. Student Perspectives on Online Modules

Student perspectives on the modules were highly positive (Table 6). The majority of the survey respondents (87%) indicated that the assigned online modules increased their overall understanding of the course materials. Respondents similarly reported finding the how-to videos helpful, with 100% of respondents indicating that they agreed or strongly agreed that the how-to videos were helpful in the problem-solving process. A near consensus (95%) of survey respondents indicated that the step-by-step problem solution presented in the online modules increased their understanding of the course materials.


TABLE 6: Student perspectives on online modules (N = 23)

Items | Strongly Agree | Agree | Neither Agree nor Disagree | Disagree | Strongly Disagree
Online modules increased my overall understanding of the material for this course. | 39% | 48% | 9% | 4% | 0%
How-to videos in online modules increased my understanding of the problem solution process in a given topic. | 57% | 43% | 0% | 0% | 0%
The step-by-step problem solution in online modules increased my overall understanding of the material for this course. | 52% | 43% | 4% | 0% | 0%
The feedback I received from the system during modules contributed to my understanding of the material for this course. | 26% | 35% | 26% | 13% | 0%
Knowing that I had two attempts to get the right answer for a question decreased my anxiety of making mistakes. | 30% | 52% | 9% | 4% | 0%

Overall course satisfaction was examined through three items (Table 7). A large majority of the respondents (87%) agreed that they would be able to retain what they learned in the class. Even though there was some variance in whether students reported wanting to see this type of hybrid teaching in more of their classes, 65% indicated that they would recommend that their friends take hybrid courses.


TABLE 7: Student responses on overall course satisfaction (N = 23) a

Items | Strongly Agree | Agree | Neither Agree nor Disagree | Disagree | Strongly Disagree
I feel like I will be able to retain what I learned in this class. | 17% | 70% | 9% | 0% | 0%
I would like to see this type of hybrid teaching in more of my courses. | 35% | 26% | 17% | 9% | 9%
I would recommend taking hybrid courses to my friends. | 30% | 35% | 22% | 4% | 4%

a The total is not equal to 100% because of one missing data point.


5. DISCUSSION

Instructors commonly hesitate to consider converting their courses to a hybrid or flipped format because they are not sure how to encourage their students to complete the online activities. To address this concern, several strategies have been employed, such as quizzing students over the online materials (Hew and Lo, 2018) and reviewing the click data provided through course management platforms (Ahn and Bir, 2018). However, these strategies fail to provide sufficient details about student behaviors in an online learning environment, which is an essential part of successfully implementing innovative teaching techniques such as hybrid, blended, and flipped learning.

In this study, eye-tracking technology has been used to explore how students solve problems in the online component of a hybrid construction engineering course. Several findings are worthy of discussion, as they reveal how students use resources to assist their learning. First, students had clear preferences for how to approach the scaffolding resources. The fact that only one student watched one of the two overview videos might imply that the majority of the students found them redundant. How-to videos, on the other hand, were consulted more often. These how-to videos, designed as worked-out problems, seemed to be more helpful in solving the problem (Kalyuga et al., 2001).

Second, the time spent on modules ranged from 9 minutes to 22.5 minutes, which confirmed the claims about the flexibility of online learning where students choose how much time to spend as opposed to conforming to one time requirement set by an instructor in a traditional classroom (Buechler et al., 2014; Kiat and Kwot, 2014; Mok, 2014; Simpson et al., 2003; Velegol et al., 2015). This finding implies that online tasks provide the extra time that some students need and create a self-paced learning environment (Karabulut-Ilgu and Jahren, 2016). Another resource used in this module was the charts and diagrams that were required to solve a particular question. The results indicated that participants were able to locate the relevant charts to solve the problem correctly. Online task design facilitated student learning, as it allowed for the integration of various resources in one platform by weaving these resources into learning activities through the design of instruction (Greene and Land, 2000).

Finally, the findings indicated that some students viewed the resources independently while others integrated them either with the problem statement or with other resources (i.e., a chart or video). This provided support for metacognition and enabled students to develop their own learning strategies through the use of scaffolding (Scardamalia et al., 1989). However, none of these strategies were related to performance on the module or other relevant assignments, which suggests that no single strategy was more effective than the others. It is possible that each student customizes a strategy that they believe works best for them. This is consistent with earlier research by Bos et al. (2015) showing that online task engagement had relatively low predictive value for exam performance.

6. PEDAGOGICAL IMPLICATIONS

Based on the findings of this study, the following pedagogical implications can be drawn for online task design in a hybrid course format:

Hybrid course design provides opportunities for flexibility and individualized learning. The high variance in task completion time supports claims about the benefit of online learning in providing opportunities for self-paced learning. Instructors may choose to develop online activities for particularly challenging concepts, for which some students may need extra time to comprehend and practice.

Students may need training on how to effectively utilize the online resources. An important lesson to take away from this study is that students may not necessarily employ the strategy around which instructors design their online learning modules. While it is useful to have a particular approach in mind when designing lessons, instructors should keep in mind that without explicit instruction on how students are expected to navigate the lessons, individual students may settle on whatever approach makes the most sense to them. The individual strategies employed by the students may work for the individual module but might not be the best strategies for learning. It may be helpful to provide students with a mandatory tutorial on how to integrate information across sources if that is an expected educational outcome.

7. CONCLUSION

This study aimed to reveal how students solved problems in the online component of a hybrid course, how resource utilization impacted student performance, and what students' overall perspectives were on the online component of a hybrid course. To this end, we surveyed students and examined how much time they spent on the module and how they used the scaffolding resources integrated into the online platform. Overall, students reacted positively to hybrid learning and reported that the online tasks contributed to their learning. There was not a statistically significant relationship between the way students interacted with the online tasks and how they performed on relevant course assessments. The results indicated that students used the majority of the resources in one way or another, which implies that scaffolding in an online platform facilitated student learning. However, there did not seem to be a single pattern of resource consultation. Instead, each student developed their own strategies to integrate information from different sources in order to solve the problems.

7.1 Limitations and Directions for Further Research

As with any research, this study has some limitations to consider when interpreting the results. First, the study was completed in a laboratory environment, and students knew they were being recorded, which might have impacted how they completed the task and may not reflect their actual behavior. To address this concern, several precautions were taken. Participants were recruited on a voluntary basis, and data were collected by individuals who did not have direct relationships with the students. Students were also assured that their grades would not be affected in any way by what they did in the experiment.

Another limitation stems from the fact that data were collected only for one online module. The researchers tried to choose a representative module, but students might complete the other modules differently. In the future, more data could be collected from other modules, and deeper statistical analyses could be conducted on a larger data set to have a better understanding of overall online resource use and how it impacts student performance on assessments.

Even though this study contributes to the knowledge base in the field through the use of an innovative research technique—eye-tracking—additional data could have been collected through think-aloud protocols and follow-up interviews with students to further shed light on why students do what they do while engaged in an online component of a hybrid course.

REFERENCES

Ahn, B. and Bir, D.D., Student Interactions with Online Videos in a Large Hybrid Mechanics of Materials Course, Adv. Eng. Ed., vol. 6, no. 3, pp. 1–24, 2018.

Anson, C.M. and Schwegler, R.A., Tracking the Mind’s Eye: A New Technology for Researching Twenty-First Century Writing and Reading Process, College Compos. Commun., vol. 64, no. 1, pp. 151–171, 2012.

Bliuc, A., Goodyear, P., and Ellis, R.A., Research Focus and Methodological Choices in Studies into Students’ Experiences of Blended Learning in Higher Education, Internet Higher Ed., vol. 10, pp. 231–244, 2007.

Bos, N., Groeneveld, C., van Bruggen, J., and Brand-Gruwel, S., The Use of Recorded Lectures in Education and the Impact on Lecture Attendance and Exam Performance, British J. Ed. Technol., vol. 47, no. 5, pp. 906–917, 2015.

Buechler, D.N., Sealy, P.J., and Goomey, J., Three Pilot Studies with a Focus on Asynchronous Distance Education, Proc. of 121st ASEE Annual Conf. and Exposition, Indianapolis, IN, 2014.

Chou, A.Y. and Chou, D.C., Course Management Systems and Blended Learning: An Innovative Learning Approach, J. Innov. Ed., vol. 9, no. 3, pp. 463–484, 2011.

Clarebout, G. and Elen, J., Tool Use in Computer-Based Learning Environments: Towards a Research Framework, Comput. Human Behav., vol. 22, pp. 389–411, 2004.

de Bruin, A.B.H. and van Merriënboer, J.J.G., Bridging Cognitive Load and Self-Regulated Learning Research: A Complementary Approach to Contemporary Issues in Educational Research, Learning Instruct., vol. 51, pp. 1–9, 2017.

Devolder, A., van Braak, J., and Tondeur, J., Supporting Self-Regulated Learning in Computer-Based Learning Environments: Systematic Review of Effects of Scaffolding in the Domain of Science Education, J. Comput. Assist. Learning, vol. 28, pp. 557–573, 2012.

Foster, N.L., Rawson, K.A., and Dunlosky, J., Self-Regulated Learning of Principle-Based Concepts: Do Students Prefer Worked Examples, Faded Examples, or Problem Solving?, Learning Instruct., vol. 55, pp. 124–138, 2018.

Greene, B. and Land, S., A Qualitative Analysis of Scaffolding Use in a Resource-Based Learning Environment Involving the World Wide Web, J. Ed. Comput. Res., vol. 23, no. 2, pp. 151–179, 2000.

Hannafin, M., Land, S., and Oliver, K., Student-Centered Learning Environments, in Instructional-Design Theories and Models: Vol. 2. A New Paradigm of Instructional Theory, C.M. Reigeluth, Ed., Mahwah, NJ: Erlbaum, pp. 115–140, 1999.

Hew, K.F. and Lo, C.K., Flipped Classroom Improves Student Learning in Health Professions Education: A Meta-Analysis, BMC Med. Ed., vol. 18, pp. 38–50, 2018.

Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., and van de Weijer, J., Eye-Tracking: A Comprehensive Guide to Methods and Measures, New York: Oxford University Press, 2011.

Horsley, M., Eliot, M., Knight, B.A., and Reilly, R., Current Trends in Eye Tracking Research, Berlin: Springer, 2014.

Humbert, M., Adoption of Blended Learning by Faculty: An Exploratory Analysis, in The Challenges of Educating People to Lead in a Challenging World, M.K. McCuddy, Ed., Amsterdam: Elsevier Science B.V., pp. 423–436, 2007.

Jeong, H. and Hmelo-Silver, C.E., Productive Use of Learning Resources in an Online Problem-Based Learning Environment, Comput. Human Behav., vol. 26, pp. 84–99, 2010.

Kalyuga, S., Chandler, P., and Sweller, J., Learner Experience and Efficiency of Instructional Guidance, Ed. Psychol., vol. 21, pp. 5–23, 2001.

Karabulut-Ilgu, A. and Jahren, C., Evaluation of Hybrid Instruction in a Construction Engineering Context: A Mixed-Method Approach, Adv. Eng. Ed., vol. 5, no. 3, 2016.

Kiat, P.N. and Kwot, Y.T., The Flipped Classroom Experience, Proc. of IEEE CSEE and T, Klagenfurt, Austria, pp. 39–43, 2014.

Kim, M.C. and Hannafin, M.J., Scaffolding Problem Solving in Technology-Enhanced Learning Environments (TELEs): Bridging Research and Theory with Practice, Comput. Ed., vol. 56, pp. 403–417, 2011.

Lai, M., Tsai, M., Yang, F., Hsu, C., Liu, T., Lee, S.W., Lee, M., Chiou, G., Liang, J., and Tsai, C., A Review of Using Eye-Tracking Technology in Exploring Learning from 2000 to 2012, Ed. Res. Rev., vol. 10, pp. 90–115, 2013.

Liversedge, S.P., Paterson, K.B., and Pickering, M.J., Eye Movements and Measures of Reading Time, in Eye Guidance in Reading and Scene Perception, G. Underwood, Ed., Amsterdam: Elsevier Science, Ltd, pp. 55–75, 1998.

Mok, H.N., Teaching Tip: The Flipped Classroom, J. Inf. Syst. Ed., vol. 25, pp. 7–11, 2014.

Paas, F., Tuovinen, J.E., Tabbers, H., and Van Gerven, P.W.M., Cognitive Load Measurement as a Means to Advance Cognitive Load Theory, Ed. Psychol., vol. 38, no. 1, pp. 63–71, 2003.

Ponce, H.R. and Mayer, R.E., An Eye Movement Analysis of Highlighting and Graphic Organizer Study Aids for Learning from Expository Text, Comput. Human Behav., vol. 41, pp. 21–32, 2014.

Porter, W.W. and Graham, C.R., Institutional Drivers and Barriers to Faculty Adoption of Blended Learning in Higher Education, British J. Ed. Technol., vol. 47, no. 4, pp. 748–762, 2015.

Raes, A., Schellens, T., De Wever, B., and Vanderhoven, E., Scaffolding Information Problem Solving in Web-Based Collaborative Inquiry Learning, Comput. Ed., vol. 59, pp. 82–94, 2011.

Scardamalia, M., Bereiter, C., McLean, R., Swallow, J., and Woodruff, E., Computer-Supported Intentional Learning Environments, J. Ed. Comput. Res., vol. 5, no. 1, pp. 51–68, 1989.

Simpson, W., Evans, D., Eley, R., and Stiles, M., Findings from the HEI “Flip” Project: Application Issues, Int. J. Continuing Eng. Ed. Lifelong Learning, vol. 13, p. 471, 2003.

Susac, A., Bubic, A., Kaponja, J., Planinic, M., and Palmovic, M., Eye Movements Reveal Students’ Strategies in Simple Equation Solving, Int. J. Sci. Math. Ed., vol. 12, no. 3, pp. 555–577, 2014.

Suvorov, R., The Use of Eye-Tracking in Research on Video-Based Second Language (L2) Listening Assessment: A Comparison of Context Videos and Content Videos, Language Testing, vol. 32, no. 4, pp. 463–483, 2014.

Tsai, M.-J., Hou, H. T., Lai, M. L., Liu, W.-Y., and Yang, F.Y., Visual Attention for Solving Multiple-Choice Science Problem: An Eye-Tracking Analysis, Comput. Ed., vol. 58, no. 1, pp. 375–385, 2012.

Velegol, S.B., Zappe, S.E., and Mahoney, E., The Evolution of a Flipped Classroom: Evidence-Based Recommendations, Adv. Eng. Ed., vol. 4, pp. 1–37, 2015.
