Abstract

The purpose of this study was to survey undergraduate students in a management statistics course about their perceptions of the usefulness of a series of instructional videos with embedded quiz questions, created by their instructor to give students the knowledge of Excel functions needed to understand the course materials and complete course work. Due to COVID-19, the intended study was split across academic years, and three different course types (in-person pre-COVID-19, fully-online post-COVID-19, in-person post-COVID-19), all taught by the same instructor, were surveyed, resulting in a study that compared three course types rather than a simple replication study. The survey results showed significant positive changes in perceptions of video quizzing usefulness as students progressed through the in-person pre-COVID-19 course, but no significant changes for students who progressed through the other two courses. In comparing the courses with each other, the only area in which all three differed significantly from one another was students feeling the video quizzes enabled them to skip synchronous sessions. This was also the only area in which the pre-COVID-19 course differed significantly from either post-COVID-19 course, while the two post-COVID-19 courses differed significantly from each other on almost every item, with the fully-online students rating all items higher than the in-person students did. The researchers felt these results pointed to two major areas for future study: how class modality post-COVID-19 impacts student perceptions of online tools, and the value and replicability of the in-person experience among those who were forced into remote learning during the pandemic.

Overview

In Fall 2018, the instructor of an undergraduate management statistics class came to the Center for Teaching and Learning with a problem: students entered the course with very uneven knowledge of Excel. The instructor found that a great deal of lecture time was spent showing students the basics of how to manipulate data in Excel, despite that not being the focus of the course itself. As a solution, the instructor had created a series of videos to walk students through the Excel basics used in the course and aid those who entered without this knowledge. The instructor found that students who used the videos did well in the course, but that not all students who needed the knowledge the videos provided would use them. This problem led the instructor to enter into a technology partnership with the Center for Teaching and Learning.

Instructor-generated videos have long been seen as a valuable supplement to a live lecture, serving either as a prelude meant to be viewed before class or as a follow-up meant to act as a refresher or study aid for an exam. Studies have sought to understand the impacts of instructor-generated videos and to inform best practices around their creation, with a particular emphasis in recent years on what students consider an acceptable video length, how to ensure students watch a video to completion, and how to make videos useful for students (Guo, Kim, & Rubin, 2014; Mayer & Moreno, 2003). Initial research by Guo, Kim, and Rubin (2014) suggested that multiple shorter videos were preferred and that longer videos had a drop-off point beyond which students would not watch to completion. However, more recent studies have suggested that this is not always the case and have sought to better understand ways to get students to fully consume longer videos through various means, including adding interactive elements (Geri, Winer, & Zaks, 2017; Lagerstrom, Johanes, & Ponsukcharoen, 2015).

Using interactive elements to turn the videos from a passive task into an active learning task was a solution the instructor wanted to explore. Further research pointed to the practice of embedding quiz questions within a video as an effective means of increasing interactivity and providing other benefits, such as improved performance on future exams (Cummins, Beresford, & Rice, 2016; Delen, Liew, & Wilson, 2014; Griswold, Overson, & Benassi, 2017; Kleftodimos & Evangelidis, 2018; Rice, Beeson, & Blackmore-Wright, 2019; Sumter, 2018; Zhang, Zhou, Briggs, & Nunamaker Jr, 2006). The notion of providing frequent quizzing opportunities is also supported in classic educational research and is commonly seen as beneficial in general (Gaynor & Millham, 1976; Roediger & Karpicke, 2006; Zaromb & Roediger, 2010). This led to the decision to create and embed quiz questions into instructor-created videos as a means of increasing their usage by students in the management statistics course.

Through the partnership, the instructor developed a series of online video quizzes and tutorials, which proved very popular with students according to initial surveys. Students had access to a collection of online learning video quizzes that were 9-20 minutes long and focused on the most important applications and concepts students seemed to struggle with. In these videos, students could watch the instructor solve problems using Excel in real time and follow the solution at their own pace, mimicking the steps within the video. Surveys were given to students during this initial partnership to gauge the effectiveness of the frequent online video quizzes and to determine elements of best practice such as question frequency and question relevance. These survey responses proved valuable in guiding the creation and usage of the video quizzes. The success of this solution was such that it was mutually decided the solution should be formally studied and evaluated.

COVID-19 Interruption & Transformation

After developing best practices for the video quizzes with student input during the 2018-2019 school year, the researchers obtained IRB approval and began the study in Fall 2019, with the intent to survey course sections delivered in Fall 2019 and Spring 2020. Then the COVID-19 pandemic occurred in the middle of data collection. As the pivot to remote learning was a very difficult time for students and faculty, the researchers decided to postpone data collection and did not use any data from Spring 2020. The next few semesters were particularly difficult, and the researchers did not seek to restart data collection until Spring 2021, when courses had begun to stabilize and course delivery had become consistent again.

While data collection restarted in Spring 2021, the researchers noted that the classroom environment had been altered by the pandemic in a number of ways. For starters, the course itself had undergone many updates to its structure and content. The major changes included:

  1. an increase in the quality and number of course videos, to the point where every lesson had an associated video and a video quiz was provided every week;

  2. an increase in overall organization and consistency, including a weekly schedule of class activities;

  3. the addition of weekly homework in which students submit Excel solutions; and

  4. the addition of a participation grade requiring submission of Excel solutions and conceptual answers for every problem presented during a synchronous session.

While this structure and content were developed during the pandemic and implemented when the class was offered fully-online in Spring 2021, the same structure and content were maintained for Fall 2021, when the course was offered in-person for the first time since COVID-19 forced the shift to online. The one exception was that in Fall 2021 synchronous class sessions were held exclusively in-person, as opposed to Spring 2021, when they were held online via video conferencing.

Aside from the changes to the course itself, the researchers note that the pandemic experience likely changed both students and how they perceive and interact with the video quizzing tools present in this study. The researchers acknowledge that these changes will likely have unknown impacts, but the hope of this study is that some of them may come to the surface and present themselves for more direct future investigation. In the end, instead of a simple replication study on perceptions of video quizzing, this study became a comparison of video quizzing perceptions across three different course types and a look at the possible effects of the pandemic disruption.

Design and Procedures

A Qualtrics survey was devised that principally sought to ascertain student perceptions of the usefulness of video quizzing as it related to completing homework, understanding concepts, improving Excel skills, studying for exams, reinforcing lecture lessons, allowing them to skip the lecture session, and being an overall effective learning resource. Student volunteers participating in the study were given a link to the online survey, which consisted of seven four-point forced-choice Likert scale items, ranging from “Disagree” to “Agree,” that addressed the areas of usefulness being investigated, as well as other questions not reported in this paper, such as open-ended questions.

The population for this study consisted of students at the Georgia Institute of Technology attending an undergraduate course on management statistics that was offered by the same instructor before and after COVID-19, both in-person and online. For each course section, students were given the opportunity to participate in the survey twice: once after the first exam and again after the third exam. Of the n = 237 students who participated in this study, n = 64 were in the in-person Fall 2019 (pre-COVID-19) group, n = 84 were in the fully-online Spring 2021 (post-COVID-19) group, and n = 89 were in the in-person Fall 2021 (post-COVID-19) group.

Research Questions

  1. Is there a significant difference between student perceptions of the usefulness of video quizzing, as it relates to completing homework, understanding concepts, improving Excel skills, studying for exams, reinforcing lecture lessons, allowing them to skip the lecture session, or being an overall effective learning resource, when surveyed after the first exam and when surveyed after the third exam?

  2. To what extent is there a significant difference between course types (in-person pre-COVID-19, fully-online post-COVID-19, in-person post-COVID-19) in student perceptions of the usefulness of video quizzing as it relates to completing homework, understanding concepts, improving Excel skills, studying for exams, reinforcing lecture lessons, allowing them to skip the lecture session, or being an overall effective learning resource?

Results

Independent samples t-tests were used to compare student perceptions of video quizzing usefulness after students’ first and third exams within each course type.
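For readers who wish to see how this style of analysis is computed, the following is a minimal sketch in Python, assuming NumPy and SciPy are available. It is not the authors' analysis code, and the Likert ratings in it are hypothetical values invented purely for illustration; it pairs an independent samples t-test with a pooled-standard-deviation Cohen's d of the kind reported below.

```python
# Illustrative sketch (not the study's actual analysis code): an
# independent samples t-test plus Cohen's d for two groups of
# hypothetical 4-point Likert ratings.
import numpy as np
from scipy import stats

first_exam = np.array([2, 3, 2, 3, 3, 2, 4, 3, 2, 3])  # hypothetical ratings
third_exam = np.array([3, 4, 3, 4, 4, 3, 4, 4, 3, 4])  # hypothetical ratings

# Independent samples t-test (equal variances assumed).
t_stat, p_value = stats.ttest_ind(first_exam, third_exam)

# Cohen's d using the pooled standard deviation (Cohen, 1988).
n1, n2 = len(first_exam), len(third_exam)
pooled_sd = np.sqrt(((n1 - 1) * first_exam.var(ddof=1) +
                     (n2 - 1) * third_exam.var(ddof=1)) / (n1 + n2 - 2))
cohens_d = (third_exam.mean() - first_exam.mean()) / pooled_sd

# Report in the t(df) = ..., p = ..., d = ... form used in this section.
print(f"t({n1 + n2 - 2}) = {t_stat:.3f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```

A negative t statistic under this ordering simply indicates that the second (third-exam) group's mean rating was higher, which is why several of the significant results below carry negative t values.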

For the in-person pre-COVID-19 course, student perceptions of video quizzing related to completing homework (t(62) = -3.337, p < .05), understanding concepts (t(62) = -4.211, p < .001), improving Excel skills (t(62) = -3.299, p < .05), studying for exams (t(62) = -3.999, p < .001), reinforcing lecture lessons (t(62) = -4.340, p < .001), allowing them to skip the lecture session (t(62) = 2.981, p < .05), and being an overall effective learning resource (t(62) = -4.712, p < .001) were significantly higher for those surveyed after their third exam than for those surveyed after their first exam. The effect sizes for completing homework (d = .91), understanding concepts (d = 1.16), improving Excel skills (d = .88), studying for exams (d = 1.06), reinforcing lecture lessons (d = 1.22), and being an overall effective learning resource (d = 1.27) all exceeded the threshold for a large effect size (d = .80); allowing them to skip the lecture session (d = .76) exceeded the threshold for a medium effect size (d = .50) (Cohen, 1988).

For the fully-online post-COVID-19 course, student perceptions of video quizzing related to allowing them to skip the lecture session (t(82) = 1.995, p < .05) were significantly higher for those surveyed after the third exam than for those surveyed after the first exam. The effect size for allowing them to skip the lecture session (d = .56) exceeded the threshold for a medium effect size (d = .50) (Cohen, 1988). There were no other statistically significant findings for the fully-online post-COVID-19 course, nor were there any for the in-person post-COVID-19 course, when comparing student perceptions of video quizzing usefulness after the first and third exams within each course type.

Independent samples t-tests were also used to compare student perceptions of video quizzing usefulness after the third exam across the three course types.

When comparing the in-person pre-COVID-19 course to the fully-online post-COVID-19 course, student perceptions of video quizzing related to allowing them to skip the lecture session (t(40) = 2.587, p < .05) were significantly higher for those surveyed in the fully-online post-COVID-19 course than for those in the in-person pre-COVID-19 course. The effect size (d = .81) was slightly higher than a large effect size (d = .80) (Cohen, 1988). There were no other statistically significant findings between these groups.

When comparing the in-person pre-COVID-19 course to the in-person post-COVID-19 course, student perceptions of video quizzing related to allowing them to skip the lecture session (t(58) = -5.661, p < .05) were significantly higher for those surveyed in the in-person pre-COVID-19 course than for those in the in-person post-COVID-19 course. The effect size (d = .71) exceeded the threshold for a medium effect size (d = .50) (Cohen, 1988). There were no other statistically significant findings between these groups.

When comparing the fully-online post-COVID-19 course to the in-person post-COVID-19 course, student perceptions of video quizzing related to completing homework (t(54) = 2.063, p < .05), understanding concepts (t(54) = 2.272, p < .05), improving Excel skills (t(54) = -2.179, p < .05), studying for exams (t(54) = 2.871, p < .05), allowing them to skip the lecture session (t(53) = -5.661, p < .001), and being an overall effective learning resource (t(53) = 2.764, p < .05) were significantly higher for those surveyed in the fully-online post-COVID-19 course than for those in the in-person post-COVID-19 course. The effect sizes for studying for exams (d = .83), allowing them to skip the lecture session (d = 1.68), and being an overall effective learning resource (d = .81) all exceeded the threshold for a large effect size (d = .80); completing homework (d = .60), understanding concepts (d = .53), and improving Excel skills (d = .66) exceeded the threshold for a medium effect size (d = .50) (Cohen, 1988).

Conclusion and Future Studies

The results of the analysis answered the two research questions. With regard to the first question, which asked how, if at all, student perceptions of video quizzing usefulness changed during the semester from after the first exam to after the third exam, it was noteworthy that the only course with multiple areas of significant difference was the in-person pre-COVID-19 course from Fall 2019, whose results indicated that students felt video quizzing was more useful across all items the further into the course they were. This seems a logical result: courses typically scaffold information and become more difficult as they progress, so materials could have been seen as more helpful as a result. However, aside from the ability to skip lecture sessions in the fully-online post-COVID-19 course, no significant differences existed within either the fully-online post-COVID-19 course or the in-person post-COVID-19 course. While this needs more investigation, it is possible that the changes between the Fall 2019 course structure and the Spring and Fall 2021 course structures produced a course that is far more consistent and evenly paced throughout, which in turn led students to view materials like video quizzing as equally useful throughout the semester.

With regard to the second question, which looked at differences in student perceptions across the three course types, it is worth pointing out that the only significant difference found among all three types was the perception that one was able to skip a lecture session by using the video quizzes. Among the course types, students agreed with this statement most in the fully-online post-COVID-19 course, followed by the in-person pre-COVID-19 course, and finally the in-person post-COVID-19 course. It is not surprising that the fully-online course, which offered lecture sessions as virtual synchronous video conference meetings, would be perceived as more easily skipped than an in-person course, given the similarity between attending a video conference and watching a video, as well as the wealth of content, structure, and access granted to students outside of sessions.

By contrast, the difference between the results of the two in-person courses is very surprising and went against researcher expectations. The in-person post-COVID-19 course offered students the same content as the fully-online course and provided more access and clarity outside of class than the in-person pre-COVID-19 course did; yet, despite students in the fully-online version agreeing at high rates that they could skip sessions, fewer students in the in-person post-COVID-19 course reported feeling they could skip sessions than in the in-person pre-COVID-19 course. This difference in the skip-session rating is interesting, as it is the only area with a significant difference between the two in-person course types, and, since the pre-COVID-19 course had no other significant differences in usefulness perceptions from either post-COVID-19 course, it is possible that this difference in perception is related to students having gone through the remote learning experience that COVID-19 brought about. Further study of feelings related to the value and replicability of the in-person experience among those who were forced into remote learning during the pandemic is recommended.

The other results worth noting are the numerous significant differences in perceptions of usefulness between the fully-online post-COVID-19 and the in-person post-COVID-19 courses, all of which were rated significantly higher by students in the fully-online course than the in-person one, with the exception of reinforcing lecture lessons, for which no significant difference was found. Since all course content and procedures were the same for the fully-online and in-person versions, with the one major change being the delivery of lecture sessions moving from a synchronous video conference to an in-person classroom, this seems to indicate that future research should focus on this modality change and how it could impact students' perceptions of video quizzing and other online course content. A third mode, not used in this study but gaining popularity, is the HyFlex model, in which some students attend in-person while others attend the same session synchronously through video conference systems. A future extension of this study that includes students attending in that model is planned.

While the results of this study seem to offer more questions than answers, they are no doubt a starting point for better understanding the impacts that the COVID-19 pandemic, the great pivot, and the year of remote learning have had on student perceptions of online content and the value of the in-person experience. Future research should focus on how class modality can impact feelings of being able to skip class sessions, as well as on gaining a better understanding of the relationship between course modality and perceptions of online content usefulness. Pre-COVID-19 studies likely still have lessons to teach us, but re-evaluating these areas in a post-COVID-19 environment may be needed. Areas with no significant difference between pre- and post-COVID-19 ratings are also promising, suggesting that some best practices developed prior to COVID-19 retain their worth and that not everything will have to go back to the drawing board as a result of a disruptive event that has changed so many other aspects of education, and of our lives in general.

References

Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.

Cummins, S., Beresford, A. R., & Rice, A. (2016). Investigating engagement with in-video quiz questions in a programming course. IEEE Transactions on Learning Technologies, 9(1), 57-65.

Delen, E., Liew, J., & Wilson, V. (2014). Effects of interactivity and instructional scaffolding on learning: Self-regulation in online video based environments. Computers & Education, 78, 312-320.

Gaynor, J., & Millham, J. (1976). Student performance and evaluation under variant teaching and testing methods in a large college course. Journal of Educational Psychology, 68(3), 312-317. https://doi.org/10.1037/0022-0663.68.3.312

Geri, N., Winer, A., & Zaks, B. (2017). Challenging the six-minute myth of online video lectures: Can interactivity expand the attention span of learners? Online Journal of Applied Knowledge Management, 5(1), 101-111.

Griswold, L. A., Overson, C. E., & Benassi, V. A. (2017). Embedding questions during online lecture capture to promote learning and transfer of knowledge. American Journal of Occupational Therapy, 71(3), 7103230010p1-7103230010p7.

Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. In Proceedings of the First ACM Conference on Learning at Scale (L@S 2014). Retrieved from http://groups.csail.mit.edu/ui....

Kleftodimos, A., & Evangelidis, G. (2018, April). Augmenting educational videos with interactive exercises and knowledge testing games. In 2018 IEEE Global Engineering Education Conference (EDUCON) (pp. 872-877). IEEE.

Lagerstrom, L., Johanes, P., & Ponsukcharoen, M. U. (2015). The myth of the six-minute rule: Student engagement with online videos. Proceedings of the American Society for Engineering Education, June 14-17, 2015, Seattle, WA. Retrieved from https://www.asee.org/public/conferences/56/papers/13527/download

Mayer, R. E., & Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia learning. Educational Psychologist, 38(1), 43-52.

Rice, P., Beeson, P., & Blackmore-Wright, J. (2019). Evaluating the impact of a quiz question within an educational video. TechTrends, 1-11.

Roediger, H. L., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17(3), 249-255.

Sumter, L. (2018, October). Exploring the Use of Video Interaction in an Online Environment Using "Kaltura Interactive Quiz". In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 1075-1078). Association for the Advancement of Computing in Education (AACE).

Zaromb, F. M., & Roediger, H. L. (2010). The testing effect in free recall is associated with enhanced organizational processes. Memory & Cognition, 38(8), 995-1008.

Zhang, D., Zhou, L., Briggs, R. O., & Nunamaker Jr, J. F. (2006). Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & Management, 43(1), 15-27.