This study attempted to determine whether placing videos in an asynchronous course influenced the learning experience. Data were examined for an introductory college statistics course, comparing results before and after the implementation of videos in support of discussions, assignments, homework, quizzes, and exams. The frequency of external tutoring was significantly reduced (by 40%) for the course sections that included embedded videos. This finding supports the idea that videos reduced the friction, or extraneous cognitive load, experienced by students, since using tutoring resources external to the course requires additional time and effort from both students and tutors. A significant majority of students agreed or strongly agreed that the videos helped them understand course concepts and perform course calculations. A significant majority of students also felt that course videos improved their knowledge of Microsoft Excel. However, there was not enough evidence to support the idea that videos improved student grades or reduced tutoring time for those students who required it. Recommendations for future research include repeating this study's methodology while accounting for gender, age, and ethnicity. Additionally, the methodology used in this study should be applied in both traditional and non-traditional college settings.
STEM courses are sometimes seen as barrier classes because they carry a higher perceived cognitive load. STEM courses typically use various types of software to enhance the learning process. However, course concepts can get “lost” for students who struggle with the tools used to solve problems. Extraneous (unnecessary) cognitive load, also known as “friction,” encountered by students who must use unfamiliar software leads to course withdrawals or less-than-optimal performance. Online courses add the further dimension of being asynchronous, and one of the challenges posed by distance learning is instructional presence. Videos can address issues posed in STEM courses, such as extraneous cognitive load and lack of instructional presence, by explaining concepts and demonstrating the procedures students need to be successful. The purpose of this study was to determine if embedded course videos could improve the learning experience in any meaningful way.
Introduction and Literature Review
Statistics is often considered a quantitative STEM barrier course (Currie, 2014), meaning it is required for many degrees and cannot be avoided. For example, many undergraduate business degrees require at least one statistics course. Barrier courses (also referred to as gatekeeper or gateway courses) typically experience high rates of failure and/or withdrawal. Because online courses already experience a higher withdrawal rate than in-person courses (Jaggars et al., 2013), course persistence is a particular concern for barrier courses taught in the online modality. Reducing course attrition may, in turn, improve degree persistence (McKinney et al., 2019).
While many complex and overlapping factors can influence student persistence in an online course, teaching presence is widely regarded as an important variable. There are data supporting a positive correlation between teaching presence actions (e.g., meaningful feedback and quality faculty-student interactions) and persistence in online courses (Gaytan, 2015; Ivankova & Stick, 2006; Ojokheta, 2010). While data for online STEM courses are lacking, there is evidence of this same trend in in-person STEM courses (Hegeman, 2015), so it is reasonable to hypothesize that the trend extends to online STEM courses. Furthermore, teaching presence is associated with reduced extraneous cognitive load (Kozan, 2015), and cognitive overload in online courses is associated with withdrawal (Tyler-Smith, 2006). Teaching presence has also been shown to promote academic performance in online courses (Jaggars & Xu, 2016) and to foster positive student perspectives (Orcutt & Dringus, 2017).
Instructor-developed multimedia content (e.g., course videos) can help establish teaching presence, which can increase engagement, promote student cognitive presence, and foster connectedness and community (social presence) (Borup et al., 2012; Dunlap et al., 2016; Seckman, 2018; Young II et al., 2014). Students support the use of instructor-generated course videos (Elliot et al., 2020; Rose, 2009; Valenti et al., 2019). While at face value instructor-developed videos may seem to promote only passive learning through student-content interaction, rewatching sections makes the learning active. If a student pauses the video and takes notes or otherwise engages with the content, it becomes constructive learning. If peers collaborate to construct meaning and draw connections from the video, it can then move into interactive learning. In fact, it has been argued that the primary benefit of course videos is students' ability to control the pace of delivery, allowing for active, constructive, and interactive learning (Hartsell & Yuen, 2006).
While the literature on the impacts of lecture videos in asynchronous online statistics courses is very limited, there is promising supportive evidence. Several studies demonstrated that students do access and use video resources focused on statistics concepts (Aragon, 2016; Dunn et al., 2015; Grant & Oerlemans, 2021). There is even evidence of active and constructive learning with instructor-generated statistics videos (Aragon, 2016; Dunn et al., 2015). Furthermore, a positive correlation has been reported between video use (including lecture videos, tutorial screencasts, and assessment videos) and learning outcomes, with the number of resources used correlating with summative assessment scores (Grant & Oerlemans, 2021). Supporting the influence of active engagement, the number of times a resource was accessed also correlated with the summative assessment score (Aragon, 2016; Grant & Oerlemans, 2021). While the evidence for an influence of video use on course performance is still tenuous in online undergraduate statistics, this relationship is also seen in other STEM disciplines (Schoenfeld-Tacher & Dorman, 2021), providing further support for a positive relationship.
Video style and quality are important variables to consider, though. Creating high-quality, impactful instructor-generated course videos is neither easy nor fast. The videos must account not only for andragogy but also for technological issues and communication skills (Aragon, 2016). Video style influences student satisfaction with videos in online STEM courses (Choe et al., 2019). Video duration is also important for enhancing attention and focus and for addressing time poverty (Glance et al., 2013; Valenti et al., 2019). Chunking content and segmenting videos can help reduce cognitive load and increase engagement with the video resources (Altinpulluk et al., 2020). In general, though, best practices for instructor-generated course videos are still being explored by researchers and practitioners (Valenti et al., 2019). Perceptions regarding video “best practices” may vary from discipline to discipline and culture to culture (Valenti et al., 2019). Attention should be placed on ensuring the videos promote cognitive engagement, as engagement may suffer relative to live lecture attendance in some STEM disciplines (Trenholm et al., 2019).
Our research had three main goals.
- Determine if the number of times students seek tutoring is reduced through the use of embedded course videos.
- Determine if embedded videos impact student grades.
- Determine if students perceive course videos as helpful for understanding course concepts, running calculations, and using readily available software such as Microsoft Excel.
Ha1. The use of embedded course videos and tutoring frequency are associated.
Ha2. Use of embedded videos in course design will result in higher student grades.
Ha3. Significantly more students will agree than disagree that embedded course videos helped them understand course concepts.
Ha4. Significantly more students will agree than disagree that embedded course videos helped them perform course calculations.
Ha5. Significantly more students will agree than disagree that embedded course videos improved their knowledge of Microsoft Excel.
The research included retrospective and survey methods. We examined the dependent variables of tutoring, grades, and survey data. Tutoring data were evaluated by examining 276 course sections of an undergraduate statistics course, comprising over 4,318 enrolled students over a 26-month period (the course included embedded videos for the latest 7 months; for the previous 19 months, videos were not embedded).
Final course grades for 163 students in six course sections were examined. Three course sections used embedded videos; three did not.
Lastly, end-of-course survey results from three course sections (that used embedded videos) asked specific questions on: 1) whether the videos helped students understand course concepts; 2) whether the videos helped students with computations; and 3) whether the videos improved student knowledge of Microsoft Excel. Faculty and students were unaware that a study was being done.
The Course Used in the Study
The statistics course used in this research was typically the first statistics course students took in their programs. The course was offered with and without embedded videos within the course modules. Both versions of the course used the same course description, learning outcomes, and Pearson MyStatLab software. Annual enrollments ranged between 2,000 and 2,500. Terms were 9 weeks long and were taught online. Courses consisted of one learning module per week and were offered (course start dates) every month except December. This term length is shorter than the usual 15- to 16-week timeframe found on many college campuses. A typical module included a discussion, an assignment, one to two homework chapters, and a quiz. Discussions, grading feedback, and announcements were the three primary methods faculty used to communicate with students. Discussions in particular were where the majority of faculty/student communication regarding course concepts was designed to take place. Since this course was foundational to upper-level courses in several degree programs, faculty from the College of Business requested that students use Microsoft Excel to complete some learning outcomes.
Students were supported by faculty available through e-mail, the Canvas messaging system, phone, and a synchronous video meeting option, as well as by an external tutoring service. The tutoring service had been in place for at least four years prior to the start of the data collection period. A concern was the growing number of tutoring sessions for this course and other math courses. The use of videos was one possible way to address students' felt need to look beyond the instructor for help understanding course materials.
Types of Videos Used
Videos were the independent variable in this study. The types of videos used can be broken down into two major categories. The first category is concept videos, discussing ideas such as random sampling, the normal curve, or types of data. These concept videos included diagrams to help illustrate abstract concepts. A focus was keeping the language simple and the examples easy to follow so as to build students' confidence that they could understand what was being taught. Most concept videos made use of the Learning Glass system, where the instructor faces the camera while drawing or writing on the glass, giving students the impression the instructor was looking at them. Some concept videos also used screen-share technology, with most of the screen covered by the diagram and a live picture of the instructor typically in a corner of the screen. Fourteen concept videos were locally recorded at professional quality.
The second type of video can be called “how to” videos, in which students were shown, step by step, how to run given tests using different types of software such as Excel or StatCrunch. These videos were developed using screen-share technology so the student could see what was happening at each step. At the same time, an image of the instructor was included to establish some instructional presence.
Both types of videos were placed in the specific modules where students would encounter the concept or procedure for the first time. Videos were kept short (2 to 7 minutes) and were designed to focus on one or two major points. Most videos were less than 5 minutes long and had a transcript located below the video so those with hearing disabilities could benefit from what was said. Eighteen “how to” videos were locally recorded at professional quality.
Videos have the advantage of being played at the student's convenience and replayed as often as needed. The locally produced videos described above were augmented by 22 additional videos sourced from the textbook publisher and the web.
Participants and Demographic Data
Data for course sections running from January 2020 through February 2022 were examined after the courses had ended. Students were typically in their mid-thirties. Approximately half were active-duty or reserve enlisted military personnel, and another 30% were military affiliated, including veterans on the GI Bill and spouses of military members.
Tutoring data were drawn from two separate periods: the first when the course did not use embedded videos (January 2020 through July 2021) and the second (August 2021 through February 2022) when the course used embedded videos.
Grade data were collected on 163 students (six course sections); of those students, 12% were female and 88% male. Females tended to have higher final course grades (M = 93.3, SD = 6.3) than males (M = 86.6, SD = 15.1), t(52.2) = 3.5, p < .001.
Survey data were collected from three sections of the statistics course for the August 2021 and October 2021 terms. Out of 86 students surveyed, 65 provided their perceptions on the usefulness of the videos (75.6% response rate).
Students self-enrolled in course sections, an important distinction with regard to learning styles and preferences. It should also be noted that, due to COVID restrictions, the course sections offered were exclusively online.
Design Evaluating the Use of Videos
The research design was a combination of post hoc and survey research. The main independent variable was the use of 54 embedded videos within the Canvas course. Thirty-two of the videos were produced by the course developer using Learning Glass or screen-share technology: 14 discussed course concepts and 18 were “how to” step-by-step screen-shared instructions on calculating statistics such as z or t scores using Microsoft Excel and StatCrunch. Additionally, 11 publisher and 11 web-based videos were used to illustrate concepts or examples. These videos were positioned in 7 of 9 modules in a uniform location (the second webpage in each module, called “Readings and Resources”). Videos ranged from 2 to 7 minutes long and were positioned where students would encounter a new concept or need to perform a new procedure (Truell, 2018). Videos were professionally developed and avoided the use of just a “talking head,” instead using the more visually appealing graphics offered by the Learning Glass or screen-share recording tools (Hofer, 2015).
The three dependent variables were: 1) frequency of tutoring; 2) student grades; and 3) student survey responses. The first two dependent variables had data from both before and after the implementation of embedded course videos. The survey results came from students surveyed after videos were embedded in the course.
Research methodology for this study was approved by the Institutional Review Board of the university where study data originated. The research was classified as “Exempt” (#22-075) prior to any data collection efforts.
Treatment of the Data
Evaluation of tutoring frequency was completed using Chi Square, comparing January 2020 through July 2021 (course sections without embedded videos) and August 2021 through February 2022 (course sections with embedded videos). Chi Square was used due to the nominal nature of the data (α = .05) to determine if embedded videos and tutoring sessions were related.
The impact of course videos on course grades was examined by comparing three sections of the course without embedded videos with three sections that used embedded videos (N = 163). A two-sample t-test was used to evaluate whether the means of the two groups were statistically different (Gay, Mills, & Airasian, 2009; Gould & Ryan, 2012).
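The fractional degrees of freedom reported in the Results, t(158.9), indicate the unequal-variance (Welch) form of the two-sample test. The sketch below shows how such a comparison runs in SciPy; the grade vectors are simulated, since the section-level data are not public, and the means, standard deviations, and group sizes are assumptions loosely informed by the reported group averages.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)

# Simulated final grades -- illustrative only; the location/scale/size
# values are assumptions, not the study's actual section data.
no_videos = rng.normal(loc=87, scale=13, size=82)
videos = rng.normal(loc=89, scale=12, size=81)

# equal_var=False selects Welch's t-test, which allows unequal group
# variances and produces fractional degrees of freedom (e.g., t(158.9)).
t_stat, p_value = ttest_ind(videos, no_videos, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

Welch's form is the safer default here because the reported standard deviations for the gender subgroups (6.3 vs. 15.1) suggest group variances in this course can differ substantially.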
End-of-course survey response data were examined for three sections of the course that used embedded videos. The three questions asked how helpful the concept, “how to,” and Excel videos were to students. A five-point Likert scale was used. Answers of “Strongly Agree” and “Agree” were combined into the “Agree” category; answers of “Neutral,” “Disagree,” and “Strongly Disagree” were counted as “Disagree” responses (Gay, Mills, & Airasian, 2009). Sixty-five students (a 75.6% response rate) answered three questions specific to the perceived usefulness of the videos. Data were tested using Chi Square with a Bonferroni-corrected alpha of .017 (.05/3) to avoid Type I errors regarding the null hypothesis. The Chi Square test was run to see if respondents answered the questions differently from expected responses (Triola, 2013).
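With responses collapsed to binary agree/disagree counts, each per-question test reduces to a one-degree-of-freedom goodness-of-fit Chi Square against an even split, judged at the Bonferroni-corrected alpha. A sketch using SciPy follows; the raw tallies are not reported in the paper, so the 59/6 split below is a hypothetical one that happens to reproduce the statistic of 43.2 reported in the Results.

```python
from scipy.stats import chisquare

alpha = .05 / 3  # Bonferroni-corrected alpha (~.017) across three questions

# Hypothetical agree/disagree tallies for one question (n = 65 respondents);
# the study reports only the test statistic, not the raw counts.
agree, disagree = 59, 6
n = agree + disagree

# Goodness-of-fit test against a 50/50 expected split
chi2, p = chisquare([agree, disagree], f_exp=[n / 2, n / 2])
print(f"chi2 = {chi2:.1f}, significant at corrected alpha: {p < alpha}")
```

Dividing alpha by the number of tests keeps the family-wise Type I error rate at roughly .05 across the three questions.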
A majority of the research hypotheses were supported in this study. Students in the course sections that used embedded videos tended to request fewer tutoring sessions, a statistically significant 40% reduction. There was not enough evidence to suggest student grades were positively impacted by the use of embedded course videos. However, a majority of students strongly agreed or agreed that videos: 1) helped them understand course concepts; 2) helped with running statistical tests; and 3) improved their knowledge of Microsoft Excel.
In this study, the variables of tutoring frequency and use of course videos were not independent. Course sections without course videos experienced a 23% rate of tutoring sessions (761 tutoring sessions out of 3,335 students). Course sections that used videos had a 14% rate (135 tutoring sessions out of 983 students), a 40% reduction. Chi Square analysis resulted in a statistically significant finding, χ²(1, N = 4,318) = 38.1, p < .001, supporting the conclusion that videos and tutoring frequency were not independent. The length of tutoring sessions was about 24 minutes for course sections both with and without embedded videos. Course sections with embedded videos had slightly shorter sessions, but these differences in tutoring session length were not statistically significant, t(8) = 0.38, p = .72 (α = .05).
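As a check on the reported statistic, the 2×2 contingency table implied by these counts can be tested directly. The sketch below uses SciPy; `correction=False` disables the Yates continuity correction so the result matches the uncorrected value of 38.1.

```python
from scipy.stats import chi2_contingency

# Rows: [no embedded videos, embedded videos]
# Columns: [sought tutoring, did not seek tutoring]
table = [[761, 3335 - 761],
         [135, 983 - 135]]

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2({dof}, N=4318) = {chi2:.1f}, p < .001: {p < .001}")
```

The expected counts returned by `chi2_contingency` (all well above 5) confirm the test's sample-size assumption is comfortably met.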
The comparison of student grades (N = 163) between the three course sections with embedded videos and the three without did not show a significant difference in end-of-course scores. Both the non-video and video groups achieved average final grades in the high 80s, separated by two percentage points, t(158.9) = 0.89, p = .37. Homework and quiz scores also did not support the idea that embedded course videos improved grades (α = .05).
Respondent Data on Video Usefulness
Student responses on end-of-course surveys indicated a perception that embedded course videos aided the learning process. Significantly more students agreed than disagreed that course videos helped them understand course concepts, χ²(1, N = 65) = 43.2, p < .001. More students agreed than disagreed that course videos helped them run statistical tests, χ²(1, N = 65) = 43.2, p < .001. Lastly, more students agreed than disagreed that course videos improved their knowledge of Microsoft Excel, χ²(1, N = 65) = 43.2, p < .001 (α = .017).
Open-area comments on surveys were provided by 21 of the 65 students who responded. Common themes included the idea that the concept and “how to” videos were very helpful. Seven comments were made regarding how the videos helped students learn course concepts. One student summarized this idea by stating: “Very helpful video tool to help understand the concept and course material. Absolutely the best tool”. Another student commented: “Interactive video lessons were very helpful to understand the course material each week. The best way to understand”. Finally, a student commented: “I loved these videos. They helped me greatly”.
Regarding the “how to” videos, eight student comments were noted. One student stated: “Being able to see the exact process to use was a big help. There's nothing better than being able to watch someone do it first, and then apply that process to your own problem.”
One student wrote: “I wasn't aware that Excel could do statistical calculations, but now I am!” Another commented: “videos on how to use "StatCrunch" are very insightful and useful”.
Six students made comments regarding the embedded videos demonstrating the use of Excel, which was requested by the College of Business. A student commented: “Excel seemed like a better tool to use if you needed to eventually present the information, whereas StatCrunch seemed like a good tool in general for statistical calculation”. Another student stated: “This class DEFINITELY taught me new ways to use excel and I will continue to use them in the future”. One student wrote: “I have learned a few new things, for example how to do SD (Standard Deviation), mean, Z score, and other stuff”. Finally, one student summed up their impressions by writing: “It (videos) helps a lot, especially the ones with EXCEL (sic)”.
Student tutoring frequency decreased by 40% when courses included embedded videos. This statistically significant finding supports the idea that embedded videos can lessen students' need to seek tutoring external to the support offered by the instructor in a course. It follows that if students do not feel the need to seek outside help while taking a class, their path to learning course concepts has less friction, or extraneous cognitive load. Less friction enables students to persist in completing the course.
There was no significant difference in end-of-course grades when comparing course sections with and without embedded videos. Students appeared to learn course concepts even when embedded videos were not present. Withdrawal rates were approximately 4% for both groups, a non-significant difference.
A significant majority of students in course sections that used embedded videos believed the videos helped them understand course concepts, calculate statistical tests, and use Microsoft Excel. Open-area student comments echoed these impressions regarding the videos' positive impact on learning course concepts, running statistical tests, and using Microsoft Excel. These results support the idea that videos reduced course friction and, at least in the minds of the students surveyed, enabled completion of the STEM (statistics) course.
Our study used both post hoc and survey data to determine if course videos enhanced the student learning experience in any meaningful way. Course grades were not significantly different between course sections with and without embedded videos, results similar to those reported by Muller (2017). Additionally, as mentioned in the discussion, withdrawal rates were approximately 4% for both groups, a non-significant difference. These results alone did not support the idea that embedded videos enhanced student performance.
However, course sections with videos appeared to present students with less friction, or extraneous cognitive load, while they learned how to use unfamiliar software to complete unfamiliar activities. Friction reduction, as measured by tutoring frequency and student perception, has a positive impact on persistence. Reducing tutoring frequency gave students more time to focus on what they needed to learn within the course while decreasing insecurities they might have had about the subject matter. This idea was supported by student perceptions indicating that most students believed the concept, “how to,” and Excel videos helped them learn the software and complete course tasks.
Videos also aid instructional presence. Students need to believe that an instructor cares about their learning experience. The “how to” videos were designed to take the mystery out of using software for specific calculations and interpreting the results. Many students are not (just) verbal learners but need to see examples of how to do things as part of their learning process. Students also had the option of replaying the videos as many times as required to learn the concept or task without embarrassment (Khan, 2013). Once students see how a process is done and how to interpret the results, course concepts are reinforced and fear (of not being able to grasp a concept or successfully run a test) is reduced or eliminated.
STEM courses such as math and statistics are typically viewed as “barriers” to completing a degree. Videos are an important tool that not only provides instructional presence but also reduces the extraneous cognitive load (friction) of learning what many find to be a difficult subject.
An obvious limitation of this study is term length. The course used in this research was 9 weeks long, shorter than the more typical 15- or 16-week semester found at many universities. Our research did not examine whether term length is a factor in the use of videos and their impact on student learning. Our results may not be applicable to courses offered in more traditional term lengths.
Course sections used in this research were online. The impact of videos on student learning was not assessed in face-to-face sections. It is possible that face-to-face course sections would see benefits similar to those of online courses with embedded videos; however, that analysis was not part of this research.
Although student gender was discussed with regard to end-of-course grades, student demographics such as gender, race, and age were not evaluated beyond that aspect. Future researchers should examine demographic data to compare and contrast student performance and perception.
The university used for this research had an exemplary record of teaching military and military-affiliated students, typically ranking in the top three in annual polls (U.S. News, 2022). The focus on military students is due to the fact that the distance learning arm of the university started on U.S. military bases. Fifty percent of the student body were active-duty military, and an additional 30% were military affiliated, such as veterans and military spouses. The population was mostly male. These demographics do not match those found on a typical university campus, where female students outnumber males. Additionally, most students in this research were working adults. The majority of the student body at many universities is typically between the ages of 18 and 22; the students in this study, however, were typically in their mid-thirties. In many cases, the students in this research had not taken a statistics course in years and could possibly need more scaffolding than recent high school graduates. For this reason, care should be taken in projecting our results onto typical campus settings with more traditional student demographics.
In the United States (as in most of the world), COVID impacted college students in unique ways, including the reduction or elimination of face-to-face sections in favor of more online sections. Additionally, students may have faced hardships that were not a normal occurrence, such as job loss, working from home for the first time, or family sickness. This added stress could have impacted results in ways not measured in this research.
Caution should be taken when examining the student perception findings in this study. It is best to view student perceptions presented here as the views of those who participated in the survey. Although these results support certain ideas, replication in future research should be accomplished to determine if trends exist.
The methodology in this research should be replicated in traditional and non-traditional college and university settings.
Future researchers should consider investigating the efficacy of embedded course videos in course sections using different term lengths.
An examination of student demographics such as race, gender, and military affiliation should be a key part of any replication.
Researchers should consider a comparison of course sections with embedded videos and traditional face-to-face courses to determine if there are any differences in grades, tutoring sessions, or student perceptions with regard to course concepts, “how to” demonstrations, or improvements in student knowledge of Excel.
We also noted that, in this study, withdrawal rates were similar between classes with and without embedded videos. Future, longer-term studies on the effectiveness of embedded course videos should compare withdrawal rates.
Any student survey data collected in future research should be regarded as the perceptions of those who chose to complete the survey. Although trends can emerge, researchers should be careful about generalizing survey results to larger populations.
An examination of the impact of embedded videos on discussion boards is also a critical area for future researchers since this aspect of online courses is where most interaction takes place.
As noted in the Methodology section of this study, data collection and analysis used in this research was approved by the Institutional Review Board prior to commencement. We have no conflicts of interest to disclose.
Altinpulluk, H., Kilinc, H., Firat, M., & Yumurtaci, O. (2020). The influence of segmented and complete educational videos on the cognitive load, satisfaction, engagement, and academic achievement levels of learners. Journal of Computers in Education, 7(2), 155–182. https://doi.org/10.1007/s40692...
Aragon, R. (2016). What has an impact on grades? Instructor-made videos, Communication, and Timing in an Online Statistics Course. Journal of Humanistic Mathematics, 6(2), 84–95. https://doi.org/10.5642/jhumma...
Borup, J., West, R. E., & Graham, C. R. (2012). Improving online social presence through asynchronous video. The Internet and Higher Education, 15(3), 195–203. https://doi.org/10.1016/j.ihed...
Choe, R. C., Scuric, Z., Eshkol, E., Cruser, S., Arndt, A., Cox, R., Toma, S. P., Shapiro, C., Levis-Fitzgerald, M., Barnes, G., & Crosbie, R. H. (2019). Student satisfaction and learning outcomes in asynchronous online lecture videos. CBE Life Sciences Education, 18(4), ar55–ar55. https://doi.org/10.1187/cbe.18...
Currie, L. W. (2014). Mathematics anxiety in adult undergraduate business students: A descriptive study [Ph.D., Capella University]. In ProQuest Dissertations and Theses. https://www.proquest.com/docvi...
Dunlap, J. C., Verma, G., & Johnson, H. L. (2016). Presence+experience: A framework for the purposeful design of presence in online courses. TechTrends, 60(2), 145–151. https://doi.org/10.1007/s11528...
Dunn, P. K., McDonald, C., & Loch, B. (2015). StatsCasts: Screencasts for complementing lectures in statistics classes. International Journal of Mathematical Education in Science and Technology, 46(4), 521–532. https://doi.org/10.1080/002073...
Elliot, L., Gehret, A., Valadez, M. S., Carpenter, R., & Bryant, L. (2020). Supporting autonomous learning skills in developmental mathematics courses with asynchronous online resources. American Behavioral Scientist, 64(7), 1012–1030. https://doi.org/10.1177/000276...
Gay, L. R., Mills, G. E., & Airasian, P. W. (2009). Educational research: Competencies for analysis and application (9th ed.). Pearson.
Gaytan, J. (2015). Comparing faculty and student perceptions regarding factors that affect student retention in online education. American Journal of Distance Education, 29(1), 56–66. https://doi.org/10.1080/089236...
Glance, D. G., Forsey, M., & Riley, M. (2013). The pedagogical foundations of massive open online courses. First Monday. https://doi.org/10.5210/fm.v18...
Gould, R. N., & Ryan, C. N. (2012). Introductory statistics. Pearson.
Grant, J. B., & Oerlemans, K. (2021). Supporting online learners in psychology: An analysis of the use of videos in an undergraduate statistics course. In J. Hoffman & P. Blessinger (Eds.), International Perspectives in Online Instruction (Vol. 40, pp. 9–24). Emerald Publishing Limited. https://doi.org/10.1108/S2055-...
Hartsell, T., & Yuen, S. C.-Y. (2006). Video streaming in online learning. AACE Review (Formerly AACE Journal), 14(1), 31–43.
Hegeman, J. S. (2015). Using instructor-generated video lectures in online mathematics courses improves student learning. Online Learning, 19(3), 70–87.
Hofer, M. (2015). 5 tips for creating inspiring course trailers. http://www.luminaris.link/blog/5-tips-for-creating-inspiring-course-trailers
Ivankova, N. V., & Stick, S. L. (2006). Students’ persistence in a distributed doctoral program in educational leadership in higher education: A mixed methods study. Research in Higher Education, 48(1), 93–135. https://doi.org/10.1007/s11162...
Jaggars, S. S., Edgecombe, N., & Stacey, G. W. (2013). What we know about online course outcomes. In Online Education and Instructional Technology. Community College Research Center. http://ccrc.tc.columbia.edu/pu...
Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers & Education, 95, 270–284. https://doi.org/10.1016/j.comp...
Khan, S. (2013, March 16). Let's use video to reinvent education [Video]. TED. https://youtu.be/DC58z4N0IWw
Kozan, K. (2015). The predictive power of the presences on cognitive load [Doctoral dissertation, Purdue University]. https://docs.lib.purdue.edu/op...
McKinney, L., Novak, H., Hagedorn, L. S., & Luna-Torres, M. (2019). Giving up on a course: An analysis of course dropping behaviors among community college students. Research in Higher Education, 60(2), 184–202. https://doi.org/10.1007/s11162...
Muller, D. (2011, March 17). Khan academy and the effectiveness of science videos [Video]. YouTube. https://www.youtube.com/watch?v=eVtCO84MDj8&feature=youtu.be
Ojokheta, K. (2010). A path-analytic study of some correlates predicting persistence and student’s success in distance education in Nigeria. Turkish Online Journal of Distance Education, 11(1), 181–192.
Orcutt, J. M., & Dringus, L. P. (2017). Beyond being there: Practices that establish presence, engage students and influence intellectual curiosity in a structured online learning environment. Online Learning, 21(3), 15–35.
Rose, K. K. (2009). Student perceptions of the use of instructor-made videos in online and face-to-face classes. MERLOT Journal of Online Learning and Teaching, 5(3), 9.
Schoenfeld-Tacher, R. M., & Dorman, D. C. (2021). Effect of delivery format on student outcomes and perceptions of a veterinary medicine course: Synchronous versus asynchronous learning. Veterinary Sciences, 8(2), 13. https://doi.org/10.3390/vetsci...
Seckman, C. (2018). Impact of interactive video communication versus text-based feedback on teaching, social, and cognitive presence in online learning communities. Nurse Educator, 43(1), 18–22. https://doi.org/10.1097/NNE.00...
Trenholm, S., Hajek, B., Robinson, C. L., Chinnappan, M., Albrecht, A., & Ashman, H. (2019). Investigating undergraduate mathematics learners’ cognitive engagement with recorded lecture videos. International Journal of Mathematical Education in Science and Technology, 50(1), 3–24. https://doi.org/10.1080/002073...
Triola, M. (2013). Statdisk. Pearson Education Inc. http://www.statdisk.org/
Truell, M. (2018). Best practices: Creating video course trailers. https://admin.trinity.duke.edu/communications/best-practices-creating-video-course-trailers
Tyler-Smith, K. (2006). Early attrition among first time eLearners: A review of factors that contribute to drop-out, withdrawal, and non-completion rates of adult learners undertaking eLearning programmes. Journal of Online Learning and Teaching, 2(2), 73–85.
U.S. News & World Report. (2022). Best online bachelor's program. https://www.usnews.com/education/online-education/compare?program_type=bachelors&usnews_id=133553
Valenti, E., Feldbush, T., & Mandernach, J. (2019). Comparison of faculty and student perceptions of videos in the online classroom. Journal of University Teaching and Learning Practice, 16(3), 71–92. https://doi.org/10.53761/1.16....
Young, W., II, Hicks, B. H., Villa-Lobos, D., & Franklin, T. J. (2014). Using student feedback and professor-developed multimedia to improve instructor presence and student learning. Journal of Teaching and Learning with Technology, 12–30. https://doi.org/10.14434/jotlt...