Abstract

The wash-rinse-repeat model of refreshing courses each term may be financially beneficial for the institution, but is it giving students the best educational experience? If the pre-flight check monitors only broken hyperlinks and refreshed dates, who is ensuring quality delivery? Many institutions proudly boast “excellence in teaching,” “student success,” “engaged learning,” and “student-centered” in their mission, vision, and value statements, yet once an online course is launched, oversight of that course often appears to run on autopilot for years.

Course design is an iterative process with feedforward loops such as student course reviews, peer observations, course grades, and self-reflections, to name a few. The process might involve a SWOT-style analysis including course challenges students identify, opportunities for improvement from peers, and reflection on course goals to capture all input. Instructional designers are key stakeholders in the process, providing information about campus resources and educational technology and pulling reports from the learning management system on previous course offerings to study student performance or course page usage.

Introduction

Prior to each term, hundreds of courses are loaded into college learning management systems (LMS) from master files, ready for “pre-flight checks.” A run through the checklist ensures that links still work and that assignment and exam due dates are refreshed for the new term; plug in the updated syllabus and the course is published without much work. It is a scalable model that works time and again for a higher education system that favors efficiency, effectiveness, and scalability. Wash. Rinse. Repeat.

What is missing is the human intervention and the iterative process that learns. While it is more cost-effective to build a master course shell that can be reloaded time after time, is that the best model for our students? When a course “runs itself” as a self-sustained offering, when is it quality reviewed? Does it serve our students’ career needs if the course is not routinely updated? “The adoption of best practice and standards for online courses helps to create a culture of intentionality with carefully constructed learning outcomes connected to engaged learning materials, systematic procedures and processes used throughout an online course’s life-cycle, and an overall focus on quality leading to ongoing evaluation and revision of online courses” (Martin et al., 2019, p. 35).

Feedback often triggers a reactive, even abrasive, response in the human mind (Boud & Molloy, 2013). These reactions stem from past experiences, often couched in the form of constructive criticism. The role of feedback is vital to improvement for both students and faculty, yet the process should be viewed as a dialogue or as loops instead of the “ping pong” described by Askew and Lodge (2000). Establishing the expectation that incoming data will be applied, rejected, or modified for course improvement is critical.

The term feedforward was specifically chosen for use in the title of this manuscript for its positive nature. In his book on feedforward, Hirsch (2017) describes six characteristics that make the process effective. The acronym REPAIR is provided as a scaffold: regenerates, expands, particular, authentic, impact, and refines.

Hirsch describes regeneration as promoting growth in areas of strength. For example, a faculty member is encouraged to share successes in the online classroom, either as training for peers or as a publication. Expansion notes areas for additions, rather than the negativity of feedback where problems are pointed out. Particular provides a faculty member with focused and ongoing support instead of feedback rubrics whose complicated measurements dump scads of information. Authenticity names the problem and the impact it has on the learner, whereas feedback often creates a “praise sandwich” to hide criticism. Impact is the plan for step-by-step change; feedback is focused on the data instead of the improvement. Refinement comes from various people with different viewpoints and skill sets instead of the top-down management structure often associated with feedback.

This article seeks to harness the sources of information, or existing methods, for redesigning existing online courses prior to relaunch each term. With the buzzwords of student success, engaged learning, and student-centered classrooms proliferating higher education mission statements or value statements, there must be a lot of people interested in supporting excellent teaching!

The ugly truth is that the “wash-rinse-repeat” model is cheaper. Universities invest a lot of resources into developing online courses, and systematically reevaluating them after each term is not cost-effective. Yet it is the right thing to do for students’ education, and therefore we must ensure it is completed.

Literature Review

The triggering event for a course redesign can be defined by a certifying body, by the institution, or by the instructor. Some institutions might offer an online course for a set number of iterations before a review is triggered. Other institutions may allow faculty to review and update the course as needed without a formal re-review period. Then there are course quality standards, such as Quality Matters, that trigger reviews of certified courses every five years.

Continuous improvement is a common ongoing process used to improve courses, programs, and institutions within higher education. The guiding question of a course redesign is: “What will be the advantages (for both students and faculty) of the redesign?” (Vaughan, 2010, p. 62). If the course instructor was not particularly well trained, this might be an opportunity to offer them development opportunities in active learning and/or the use of engagement tools.

When educators across the planet were forced into the online classroom, some probably learned the hard lesson that lectures cannot simply become podcasts and assessments cannot all become multiple-choice exams delivered asynchronously. As Keast (2020) shares, “something is lost in the translation to the online environment. An award-winning face-to-face course is stunning for a reason: it has connections among students and student to professor” (p. 1). Emulating those courses in the online classroom necessitates changes to the course. Transforming the higher education classroom into an engaging student-centered learning environment is one way to enhance the higher-order cognitive skills required for success after college (Arum & Roksa, 2010). We still ask the questions and include discussions; however, we are more strategic about the use of questions, assessments, and the sequencing of learning activities.

Most institutions of higher education have resources to help instructors transition to online learning. Orlando (2019) discusses the shift from course developer to content curator: “A faculty member’s value is being able to identify the best content available to present in a way that produces understanding” (p. 18). A great resource for the faculty member might be a librarian or other media specialist who can help locate materials.

There are plenty of obstacles for instructors to mitigate while creating or updating an online course. Cho and Berge (2002) name ten:

1. Technical expertise
2. Administrative structure
3. Evaluation/effectiveness
4. Organizational change
5. Social interaction and quality
6. [Feeling] threatened by technology
7. Access
8. Faculty Compensation and Time
9. Student support services
10. Legal issues

While many institutions offer robust training for faculty to teach online, Cho and Berge’s list creates a daunting instructional task for online learning centers.

Instructors can do their best to learn the technical skills, create meaningful online social interactions, reduce their own (and their students’) feelings of being threatened by the technology, and work through access issues as best as possible. However, the instructional designer or digital learning office should be a hub of resources that addresses the barriers identified by Cho and Berge.

The California State University launched a Course Redesign with Technology (CRT) program in 2013 aimed at spurring innovation in teaching and learning across its 23 campuses. Faculty who applied to the program attended a five-day summer institute to launch a course redesign project. Attendees were introduced to pedagogical models and technologies, previous CRT participants’ redesign projects, active learning principles, universal design, inclusive teaching, open-source content, and developing student learning outcomes. In their 2019 report, Fernandes et al. include narratives from the participants’ self-evaluations such as “this project then, is as much about redesigning me, as it is redesigning my class” (p. 39). The instructors used many of the tools provided during the training, as well as studying the performance of previous students to anticipate the learning needs of future students. As Brené Brown (2011) said, they had to “embrace the suck…by walking through vulnerability to get to courage” to make necessary changes for student success.

A study of pre-designed courses by Smaldino and Yamagata-Lynch (2015) found that instructors desired the ability to customize courses for students’ learning needs and their teaching styles. The participants suggested that instructors be provided with the ability to customize more aspects of a pre-designed course, provided with both good and poor examples of student assessments, and offered more resources while teaching online courses so they felt more connected to the purpose of the course within the overall degree program.

The model introduced by Stanton and Bradley (2013) claims to be the only published process of its kind, and that appears to be true. Their six steps are:

  1. Identify issues and successes from the assessment results.
  2. Analyse issues to identify root-cause problems.
  3. Collect and brainstorm solutions to root-cause problems.
  4. Select solution(s) to implement.
  5. Re-organise course from scratch or modify previous offering.
  6. Re-assess course at next offering.

The researchers are engineers and operate with a solution-based mindset. While their model provides structure to the process, it lacks clarity about which assessments to use and where to obtain them.

A 2005 experiment at Elon University utilized student membership on a course redesign task force. Students were given equal footing with instructors on the committee after the initial dominating power structure was dismantled. There were challenges to the process, but once instructors and students realized they were both ready to give input and to listen to others’, the work moved along. Rob Kelly (2011) advised being humble about the expected outcomes and anticipating frustrations and power struggles when students collaborate with faculty.

Information for course redesign can come from previous course grades and assessments, work with instructional designers, a SWOT analysis, analysis of the course and module goals/learning objectives, a peer observation, advice solicited from current students for future students, and student course evaluations. “Assessment information from online courses can assist the faculty in making decisions about students’ attaining the learning outcomes, diagnosing problems with student learning in specific areas, providing targeted feedback or additional scaffolding to students, and making summative judgements pertaining to grades or retention” (Martin et al., 2019, p. 36). A combination of these data will inform a robust course revision.

In this study, I have developed a provisional model for reviewing courses before relaunch each term to ensure that students are at the center of our pedagogy, not only in our terminology but also in our actions. The flexible model includes opportunities for multiple inputs into the review process when available.

Methodology

The redesign-relaunch process should include a variety of input from former students, colleagues, instructional designers, librarians, student services, and others. These multiple sources provide key data for continually improving course activities, the delivery of content, and the ability to meet the diverse needs of students in the course. This is the first application of Hirsch’s (2017) model: refinement draws on multiple sources, not simply top-down mandates.

Previous grades and assessments in the course

The learning management system (LMS) is likely to hold a wealth of information about the actions of previous students. Nearly every action taken by a student in the online course creates a digital event in the LMS. Think of the LMS as “large repositories of data that can be used to inform practice” (Gazza, 2015, p. 291). Create reports each semester informing the instructor about the following data:
1) Which pages in the course had the highest number of views?
2) What was the average time students watched videos?
3) How much time did students spend on various pages such as modules and assignments?
4) How many times did students log in per week?
5) Which assessments garnered the lowest grade and widest range of scores? Could you scaffold them?
6) Did a specific assessment generate more student questions? How could the directions, or the assessment itself, be altered for clarity?

These analyses of the data are an application of Hirsch’s (2017) authenticity principle, as they describe the effect of the problem on the students.
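Assembling such a report does not require specialized analytics software. The following is a minimal sketch, assuming the LMS can export page-view and gradebook data to CSV; the file names, column names, and use of pandas are illustrative assumptions rather than any particular LMS’s reporting interface.

```python
# Illustrative term-end report built from hypothetical LMS exports (column names are assumptions).
import pandas as pd

# page_views.csv: one row per page visit -> columns: student_id, page_title, minutes_on_page, week
views = pd.read_csv("page_views.csv")

# gradebook.csv: one row per submission -> columns: student_id, assessment, score, points_possible
grades = pd.read_csv("gradebook.csv")

# 1) Which pages drew the most views, and how long did students stay on them?
page_report = (views.groupby("page_title")
                    .agg(views=("student_id", "count"),
                         avg_minutes=("minutes_on_page", "mean"))
                    .sort_values("views", ascending=False))

# 4) How many times did students visit the course per week, on average?
weekly_visits = (views.groupby(["student_id", "week"]).size()
                      .groupby("week").mean())

# 5) Which assessments had the lowest average percentage and the widest spread of scores?
grades["pct"] = grades["score"] / grades["points_possible"] * 100
assessment_report = (grades.groupby("assessment")["pct"]
                           .agg(["mean", "min", "max"])
                           .assign(range=lambda d: d["max"] - d["min"])
                           .sort_values("mean"))

print(page_report.head(10))
print(weekly_visits)
print(assessment_report.head(5))
```

Assessments that surface near the top of the last table, with low averages and wide ranges of scores, are natural candidates for the scaffolding question above.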

Is the workload evenly spread over the term? Consider using a course workload estimator like the free online version offered by the Center for Teaching Excellence at Rice University (www.cte.rice.edu/workload). The balance will provide students a “rhythm” to the course instead of the uneasy feeling that they have forgotten something.
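A rough back-of-the-envelope calculation can also reveal whether that rhythm exists. The sketch below is not the Rice estimator; the per-page and per-post rates are placeholder assumptions an instructor would replace with the estimator’s figures or their own experience.

```python
# Rough weekly workload estimate (all rates below are placeholder assumptions, not the Rice estimator's).
READING_MIN_PER_PAGE = 3      # dense textbook reading
WRITING_MIN_PER_PAGE = 60     # drafted and revised formal writing
DISCUSSION_MIN_PER_POST = 30  # reading peers' posts and composing a reply

# week -> (reading pages, writing pages, discussion posts, video minutes)
weeks = {
    1: (30, 0, 2, 20),
    2: (45, 2, 1, 40),
    3: (25, 5, 0, 0),
}

for week, (reading, writing, posts, video) in weeks.items():
    total_min = (reading * READING_MIN_PER_PAGE
                 + writing * WRITING_MIN_PER_PAGE
                 + posts * DISCUSSION_MIN_PER_POST
                 + video)
    print(f"Week {week}: about {total_min / 60:.1f} hours")
```

A week that lands far above its neighbors is a candidate for rebalancing before relaunch.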

Something that motivates students is the authenticity of an assessment. An assessment is authentic when “students engage in tasks that discipline specialists perform in order to understand the concepts of the discipline” (Edgerton, 2001, p. 32). These assessments are often complex and offer students a real-world application of the lecture and reading. From a student’s perspective, the content becomes relatable, viable, and therefore relevant. A challenge for the instructor is to provide a grading rubric with descriptive anchors for full, partial, and no points. A couple of real-world examples might help define the target for students. When students present their final products, can their work be shown in a discussion forum for their peers as a “gallery walk” that includes a discussion of the innovations and ideas?

The old adage that we are too close to something to see the obvious is often true. Assisting a faculty member with a few ideas of what to look for in the LMS reports, how to generate reports, and what the data might imply is important to consider. The conversation with the instructor might trigger a memory about a challenge students had on an assessment, discussion, or sequencing of material.

Instructional designers

Instructional designers are essential to the redesign and relaunch process. While in times past faculty might have taken issue with input from non-faculty into the design of a course, that is not the case now! “Many faculty members were comfortable sharing, discussing, and debating course redesign concepts but often a greater effort was required to transfer these new ideas into practice” (Vaughan, 2010, p. 62). Supporting instructors is the application of Hirsch’s (2017) concept of particular from his REPAIR scaffold mentioned earlier.

Instructional designers are likely to know about the newest educational technology tools available, teaching practices of other faculty that might benefit students, and helpful templates for university student resources, and they can assist with accessibility, inclusivity, and universal design in course structure and layout. Banner and Magruder (2022) concluded that additional collaboration is needed in the consultation process with other staff such as course support specialists and multimedia technicians. That work enabled instructors to be more successful in implementing tools for the relaunch process.

Instructors can easily utilize a few tools such as the Microsoft Accessibility Checker, found under “Inspect Document” in the Info menu of the File tab. Other quick implementations would be the use of institutional resources for students such as academic tutoring, the 24/7 technology help desk, the online writing lab, veterans’ services, accessibility services, the advising office, peer mentoring, the food pantry, financial aid, a link to the library, mental health services, the bookstore, testing services, career services, the registrar, accounting/bursar, and whatever other services students might need to succeed in a course. The instructional design team will often have an existing webpage with these services that instructors can insert or link to from a course.

Automated tools working in the background can aid both students and faculty. One such tool, Dropout Detective, is set up with parameters at the beginning of the course using risk indicators. The tool analyzes students’ behavior in the course and works with time stamps and the gradebook to help identify students at risk. The level of risk can be set by the instructor, as can the various metrics for the tool to measure, such as last login, late/no assignment submission, current grade, and more. An automated message can be sent to both student and instructor once a threshold is met. Tools like Dropout Detective are but one piece of the solution to an evolving landscape of student retention in online education. The instructional design staff at the institution is often the best informed about tools available for instructors to use for engaging students, student collaboration, and active learning.
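Dropout Detective’s internals are proprietary, but the general idea of threshold-based risk flagging can be sketched in a few lines. In this illustrative example, the indicators, thresholds, and data fields are assumptions for demonstration, not the product’s actual parameters:

```python
# Generic threshold-based risk flagging (indicators and thresholds are illustrative assumptions).
from dataclasses import dataclass
from datetime import date

@dataclass
class StudentActivity:
    name: str
    last_login: date
    missing_assignments: int
    current_grade_pct: float

# Instructor-adjustable thresholds.
MAX_DAYS_SINCE_LOGIN = 7
MAX_MISSING_ASSIGNMENTS = 2
MIN_GRADE_PCT = 70.0

def risk_flags(student: StudentActivity, today: date) -> list[str]:
    """Return the list of risk indicators this student currently trips."""
    flags = []
    if (today - student.last_login).days > MAX_DAYS_SINCE_LOGIN:
        flags.append("no recent login")
    if student.missing_assignments > MAX_MISSING_ASSIGNMENTS:
        flags.append("missing assignments")
    if student.current_grade_pct < MIN_GRADE_PCT:
        flags.append("low current grade")
    return flags

roster = [
    StudentActivity("Student A", date(2024, 3, 1), 0, 88.0),
    StudentActivity("Student B", date(2024, 2, 10), 4, 61.5),
]

for student in roster:
    flags = risk_flags(student, today=date(2024, 3, 8))
    if flags:
        # In a real tool, this is where automated messages to student and instructor would be sent.
        print(f"{student.name} is at risk: {', '.join(flags)}")
```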

The instructional designer is also the quality control for online teaching and learning. They often review courses using a rubric such as Quality Matters (https://www.qualitymatters.org/qa-resources/rubric-standards/higher-ed-rubric), OSCQR (https://oscqr.suny.edu/), the Online Learning Consortium’s Quality Scorecard (https://onlinelearningconsortium.org/consult/olc-quality-scorecard-suite/), or the Indicators of Engaged Learning Online rubric studied by Bigatel and Edel-Malizia (2018). Each rubric has its particular merits, and many institutions implement one for quality review before a course is offered. McGahan et al. (2015) developed their own course evaluation tool, yet recommend that most universities adopt a pre-built instrument by reviewing several options, testing each on courses with various reviewers, and then evaluating each tool’s strengths and weaknesses before adoption.

SWOT analysis

I borrowed the SWOT or TOWS technique from the business field. Essentially, four easels are placed around the room with large paper and markers on each easel. At the top of each paper is written one of the following words: strengths, weaknesses, opportunities, or threats. Participants in the activity either collaborate to bounce ideas off one another or individually walk around adding contributions to the four themes about the company.

The adaptation of this for a course could use a Google Doc or a survey near the end of the course to gather specific data on strengths, weaknesses, opportunities, and threats/challenges. The activity should be more open-ended than a typical course evaluation, and the survey can be administered anonymously in most LMSs.

To help students understand the purpose of the various quadrants, perhaps craft a meaningful anchor instead of the singular word.

What things does this course do well?
What makes this course better than other online courses?
What things does this course lack?
What are the limitations of this course due to resources?
What topics need to be added or removed from the course?
What was your attitude toward this topic at the beginning vs. now?

This information takes a different form than data found in other sources and, depending on the nature of the questions used with students, provides opportunities for instructor growth and course improvement.

Analyzing the course and module goals/learning objectives

I will take a moment to figuratively put on my Quality Matters hat and remind the reader that good practice begins with backward design. The course objectives, module objectives, learning activities, and assessments fall into alignment when purposefully designed in reverse order. The common location for this information is a course map or chart of some kind showing how objectives connect to activities, readings/materials, and assessments.

Many faculty write their assessments and course objectives first, then work to find instructional material to teach the content and activities that lead students to the assessments and the learning objectives. In the process, instructors forget a few key steps:

  1. Does the assessment measure the module learning objective? If not, then consider re-writing the module learning objective or the assessment to gain congruence.
  2. Is the module learning objective actually the learning activity? For instance, the learning objective is “students recite the main events of the U.S. Civil War.” The activity would be “students role play to learn about main events in the U.S. Civil War.”
  3. The assessment, activity, course objective, and module objective should be at, or near, the same level within Bloom’s Taxonomy. When writing module-level objectives, the verbs should be at, or below, the course-level verb’s location on Bloom’s Taxonomy; the module objectives lead up to and support the course objectives. Help instructors craft clear, observable, concise, and measurable learning objectives that align closely within Bloom’s Taxonomy (a small illustrative check follows this list).
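To make the Bloom’s alignment step concrete, here is a small hypothetical check. The verb-to-level mapping is a simplified assumption, and a real review would rely on the instructor’s judgment rather than string matching.

```python
# Hypothetical alignment check: module-objective verbs should sit at or below the
# course-objective verb on Bloom's Taxonomy (the levels and verb list are simplified assumptions).
BLOOM_LEVEL = {
    "remember": 1, "recite": 1, "define": 1,
    "explain": 2, "summarize": 2,
    "apply": 3, "demonstrate": 3,
    "analyze": 4, "compare": 4,
    "evaluate": 5, "critique": 5,
    "create": 6, "design": 6,
}

def objective_verb_level(objective: str) -> int:
    """Return the Bloom level of the first recognized verb in an objective."""
    for word in objective.lower().split():
        if word in BLOOM_LEVEL:
            return BLOOM_LEVEL[word]
    raise ValueError(f"No recognized Bloom verb in: {objective!r}")

course_objective = "Analyze the causes and consequences of the U.S. Civil War"
module_objectives = [
    "Recite the main events of the U.S. Civil War",
    "Compare the economies of the North and the South",
    "Design a museum exhibit on Reconstruction",   # flagged: sits above the course-level verb
]

course_level = objective_verb_level(course_objective)
for objective in module_objectives:
    if objective_verb_level(objective) > course_level:
        print(f"Check alignment: {objective}")
```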

Are there hard scaffolds in the course (pre-placed optional aids for students to use if needed, such as APA or MLA style guides)? Can soft scaffolds be added to the course (points where, if a student is struggling, they can reach out to a research librarian, the course instructor, or a technology specialist on campus for specific help)? Consider adding a column for scaffolding to the previously mentioned course design map so that these become a consideration instead of an afterthought.

Does the course utilize active learning principles or are students passive recipients of knowledge? Are the students cognitively engaged in the course? Are the students behaviorally engaged with the course? Are the students emotionally engaged in the course and with other learners? Asking these questions during the objectives, activities, and assessments step is a perfect opportunity to integrate active learning principles.

By examining the course and module objectives, the opportunity arises to check for the three presences of the Community of Inquiry framework: social, teaching, and cognitive. The social and teaching presences are perhaps more obvious in the activities and assessments, while the cognitive presence is more visible in the learning objectives/goals for the course. Checking the quality of the learning objectives is critical for the cognitive presence, which is closely linked to the concept of critical thinking (Vaughan, 2010, p. 61).

Martin et al. (2019) used a series of 14 questions to facilitate interviews with award-winning online instructors (N = 8). Four of the questions, each with its focus area, are provided here:

Design - How do you design your course?
Assessment - How do you assess your students?
Facilitation - How do you teach the course?
Evaluation - How do you evaluate whether your course is meeting your intended outcomes?

The questions helped Martin et al. (2019) conclude that organizing by weeks/modules and using backward design was a common recommendation among all interviewees. In terms of assessing students, the eight instructors suggested using a variety of traditional and authentic assessments that are constructive in nature and mimic real-world tasks, providing meaningful and timely feedback, and having weekly assignments such as projects, portfolios, self-assessments, peer evaluations, discussions, reflections, and quizzes with immediate feedback. Their work also recommended videos to enhance learner attention, recall, and perceived learning.

Colleague or peer observation

Ask a colleague to review an empty shell of the course. “Award-winning online faculty are an untapped source of useful insights and practices on how to best design and deliver effective online courses” (Martin et al., 2019, p. 34). Search across the institution for champions of online teaching and have a few ready to recommend to faculty. This is the regeneration application of Hirsch’s (2017) model of feedforward.

There are many forms and models of what an observation can look like, yet the choice is often up to the course instructor. One model from Penn State looks at many aspects of the course; the form is located online at https://facdev.e-education.psu.edu/evaluate-revise/peerreviewonline. The strength of this form is how it guides the reviewer through the process: it identifies good traits in online courses, provides short examples of what each might look like in the course, and notes where evidence might be found. There are also hyperlinked research and resources for the reviewer to use as suggestions for improvement back to the faculty member.

I created an evaluation form utilizing elements of the Penn State model, pulling ideas from the Quality Matters rubric, and organizing the form around the policies, course set-up and organization (landing page), course set-up and organization (classroom set-up), student-faculty interaction (climate and community building), student-faculty interaction (discussion thread interactions), student-faculty interactions (organization), student-student interactions, and student-content interaction (assessment and activities). Similar to the Penn State model, my concept utilizes a simple yes/no column along with a column for comments and suggestions. It is available for use at https://tinyurl.com/5cezsfn6.

In this post-pandemic/endemic world, consider asking reviewers to address whether the instructor built a community of learners so students feel a sense of belonging, following Maslow. Is there safety and security? How is that guaranteed or communicated? Where do students find a welcoming and encouraging sense of belonging in the course? Is there space for students to discuss off-topic material, such as a “Coffee Break” discussion forum? Students need a space to talk to their peers about non-course-related material. Even now, when lockdowns are not as prevalent, some students remain hesitant to venture out into their community beyond the necessity of groceries and medical appointments. Allowing social interaction to occur remains an important part of the online course.

Advice from previous students

In a recent course review for Quality Matters, I came across a particularly interesting component within a “Start Here” module: advice from previous students to the current students starting the course. The course instructor had received the advice from outgoing students in previous semesters through an optional anonymous course survey administered in the LMS before finals week. He crafted a few key questions to generate their responses:
1) What I suggest you do to succeed in the course…
2) What I would do differently if I took this course over…
3) The thing that helped me succeed was…

The course instructor curated the list of ideas into the component for the next semester’s students, as well as keeping previous students’ suggestions. It was a running “wiki-page” for students beginning the course. This addition to the course is the application of Hirsch’s (2017) expansion principle.

Student course evaluations

Students are best at telling it like it is. This is actual feedback, not feedforward as the manuscript title suggests, yet the information is incredibly helpful. “The best measure of goal achievement is the response from student learners who actually engage in the course” (Gazza, 2015, p. 292). Morris et al. (2014) concluded that students were reliable evaluators of an instructor’s engagement in the online course. Besides the instructor, the students spend the most time inside the course. Their knowledge of how the course worked, and how it didn’t, is essential to its improvement.

Who on campus is responsible for releasing the student course evaluations? Are faculty allowed to author special questions for their courses? If so, consider asking to include specific questions that will help generate more helpful information about the course.

1) What aspects of the course met your expectations?
2) Which aspects fell short of your needs?
3) What needs were not met?
4) Which assessments were most helpful to attaining the course objectives?
5) Which assessments were not helpful?
6) What aspects of the course were most confusing and should be clarified?
7) What did you enjoy most about the course?

Students are used to receiving feedback from instructors on assessments in a course, so instructors should in turn be open to comments from students that improve the course. Hounsell (2007) identified four characteristics of sustainable feedback for students:

1) involving students in dialogues about learning which raise their awareness of quality performance;
2) facilitating feedback processes through which students are stimulated to develop capacities in monitoring their own learning;
3) enhancing student capacities for ongoing lifelong learning by supporting student development of skills for goal setting and planning their learning; and
4) designing assessment tasks to facilitate student engagement over time in which feedback from varied sources is generated, processed and used to enhance performance on multiple stages of assignments.

These characteristics help to shape a culture that embraces assessment instead of resisting it. The instructor and students, alongside many others, work to create a better course for everyone.

Conclusion

There are multiple channels of data that could inform course redesign before relaunch each term. This manuscript identified seven sources, yet there are likely others. As technology advances and the science continues to support student-centered learning, we must engage in continuous improvement of online courses prior to relaunch each term. For Apple to create a new iPhone for the market and then never update it to keep it safe, secure, and working properly is unthinkable. The same thinking applies to our best online courses. We invest immense resources in their creation, quality reviews, and pre-launch checks, only to use the course time and again with modest thought to the user experience until a preset time triggers a redesign process. Let’s commit to our students that they are worth an investment of more than just money: our time to keep the course up to date, drawing on the seven sources of information mentioned in this manuscript, and to keep publishing about our research and innovations!

References

Arum, R. & Roksa, J. (2010). Academically Adrift: Limited Learning on College Campuses. University of Chicago Press.

Askew, S. & Lodge, C. (2000). “Gifts, Ping-Pong and Loops – Linking Feedback and Learning.” In Feedback for Learning. (Ed. Susan Askew). RoutledgeFalmer.

Banner, D. & Magruder, O. (2022, March 2). Course design innovation, iteration, evaluation: It’s an evolution, baby [Conference presentation]. TOPkit Workshop 2022, virtual. https://app.socio.events/MTI3N...

Bigatel, P. M., & Edel-Malizia, S. (2018). Using the “Indicators of Engaged Learning Online” framework to evaluate online course quality. TechTrends, 62, 58-70. www.doi.org/10.1007/s11528-017-0239-4

Boud, D. & Molloy, E. (2013). Rethinking models of feedback for learning: The challenge of design. Assessment & Evaluation in Higher Education, 38(6), 698-712. www.doi.org/10.1080/02602938.2012.691462

Cho, S. K., and Berge, Z. L. (2002). Overcoming barriers to distance training and education. Education at a Distance - USDLA Journal, 16(1). https://www.learntechlib.org/p/93565/

Edgerton, R. (2001). Education White Paper. Report prepared for the Pew Charitable Trusts, Pew Forum on Undergraduate Learning. Washington, D.C.

Fernandes, K., Christie, B., Bayard, J., & Kennedy, L. (2019). Large-scale course redesign: Putting reflection into action. Change: The Magazine of Higher Learning, 51(3), 34-43. www.doi.org/10.1080/00091383.2019.1606587

Gazza, E. A. (2015). Continuously improving online course design using the Plan-Do-Study-Act cycle. Journal of Online Learning and Teaching, 11(2), 291-297. Retrieved from https://jolt.merlot.org/currentissue.html

Hirsch, J. (2017). The Feedback Fix: Dump the Past, Embrace the Future, and Lead the Way to Change. Rowman & Littlefield.

Hounsell, D. (2007). Towards more sustainable feedback to students. In D. Boud & N. Falchikov (Eds.), Rethinking assessment in higher education (pp. 101-113). Routledge.

Keast, D. (2020). Good morning! I’m technology and I’ll be your instructor for this course. International Journal on Innovations in Online Education, 4(3). www.doi.org/10.1615/IntJInnovOnlineEdu.2020035554

Kelly, R. (2011). Improving students’ academic engagement by collaborating with them on course redesign. Recruitment and Retention in Higher Education, 25(7), 3-6.

Martin, F., Ritzhaupt, A., Kumar, S., & Budhrani, K. (2019). Award-winning faculty online teaching practices: Course design, assessment and evaluation, and facilitation. The Internet and Higher Education, 42, 34-43. www.doi.org/10.1016/j.iheduc.2019.04.001

McGahan, S. J., Jackson, C. M., & Premer, K. (2015). Online course quality assurance: Development of a quality checklist. InSight: A Journal of Scholarly Teaching, 10, 126-140.

Morris, R. C., Parker, L. C., Nelson, D., Pistilli, M. D., Hagen, A., Levesque-Bristol, C., and Weaver, G. (2014). Development of a student self-reported instrument to assess course reform. Educational Assessment, 19, 302-320. www.doi.org/10.1080/10627197.2014.964119

Orlando, J. (2019). Top online course design mistakes. In Special Report: Online Course Design – 11 Strategies for Managing Your Online Courses. (p. 15-18) Faculty Focus by Magna Publications.

Smaldino, S. E. & Yamagata-Lynch, L. (2015). The course-in-a-box: Design issues. TechTrends, 1-77.

Stanton, K. C. & Bradley, T. H. (2013). From course assessment to redesign: A hybrid-vehicle course as a case illustration. European Journal of Engineering Education, 38(6), 687-699. www.doi.org/10.1080/03043797.2013.826181

TED. (2011, January 3). The power of vulnerability | Brené Brown [Video]. YouTube. https://www.youtube.com/watch?v=iCvmsMzlF7o

Vaughan, N. D. (2010). A blended community of inquiry approach: Linking student engagement and course redesign. Internet and Higher Education, 13, 60-65. www.doi.org/10.1016/j.iheduc.2009.10.007