Abstract

Learning Management Systems (LMS) provide a variety of tools and functions to support teaching and learning that include, but are not limited to, group chats, threaded discussions, document sharing, assignments, quizzes, grading, and course evaluations. Migrating to a new LMS can be a challenge for faculty; in fact, changes in technology have been noted as one of the top ten challenges in academia. Training is an integral aspect of faculty acceptance of new technologies, particularly a new LMS. This study compared faculty-reported comfort, ease of use, and usability with the old LMS (pre-test) and the new LMS (post-test) and the type and amount of training attended. In the pre-test, only the relationship between training and comfort level showed a positive correlation (t = 2.017, p = 0.046). In the post-test results, there was no correlation between training and comfort level, ease of use, or usability scores. In addition, no significance was found when controlling for years teaching or faculty rank. When comparing pre- and post-test responses, overall, faculty rated the new LMS more useful to their work (pre mean = 2.25, post mean = 2.45) and easier to use (pre mean = 1.82, post mean = 2.97). While there was limited significance in this study, it is imperative to offer a variety of support methods such as 24/7 help desk access, one-on-one in-person support, and quick reference guides.

Introduction

Faculty willingness to adopt new technologies varies depending on the perceived usefulness and ease of use the technology offers (Buchanan, Sainter, & Saunders, 2013). This study analyzed a Learning Management System (LMS) migration in which multiple methods of training and practice for faculty were employed. Because the university changed the LMS, faculty needed to use technology that may not have been familiar and that initially required more time to integrate into their teaching.

Background and Significance

Learning Management Systems (LMS) provide a variety of tools and functions to support teaching and learning that include, but are not limited to, group chats, threaded discussions, document sharing, assignments, quizzes, grading, and course evaluations. Since their inception, LMSs have evolved to support complex tasks, such as importing Shareable Content Object Reference Model (SCORM) objects (Freire, Arezes, Campos, Jacobs, & Soares, 2012). LMSs have become the foundation for teaching and learning at many universities; therefore, their adoption is important to student success (Rafi, Samsudin, & Hanafi, 2015).

Training

Migrating to a new LMS can be a challenge for faculty; in fact, changes in technology have been noted as one of the top ten challenges in academia (Ryan, Toye, Charron, & Park, 2012). Due to schedules that include research, teaching, service, and administrative duties, faculty often are unable to attend training prior to adopting a new LMS. In order to support faculty needs when migrating to a new LMS, Ryan and colleagues (2012) suggested offering multiple formats of training to accommodate faculty schedules (e.g., online, face-to-face, asynchronous). Boggs and Van Baalen-Wood (2018) recommended creating a two-pronged migration strategy: (1) frame initial training around content migration, and (2) offer migration services to those who need them. They also advised creating multiple formats and levels of training, offering custom workshops, and providing opportunities for one-on-one personalized sessions.

Training is an integral aspect of faculty acceptance of new technologies, particularly a new LMS. Fathema and Sutton (2013) found that faculty want extensive training on the LMS. Moreover, Dahlstrom, Brooks, and Bichsel (2014) conducted a national survey and found that 57% of faculty want extensive training in the LMS in order to be more effective in their teaching. Faculty training should be based on instructional design principles and focused on faculty needs (Tam, 2000). Therefore, a needs analysis should be conducted to determine the instructional gaps or specific needs of the faculty. Once the needs have been identified, specific training objectives should be developed to ensure the focus of the training. Gautreau (2011) found that training should incorporate active learning techniques, that faculty should have input on training dates, duration, and content, that supplemental materials should be available after training, and that faculty should have opportunities for advanced training.

Technology Acceptance

In addition to training, technology acceptance is based on the end user’s perception of how easy a technology is to use and how useful it is to their work. The Technology Acceptance Model (TAM) has provided the theoretical basis for empirical studies of user technology acceptance and predictions of end-user acceptance of e-learning systems (Arbaugh, 2002; Davis et al., 1989; Fathema & Sutton, 2013; Lee, Hsieh, & Hsu, 2011; Maican, Cazan, Lixandroiu, & Dovleac, 2019; Rucker & Frass, 2017; Scherer, Siddiq, & Tondeur, 2019; Wu, Tsai, Chen, & Wu, 2006). TAM factor analysis provides support for the content and construct validity of the usefulness (10 items) and ease of use (10 items) subscales and has consistently demonstrated acceptable results (King & He, 2006).

TAM was designed to apply to any specific domain of human-computer interaction (Davis et al., 1989). TAM postulates that the two main constructs, perceived usefulness and perceived ease of use, determine technology acceptance and are key antecedents of behavioral intentions to use information technology. Perceived usefulness is the “degree to which an individual believes that a particular system would enhance job performance” (Davis, 1989, p. 320). Perceived ease of use is the “degree to which an individual believes that using a particular system would be free of effort” (Davis, 1989, p. 320). Perceived usefulness, or “usability,” varies depending on the area in which it is being studied. Technology integration does not depend on availability alone, but rather on how the technology is embraced and utilized by the end user (Fathema, Shannon, & Ross, 2015).

Fathema and colleagues (2015) conducted a study on LMS adoption and found that system quality (e.g., usability, availability, reliability, adaptability, and response time) had a significant positive effect on the perceived ease of use and perceived usefulness of an LMS. Their results also indicated that facilitating conditions such as training and professional development had a positive effect on attitudes toward using an LMS. Panda and Mishra (2007) and Pajo and Wallace (2001) also found that a lack of training can significantly affect faculty adoption rates for e-learning systems. Training can be a significant factor in TAM, specifically for the perceived usefulness construct, as trainers can make specific connections between the LMS and faculty needs.

Training Process for Migration

Before the migration, multiple training opportunities were designed and developed based on recommendations from Ryan and colleagues (2012) and Gautreau (2011). A needs analysis was not conducted prior to the migration; because it was a new LMS, the training team determined that the trainings would be based on migrating and organizing course materials. Faculty had the opportunity to attend training through a variety of modalities, including one-on-one appointments, campus-wide groups, and/or customized face-to-face departmental training. Within each modality (e.g., one-on-one or group training), faculty could receive training either face-to-face or via web conferencing. For example, if a faculty member requested a one-on-one appointment, they could also request to meet online. This was designed to meet the needs of all faculty, since some faculty teach from a distance. All training sessions were offered frequently throughout the migration.

During the first semester of the migration, the campus-wide training was designed to cover the basics, from migrating content out of the old LMS to organizing course materials in the new one. Group training was designed in two modalities: blended synchronous and online asynchronous. The blended synchronous format allowed faculty to attend either face-to-face in a computer lab or via web conferencing (e.g., Zoom, Webex). This allowed faculty who were at a distance to receive live training and immediate feedback. The LMS trainers were instructed on how to deliver content in this format to ensure that both the online and face-to-face groups received equal attention and all questions were addressed. The asynchronous online group training was designed as a two-week course. Faculty self-enrolled in the course, which had facilitators, assignments, and due dates. Faculty remained in the course after the end date so they could reference the materials as they migrated their own courses. Although there was a high attrition rate for the asynchronous course, faculty noted in training surveys that they appreciated the various opportunities.

Since faculty have course-specific needs, one-on-one appointments were also offered. The one-on-one appointments were led by a team of graduate assistants. Using an interactive calendar, faculty could sign up for a one-hour session. Sessions were offered on multiple days of the week, throughout the business day, in order to reach a majority of the faculty.

Training and course preparation were also addressed individually within each fully online program. Faculty and administrative representatives were involved in the decision-making process to determine how and when each program would be migrated, as well as the type and frequency of the training needed. For example, the Educational Leadership Program agreed collectively to migrate all of its courses during the Fall 2019 semester and requested training sessions at the end of the summer and the beginning of the fall. This helped give students a seamless transition by ensuring that each program offered a student tutorial and that all courses within a fully online program migrated simultaneously. This way, students only had to log into one LMS to access all of their courses. Program leaders, faculty, and staff determined the migration timeline and the training schedule offered by the University’s eLearning department. Training was offered in the format preferred by the faculty in each program.

During the second semester of the migration, the team added group/campus-wide training sessions based on information gathered from faculty feedback. A survey was deployed at the end of the first semester of the migration. Based on faculty needs, these additional sessions focused on specific aspects of the new LMS. For example, there were sessions on grading and the gradebook, student engagement (discussion boards and assignments), and organization and management of LMS content. All training sessions continued to be offered frequently throughout the semester in both blended synchronous and asynchronous formats.

Methods

The purpose of this study was to compare ease of use and usability between the old LMS and the new LMS based on the type of training modality the faculty attended. At the time of this study, approximately 50% of the campus had migrated to the new LMS.

Participants

An electronic survey was sent to all faculty who were identified by the Registrar’s office as teaching Fall 2018 semester courses, both online and face-to-face. The survey was sent to the same faculty in Spring 2019 as a post-test survey. Participation in the survey was voluntary, and participants who were already teaching on the new LMS in the Fall semester were excluded from the pre-test data.

Design

A pre-test/post-test design was used in this study. The pre-test survey was sent to faculty prior to the deadline for the LMS transition. The post-test was sent after most faculty had started the transition to the new LMS.

Procedure

Participants were emailed a link to the voluntary, anonymous electronic pre/post survey administered using the Qualtrics survey tool. The email included a description of the purpose of the survey, the methods used to maintain confidentiality, and the participants’ role in the study. The pre-survey link was open for two weeks in Fall 2018 to gather feedback on the current LMS prior to the new LMS training events. The post-test survey was sent in Spring 2019 and asked for feedback on the new LMS. During both pre- and post-survey periods, participants received an initial email, a reminder 13 days later, and a second reminder immediately prior to the end of the data collection period.

Survey questions included a fill-in item for years teaching and radio-button choices for age, title, rank, type and number of courses, comfort level, and type of training completed on the new LMS. Likert-scale questions were used for the ease of use and usability variables based on Davis’ Technology Acceptance Model (Davis, 1989). A final open-ended question allowed participants to describe advantages and/or challenges with the LMS transition. The data were collected anonymously, with limited identifying characteristics such as years teaching, faculty rank, and experience with either LMS. Survey results were compared only in aggregate. Data were maintained on a university-owned, password-protected PC. IRB approval was received from the institution of record.
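To illustrate how the Likert items map onto the two TAM variables, a minimal scoring sketch in Python follows; the item names, file name, and use of Python are hypothetical illustrations only, as the study's data were analyzed in SPSS.

```python
import pandas as pd

# Hypothetical item names for Davis' two 10-item TAM subscales;
# the actual instrument and data files differed.
EASE_ITEMS = [f"ease_{i}" for i in range(1, 11)]
USE_ITEMS = [f"use_{i}" for i in range(1, 11)]

responses = pd.read_csv("post_survey.csv")  # one row per respondent, items coded numerically

# Each subscale score is taken here as the mean of that respondent's item ratings.
responses["ease_of_use"] = responses[EASE_ITEMS].mean(axis=1)
responses["usability"] = responses[USE_ITEMS].mean(axis=1)
```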

Responses were uploaded into IBM SPSS v25 for analysis. Open-ended questions on perceived advantages and challenges were analyzed for themes. Aggregate findings were then shared with the LMS migration team.

Results

A total of 690 full-time faculty and 305 part-time faculty received the pre- and post-test surveys. There was a 28% response rate for the initial pre-test survey (Fall 2018) and a 22% response rate for the post-test survey (Spring 2019). Eleven percent (N = 109) of the participants completed both the pre- and post-test surveys. Participants who were already teaching on the new LMS in the Fall semester were excluded from the pre-test data. Of those who completed the surveys, 48.5% (pre-test) and 53.5% (post-test) were tenure-track faculty (full, associate, and assistant professors), and 51.5% (pre-test) and 46.5% (post-test) were lecturers. Faculty averaged 14 years of teaching experience. Since answers were not required for each question, not all faculty answered every question. (See Tables 1 and 2 for demographics.)



Quantitative Results

The TAM consistently shows good internal consistency. For this study the Cronbach’s alpha scores were also very high (pre ease of use α = .94, pre usability α = .93, post ease of use α = .98, and post usability α = .97), which supports the reliability of the findings.
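For reference, Cronbach’s alpha for a k-item subscale is α = (k / (k − 1)) × (1 − Σσ²_item / σ²_total), where item and total-score variances are taken across respondents. A minimal computation sketch in Python, assuming the same hypothetical data layout as above, is:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) matrix of Likert scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item across respondents
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of respondents' summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# e.g., cronbach_alpha(responses[EASE_ITEMS].dropna().to_numpy())
```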

Usability and ease of use were each compared with faculty-reported comfort level with the LMS pre- and post-test. As expected with a new technology, there was a significant decrease in comfort with the new LMS in the post-test group (pre mean = 2.53, post mean = 1.89). Higher comfort levels were associated with higher ease of use and usability scores; in other words, faculty were more comfortable with the new LMS when they perceived it to be easy to use and useful to their work. While there was no significant difference in ease of use between the pre- and post-test groups, there was a small but significant difference in usability scores (pre usability = 1.2, post usability = 1.4). Participants rated usability higher in the new LMS than they had in the previous LMS.

The amount and form of training that faculty completed was also compared with comfort level, ease of use, and usability scores. In the pre-test, only the relationship between training and comfort level showed a positive correlation (t = 2.017, p = 0.046). In the post-test results, there was no correlation between training and comfort level, ease of use, or usability scores. In addition, no significance was found when controlling for years teaching or faculty rank. When comparing pre- and post-test responses, overall, faculty rated the new LMS more useful to their work (pre mean = 2.25, post mean = 2.45) and easier to use (pre mean = 1.82, post mean = 2.97). See Table 3.
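The comparisons above were computed in SPSS. Purely as an illustration of the kind of tests reported, a sketch of equivalent comparisons in Python is shown below; the file and column names are hypothetical, and independent samples are assumed since the surveys were anonymous.

```python
import pandas as pd
from scipy import stats

pre = pd.read_csv("pre_survey.csv")    # hypothetical files; columns assumed:
post = pd.read_csv("post_survey.csv")  # comfort, ease_of_use, usability, trainings_attended

# Pre-test relationship between attending training and comfort level
trained = pre.loc[pre["trainings_attended"] > 0, "comfort"].dropna()
untrained = pre.loc[pre["trainings_attended"] == 0, "comfort"].dropna()
t_stat, p_val = stats.ttest_ind(trained, untrained, equal_var=False)

# Pre/post comparison of ease of use and usability between the old and new LMS
for scale in ("ease_of_use", "usability"):
    t, p = stats.ttest_ind(pre[scale].dropna(), post[scale].dropna(), equal_var=False)
    print(f"{scale}: t = {t:.3f}, p = {p:.3f}")
```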



Qualitative Results

In the open-ended responses, thirty-four comments referenced ease of use, and seven of those comments mentioned both ease of use and training. Overall, there were four comments that indicated perceived usefulness. Among faculty who attended training and wrote additional comments, there were eleven separate references to the new LMS not being easy to use and eight comments that mentioned the new LMS being easy to use and intuitive. There were six comments that alluded to the ease of use of the new LMS by faculty who did not attend any training. Many of the comments from faculty who had participated in training and still found the new LMS difficult revolved around workflow. Faculty wrote statements about the specific parts of the new LMS that they found difficult, including: “I think it is more difficult to upload course material into the [new LMS] but perhaps I am not doing it the best way.” “Posting grades on the [new LMS] is not intuitive.” “Calculating grades is not intuitive nor is the [new LMS] flexible in this regard.” “The [new LMS] menu can’t be edited – e.g., we MUST use the term ‘module’, every exam is a ‘quiz’, etc.” “Using the [new LMS] for exams is terrible – editing the exams in the [new LMS] is darn near impossible, because it requires clicking/editing/saving one question at a time.”

For those who participated in training and found the new LMS easy to use, seven of the eight qualitative statements referenced the new LMS being “user-friendly” or more “intuitive.” For example, one faculty member wrote, “the [new LMS] is by far more user-friendly and manageable than the old LMS.” Another faculty member stated, “I have found the [new LMS] to be very user friendly and every time I use it I learn something new.”

Discussion

As with most technology, acceptance is based on the end user’s experience and their determination of usefulness. Faculty were offered training on the new LMS through a variety of modalities. While there was no statistically significant difference when comparing type and amount of training with comfort level, ease of use, or usability scores, faculty comments suggested that training, regardless of modality, made the transition easier. Other available resources may have influenced these results. For example, the university purchased 24/7 access to the new LMS help desk, which could be queried via chat, email, or phone. Anecdotally, faculty were appreciative of the multiple modalities of training and reported being able to access as much training as they needed. As with all group training, the objectives may not meet the individual needs of faculty; therefore, it was imperative to incorporate one-on-one sessions that allowed individualized training.

Although the university included faculty in the selection of the new LMS, not all faculty were in favor of the change. However, the old LMS was discontinued at the end of the Spring 2019 semester, and all courses migrated to the new LMS. Training continues to be offered to faculty in multiple modalities as needed.

With technology becoming more ubiquitous, it was not surprising to find that age, years teaching, and faculty rank did not impact comfort level, ease of use, or usability scores. Further studies could assess personality traits, time spent in course development and delivery through the current LMS, and how much innovation in course delivery exists. Re-assessing comfort level, ease of use, and usefulness after all faculty have had time to transition their courses could also give insight into faculty needs around technology. In addition, assessing student comfort with the new LMS and comparing it to faculty scores would help to address any limitations in the training offered.

Limitations

Faculty were surveyed during the transition year. Some faculty moved to the new LMS during the Fall semester and may have had a more positive outlook due to the amount of time they had been using it. Those who switched in the Spring semester had been using the new LMS for only two months when the survey was deployed. The basic training sessions offered to faculty were designed and developed by graduate assistants under the guidance of a full-time eLearning department staff member. All training was also led by graduate assistants, who would not have the same knowledge, skills, and experience working with faculty needed to adjust the training to the audience; therefore, training may not have been as effective as intended. Also, no training was required prior to teaching in the new LMS, which may have led to frustration when using the new LMS.

Closing

As with any change, there are multiple factors to consider. A communication plan needs to be designed prior to an LMS migration to determine how to reach the intended audience (e.g., students, faculty, part-time lecturers, staff). Training on the new technology should be designed and developed prior to announcing the migration, and training on advanced or new features should be added as end users become more comfortable. Finally, support is essential during and after any technology change. It is imperative to offer a variety of support methods such as 24/7 help desk access, one-on-one in-person support, and quick reference guides. By implementing a thorough communication, training, and support plan, users will make an easier transition and may more quickly accept the new technology and processes.

References

Arbaugh, J. B. (2002). Managing the on-line classroom: a study of technological and behavioral characteristics of web-based MBA courses. Journal of High Technology Management Research, 13, 203-223.

Boggs, C., & Van Baalen-Wood, M. (2018). Diffusing Change: Implementing a University-Wide Learning Management System Transition at a Public University. In Leading and Managing e-Learning (pp. 115-128). Springer, Cham.

Buchanan, T., Sainter, P., & Saunders, G. (2013). Factors affecting faculty use of learning technologies: Implications for models of technology adoption. Journal of Computing in Higher Education, 25(1), 1–11. doi:10.1007/s12528-013-9066-6

Dahlstrom, E., Brooks, D. C. & Bichsel, J. (2014). The current ecosystem of learning management systems in education: Student, faculty, and IT perspectives. Research report. Louisville, CO: ECAR. http://educause.edu/ecar.

Davis, F. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.

Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982–1003.

Fathema, N., Shannon, D., & Ross, M. (2015). Expanding The Technology Acceptance Model (TAM) to Examine Faculty Use of Learning Management Systems (LMSs) In Higher Education Institutions. Journal of Online Learning & Teaching, 11(2).

Fathema, N., & Sutton, K. (2013). Factors influencing faculty members’ Learning Management Systems adoption behavior: An analysis using the Technology Acceptance Model. International Journal of Trends in Economics Management & Technology, 2(6), 20-28.

Freire, L., Arezes, P., Campos, J., Jacobs, K., & Soares, M. M. (2012). A literature review about usability evaluation methods for e-learning platforms. Work, 41, 1038–1044. doi:10.3233/WOR-2012-0281-1038

Gautreau, C. (2011). Motivational factors affecting the integration of a learning management system by faculty. Journal of Online Educators, 8(1).

King, W., & He, J. (2006). A meta-analysis of the technology acceptance model. Information & Management, 43, 740–755.

Jaschik, S., & Lederman, D. (2014). The 2014 Inside Higher Ed Survey of Faculty Attitudes on Technology: A Study by Gallup and Inside Higher Ed. Washington, DC: Inside Higher Ed. https://www.insidehighered.com/news/survey/online-ed-skepticism-and-self-sufficiency-survey-faculty-views-technology

Lee, Y.-H., Hsieh, Y.-C., & Hsu, C.-N. (2011). Adding Innovation Diffusion Theory to the Technology Acceptance Model: Supporting employees' intentions to use e-learning systems. Journal of Educational Technology & Society, 14(4), 124–137.

Maican, C., Cazan, A., Lixandroiu, R., & Dovleac, L. (2019). A study on academic staff personality and technology acceptance: The case of communication and collaboration applications. Computers & Education, 128, 113–131.

Rafi, A., Samsudin, K., & Hanafi, H. F. (2015). Differences in Perceived Benefit, Use, and Learner Satisfaction between Open Source LMS and Proprietary LMS. In E-Learning-Instructional Design, Organizational Strategy and Management. IntechOpen.

Rucker, R. & Frass, L. (2017). Migrating learning management systems in higher education: Faculty members’ perceptions of system usage and training when transitioning from Blackboard Vista to Desire2Learn. Journal of Educational Technology Systems, 46(2), 259-277.

Ryan, T., Toye, M., Charron, K., & Park, G. (2012). Learning management system migration: An analysis of stakeholder perspectives. International Review of Research in Open & Distance Learning, 13(1), 220–237.

Pajo, K. & Wallace, C. (2001). Barriers to the Uptake of Web-based Technology by University Teachers. The Journal of Distance Education, 16(1), 70-84.

Panda, S., & Mishra, S. (2007). E-Learning in a Mega Open University: Faculty attitude, barriers and motivators. Educational Media International, 44(4), 323-338. doi: 10.1080/09523980701680854.

Scherera, R., Siddiqb, F., & Tondeurc, J. (2019). The technology acceptance model (TAM): A meta-analytic structural equation modeling approach to explaining teachers’ adoption of digital technology in education. Computers & Education, 128, 13–35.

Wu, J. P., Tsai, R. J., Chen, C. C., & Wu, Y. C. (2006). An integrative model to predict the continuance use of electronic learning systems: Hints for teaching. International Journal on E-Learning, 5(2), 287-302.