Abstract

This paper shares the results of a longitudinal, descriptive study of 675 online students’ expectations and experiences across five years (2016-2021) at an institution in the southern U.S. that is highly ranked in online support services and programmatic offerings. Data included demographic information, open-ended responses, cumulative GPA, and final academic status. Results reinforced some findings in the extant literature and deviated from others, with an important distinction being the differentiation between students in blended and fully online programs. Findings revealed a mismatch between expectations and actual experiences for both fully online and blended students. Implications for retention are discussed.

Introduction and Background

Distance education leaders believe most students will take both online and face-to-face (FTF) courses as part of their future college experience (Garrett et al., 2022), but a high attrition rate for online courses complicates the situation (Bawa, 2016). Research that identifies factors associated with retention is often contradictory. For instance, some researchers believe there is no ‘traditional’ online learner, and that learner demographics vary based on contextual characteristics such as institution type, location, cost, and academic major (Aslanian & Clinefelter, 2012; Ortagus, 2017; Xu & Jaggars, 2013; Yukselturk & Bulut, 2007). Other researchers identify common characteristics of online learners such as being White, female, and older; having graduate classification; possessing previous experience with online courses; and living within 100 miles of the physical campus (Boghikian-Whitby & Mortagy, 2008; Garrett et al., 2022; Magda & Aslanian, 2018; Magda & Smalec, 2020; Tichavsky et al., 2015).

Contradictions continue when considering the quality of online student learning compared to FTF courses. Some researchers conclude the different formats are equal (Layne et al., 2013; Wells et al., 2022) while others find online courses inferior (Ganesh et al., 2015). Student characteristics may be one reason for the differing views of learning outcomes. For instance, females have been found to have better academic performance than males in online courses (Layne et al., 2013). In addition, higher self-efficacy, an internal locus of control, clear personal goals, and higher self-regulation behaviors are all identified as beneficial to online learning outcomes (Bell, 2007; Cho & Shen, 2013; Shen et al., 2013; Yukselturk & Bulut, 2007). Other student characteristics are negatively associated with online learning outcomes such as having apprehensions about online learning, which is identified as more common in students of color (Ashong & Commander, 2012; Okwumabua et al., 2011). Students from disadvantaged backgrounds can also experience achievement gaps like those observed for FTF courses (Newell, 2007; Xu & Jaggars, 2013). Some academic practices may mitigate negative influences on learning outcomes such as providing online students with an orientation prior to starting courses (Stoebe, 2020) and increasing engagement opportunities with peers and instructors (Heyman, 2010; Martin & Bolliger, 2018; Platt et al., 2014; Tichavsky et al., 2015).

While existing literature guides understanding of online learners, studies are often either based upon data collected at one point in time or lack student perspectives to provide context for online performance. Within the online education literature, only six studies were identified that tracked student data longitudinally, and they varied in approach. For instance, five studies focused exclusively on quantitative measures such as demographic and academic data (Boghikian-Whitby & Mortagy, 2008; Boston et al., 2012; Layne et al., 2013) or self-report surveys using Likert-scale questions (Fish & Snodgrass, 2021; Pate & Miller, 2012). Layne et al. (2013) criticized online learning studies that considered only demographic data without also including behavioral and experiential factors, yet their own study expanded data collection only to measures such as academic program, credits attempted, and credits earned.

The final longitudinal study offered a mixed methods perspective on a multi-country, online language-learning program (Charbonneau-Gowdy, 2018). When fewer than half of the students passed the program, instructors observed that students focused on asynchronous activities and failed to engage in synchronous video sessions. Students revealed their lack of engagement was due to technology issues, discomfort with live interaction, and a preference for independent work. The addition of qualitative measures provided increased understanding of student performance in the program.

To better understand online learning experiences, a full range of data (demographic, academic, self-report experiences) must be evaluated over time. This descriptive, mixed methods study emphasized student demographics, academic performance, and student self-report data at one U.S. institution over a five-year period (2016 to 2021). Participants included undergraduate and graduate students from a variety of academic majors and diverse backgrounds, and all were enrolled in either fully online academic programs or blended programs containing both FTF and online courses. While the longitudinal project encompassed a wide variety of topics, only four of the research questions are addressed in this study:

  1. What are student perceptions about distance education prior to taking online courses?
  2. How are students oriented to online learning?
  3. How do student expectations about online courses compare to their actual experiences?
  4. What are the academic performance differences between students who take online courses and those who do not?

Methods

The institution of study is a four-year regional university in the southern United States which has a nationally recognized online support office and receives top rankings for its online programs. When data collection began, enrollment at the institution was over 21,000 students with 62% of all students self-identifying as first-generation and approximately half coming from historically underrepresented populations. The institution offered more than 40 fully online programs at both the undergraduate and graduate levels. Around 20% of all students enrolled in fully online programs and 60% took at least one online course during one of the academic terms in the study.

Data Collection

After receiving IRB approval, a locally-created, anonymous, web-based survey was distributed to all first-time students who registered for at least one online course at the institution for fall 2016 (Cohort 1) or fall 2017 (Cohort 2). Approximately 1,800 students received the initial survey in each of the fall terms. The survey, sent prior to the start of the fall term, asked students about their experiences with, beliefs about, and expectations for online learning. Students could opt into the longitudinal project at the end of the survey. In total, 675 students volunteered for the project and became the sample for this study. The students permitted collection of demographic information and academic data (cumulative grade point average, major, academic status). These data were obtained for all 675 participants each fall and spring term throughout the project and served as the basis of the longitudinal research.

In addition, locally-created web-based surveys were sent to all participants at the conclusion of each fall and spring term to obtain self-report data regarding online experiences and perceptions. These responses provided context for the longitudinal data. For instance, in the initial survey, participants indicated their beliefs about online learning, compared FTF and online courses across several statements, indicated what institutional services and activities they expected to participate in, and shared expectations for synchronous classroom activities. Subsequent surveys asked what participants had experienced (i.e., “Which of the following activities have you done this term?” “Were synchronous (live, real-time) activities experienced or planned with instructors and/or peers in your online course(s) this semester?”). The surveys were voluntary, so response rates varied each term and each survey was approved by the institution’s IRB prior to administration.

While the research plan included distributing the surveys for five years, the COVID-19 pandemic changed the nature of online education with only one year left in the project. A decision was made to end the surveys after the spring 2020 term to preserve participants’ pre-pandemic online experiences. For that reason, survey data only addresses the first five academic terms experienced by both participant cohorts. However, collection of academic and demographic data continued throughout the planned five-year period.

Participant Demographics

A total of 675 students agreed to participate in the longitudinal project. Table 1 illustrates the overall demographics along with cohort information. Race and ethnicity were reported using the same categories as the Integrated Postsecondary Education Data System (IPEDS). Cohort 1 included more full-time undergraduate students and larger proportions of first-generation and low-income students, while Cohort 2 included older students and more part-time and graduate students.


Table 1: Overall and Cohort Demographics

Demographics                          Overall (n=675)   Cohort 1 (n=466)   Cohort 2 (n=209)
Sex
    Female                            78%               80%                72%
    Male                              22%               20%                28%
Race/Ethnicity
    African American                  14%               15%                10%
    Native American                   <1%               <1%                <1%
    Asian                             3%                3%                 2%
    Hispanic                          23%               23%                22%
    White                             55%               53%                61%
    Multi-Racial                      3%                3%                 1%
    International                     2%                2%                 2%
    Unknown                           1%                1%                 1%
Classification
    Undergraduate                     55%               68%                26%
    Graduate                          45%               31%                74%
Student Status
    Full-time student                 54%               61%                36%
    Part-time student                 44%               36%                62%
    Not available                     2%                3%                 2%
First-Generation Status               44%               55%                18%
Low-Income Status                     33%               40%                17%
Age (mean)                            27.9 (±10.6)      26.2 (±9.6)        31.5 (±11.6)
Age range                             17 to 63          17 to 63           17 to 60
Online Status
    Fully Online                      57%               51%                57%
    Blended                           43%               49%                43%

As shared in Table 2, undergraduate participants were almost evenly split among six of the seven academic colleges. Compared to institutional enrollment, education and humanities/social sciences were overrepresented in the sample by 11% and 9%, respectively, with a corresponding underrepresentation for science and engineering technology. For graduate participants, over half were enrolled in the college of education, which was 10% higher than the institutional enrollment. There was a corresponding underrepresentation in both fine arts and humanities and social sciences.

Table 2: Participants by Academic College

Academic College                        Undergraduate (n=366)   Graduate (n=301)
Business                                15%                     11%
Criminal Justice                        15%                     13%
Education                               16%                     55%
Fine Arts                               4%                      <1%
Health Sciences                         18%                     3%
Humanities and Social Sciences          18%                     10%
Science and Engineering Technology      14%                     8%

Data Analysis

For research questions 1 and 3, qualitative data were analyzed by two of the researchers through inductive coding (Creswell & Poth, 2018). Using a constant comparison process, the first researcher performed the initial review of each response, identified significant statements, and assigned a descriptive code to each statement. The descriptive codes were reviewed by the second researcher and then discussed to clarify interpretation. The researchers condensed the descriptive codes into categories and then into final themes. A final round of analysis examined themes across all participant characteristics to identify possible response patterns.

For research questions 1, 2, and 3, simple frequencies and descriptive statistics provided an overview of demographic, academic, and survey data. Chi-square tests were used to evaluate differences between categorical variables, with groups compared at α = .05. The Cramer’s V statistic was used to evaluate the magnitude of group differences (Lemercier & Zalc, 2019). Because of the size of the study population, statistically significant differences were noted only when groups varied by 10% or more; this strategy emphasized differences that may also have implications for practice. Comparisons resulting in no differences were noted in the findings to provide clarity.
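
As an illustration of this procedure, the sketch below runs a chi-square test and computes Cramer’s V in Python with scipy. The 2x2 counts are hypothetical placeholders patterned after the sex-by-experience comparison reported in the findings, not the study’s actual data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table (rows: female/male;
# columns: previous online experience yes/no) -- not the study's data
table = np.array([[310, 216],
                  [ 72,  77]])

# correction=False returns the uncorrected Pearson chi-square rather
# than applying the default Yates continuity correction for 2x2 tables
chi2, p, dof, expected = chi2_contingency(table, correction=False)

# Cramer's V scales chi-square by sample size and the smaller table dimension
n = table.sum()
v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))

print(f"chi2({dof}, n = {n}) = {chi2:.3f}, p = {p:.4f}, Cramer's V = {v:.2f}")
```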

For research question 4, a multiple regression analysis was conducted to examine the relationship between all participant characteristics (cohort, sex, race/ethnicity, first-generation status, low-income status, type of program [fully online or blended], and previous online experience) and final term GPA. Multiple regression is appropriate for evaluating interval-level outcomes with both categorical and interval-level predictors (Pedhazur, 1997). Because the GPAs of undergraduate and graduate students were not directly comparable, separate analyses were conducted for the two groups. All categorical variables were dichotomous and required no further coding schemes (i.e., variables were dummy coded).
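
A minimal sketch of such a model in Python with statsmodels appears below, under stated assumptions: the data file and column names are hypothetical stand-ins for the variables described above. The squared structure coefficients reported later follow the definition used in this study, the squared correlation between a predictor and the model’s predicted GPA (Ziglari, 2017).

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical file and column names -- stand-ins for the study's variables
df = pd.read_csv("undergrad_participants.csv")
predictors = ["cohort", "sex", "first_gen", "pell",
              "online", "prev_online", "white"]  # each dummy coded 0/1

# Ordinary least squares regression of final GPA on the predictors
X = sm.add_constant(df[predictors])
model = sm.OLS(df["final_gpa"], X).fit()
print(model.summary())  # B, SE, t, and p for each predictor

# Squared structure coefficients: the squared correlation between each
# predictor and the predicted GPA (y-hat)
r2s = df[predictors].corrwith(model.fittedvalues) ** 2
print(r2s.round(2))
```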

An a priori power analysis was conducted to determine the minimum required sample size (1 − β = .80) to detect statistical differences. The minimum required sample size given the number of predictors (k = 7) in the model was 103. Sample sizes in this study were double the minimum required to detect statistical differences and were determined to be appropriate for analysis. Both p values and effect sizes were considered in the interpretation of the overall model. Standardized regression weights (β) and squared structure coefficients (r²s) were used to evaluate the relative importance of predictors (Ziglari, 2017). Squared structure coefficients provided information about the proportion of the overall effect size that can be explained by each predictor alone.
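
The paper does not state the effect size assumed in the power analysis; a conventional medium effect (Cohen’s f² = .15) at α = .05 with k = 7 predictors reproduces the reported minimum of 103, so the sketch below assumes those values and searches for the smallest sample size using the noncentral F distribution.

```python
from scipy.stats import f, ncf

def min_sample_size(k=7, f2=0.15, alpha=0.05, target=0.80):
    """Smallest N at which the overall F test on k predictors
    reaches the target power, given Cohen's f-squared."""
    n = k + 2  # smallest N leaving at least one error df
    while True:
        df1, df2 = k, n - k - 1
        crit = f.ppf(1 - alpha, df1, df2)        # critical F under H0
        lam = f2 * n                             # noncentrality parameter
        power = 1 - ncf.cdf(crit, df1, df2, lam) # power under H1
        if power >= target:
            return n, power
        n += 1

print(min_sample_size())  # expected: (103, ~0.80)
```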

Findings

More than half of participants (56%) reported previous online course experience prior to enrolling at the institution, but demographic differences existed for sex, race/ethnicity, and program type. Participants more likely to report previous online experience included females (59% versus 48% for males) [χ²(1, n = 675) = 6.075, p = 0.014], White students (64% versus 47% for students of color) [χ²(1, n = 656) = 19.278, p < 0.001], and students in fully online programs (66% versus 47% for students in blended programs) [χ²(1, n = 671) = 23.885, p < 0.001]. Effect sizes were small for all three group differences (Cramer’s V = .10, .17, .19, respectively). No differences were found between groups based on classification level, first-generation status, or socioeconomic status.

RQ1: Perceptions of Online Learning

Prior to starting courses at the institution, participants indicated their beliefs about online courses compared to FTF courses. As Table 3 demonstrates, they perceived online courses as requiring more self-discipline and independence than FTF courses and as offering fewer opportunities for interactions with peers and instructors.

Table 3: Perceptions of online courses compared to FTF courses

Outcome                                                      More   Less   Same
Amount of self-discipline needed to complete courses         84%    2%     14%
Amount of independence needed to complete courses            81%    1%     18%
Student effort to complete assignments                       46%    9%     46%
Student effort to complete course readings                   44%    14%    42%
Level of stress for students                                 36%    25%    39%
Amount of time to complete academic work                     34%    10%    57%
Academic challenge of coursework                             33%    5%     62%
Amount of learning achieved by students                      19%    14%    68%
Instructor support for students                              19%    38%    43%
Instructor interactions with students                        15%    62%    24%
Amount of understanding about my chosen career field         14%    13%    73%
Opportunities for group work in class                        5%     70%    25%
Opportunities to make social connections with classmates     5%     82%    13%
Opportunities to become acquainted with classmates           4%     82%    13%

Compared to participants in blended programs, those in fully online programs believed online courses required more effort to complete readings (62% versus 38%) [χ²(2, n = 652) = 25.797, p < 0.01] and generated more stress (60% versus 40%) [χ²(2, n = 653) = 10.081, p < 0.01]. Both differences had small effect sizes (.20 and .12, respectively). In addition, 49% of participants with previous online experience believed online courses required more effort to complete readings, compared to 35% of participants with no previous experience [χ²(2, n = 656) = 13.313, p = 0.001]; this difference also had a small effect size (Cramer’s V = .14). No other demographic differences were detected for any statement, indicating consistency in beliefs prior to starting online courses at the institution.

In the qualitative comments, participants described what they were most apprehensive about regarding online courses, and 554 participants offered 855 unique ideas. No response patterns were observed for any participant characteristics. More than half of the comments (53%) addressed course-related issues that aligned well with Moore’s (1989) Transactional Distance Theory by focusing on instructor, peer, and content engagement. Most of the comments described apprehensions surrounding course content and how instructors managed the course through organization and direct communication. A smaller portion of participants wanted to meet fellow students and informally collaborate with them while avoiding formal group projects. For course content, some participants reported apprehension about not being able to understand information without the immediacy of in-person communication. Another 29% of comments discussed self-management behaviors (i.e., managing time, balancing multiple responsibilities, working independently), while 12% expressed apprehension related to participants’ lack of online experience or past negative experiences. The final 6% of comments came from participants who had no apprehensions about online learning, a sentiment primarily expressed by those with previous online experience.

Instructor presence was a dominant theme shared by many participants who anticipated limited or delayed interactions. For instance, one future online graduate student remarked, “I am a little afraid to bother the professor with too many questions,” which seemed to place boundaries around personal behavior based on anticipated transactional distance. Another student stated, “I am most worried about getting stuck on a new topic and not know[ing] what it is I don’t know. Without a professor, I feel as though it would be a difficult situation to resolve” (blended undergraduate). The comment implied online courses lacked active instructors.

When asked what they were anticipating about online learning, 562 participants offered 854 unique ideas. The only observed pattern regarding participant characteristics was for online experience. Around two-thirds (64%) of the comments were offered by participants who had taken online courses prior to enrolling at the institution, and most of their comments highlighted being able to work from anywhere and conserving time within busy schedules. The remaining comments mostly addressed personal benefits such as learning the course topic and increasing self-management, but a few participants noted that they were forced to enroll in online courses because it was the only delivery format offered and they were not looking forward to the experience.

RQ2: Orientation Experiences

On their first fall term survey, participants indicated what orientation activities they experienced from a provided list. The list included common activities offered to new online students at the institution. Table 4 reveals more participation in asynchronous options. Almost all participants (95%) experienced at least one type of orientation activity in their first term.

Table 4: Orientation Activities Experienced by Participants (n=148)

Activity                                                                   Percentage
Written Communication from Academic Program (handout, document, email)     46%
Get Acquainted Assignments the First Week in Class                         38%
Self-Paced Video Orientation from Faculty or Program                       35%
Institution Self-Paced Online Orientation                                  28%
Live Orientation with Faculty or Program                                   16%
Face-to-Face Orientation on Campus                                         15%

Students in fully online programs were more likely to participate in the institution’s self-paced orientation (35% vs. 18% for blended programs) [χ²(1, n = 148) = 4.906, p = 0.027], self-paced videos from their faculty or program (42% vs. 21%) [χ²(1, n = 148) = 6.773, p = 0.009], and “get acquainted” assignments in class (45% vs. 27%) [χ²(1, n = 148) = 4.678, p = 0.031]. Students in blended programs were more likely to attend the FTF orientation (27% vs. 8% for fully online programs) [χ²(1, n = 148) = 10.116, p = 0.001]. All effect sizes were small (Cramer’s V = .18, .21, .18, .26, respectively). Similar patterns existed for classification, with undergraduates more likely to attend FTF orientation and graduate students more likely to engage in self-paced activities. No cumulative GPA differences were observed for the different orientation activities.

RQ3: Expectations Versus Experiences

Before they began courses, participants were asked about expected engagement in specific activities, including synchronous activities in online courses, visiting the physical campus, and using specific online resources. In subsequent surveys, participants indicated what they experienced for these activities. One additional question asked participants what online resources they may have used during the academic term. Each of the sections below provides an overview of their responses.

Synchronous Activities

Prior to starting courses, 69% of participants expected at least one synchronous session in their online courses. Participants with no online experience expected synchronous sessions more often than those with online experience (79% versus 62%) [χ²(1, n = 666) = 20.237, p < 0.001], but the effect size was small (Cramer’s V = .17). No other demographic differences were detected for expectations. When actual experiences were analyzed, 25% to 44% of participants reported having synchronous sessions in their online courses across the five terms. Graduate students reported more synchronous activities (67% to 94%) than any other demographic group, but their qualitative comments indicated many did not want the live activities. The opposite pattern was identified for undergraduates, who wanted synchronous activities but did not receive them. No other demographic differences were observed for experiences.

Campus Activities and Services

Prior to starting courses at the institution, participants indicated expectations for engaging with different campus services and activities, then reported actual experiences on subsequent surveys. Table 5 illustrates that expectations often did not align with experiences. The only activity consistently matching expectations was interacting with peers outside of class.

Table 5: Expectations of and experiences with campus activities and services

Online Activity                                      Expected   Experienced   Experienced   Experienced   Experienced   Experienced
                                                     (n=675)    Fall 1        Spring 1      Fall 2        Spring 2      Fall 3
                                                                (n=143)       (n=181)       (n=56)        (n=127)       (n=41)
Academic Advising                                    72%        27%           28%           30%           24%           17%
Interacting with faculty outside of class            49%        50%           36%           61%           37%           76%
Interacting with other students outside of class     45%        59%           46%           64%           50%           61%
Join a student organization                          40%        11%           16%           13%           12%           17%
Research projects with faculty (not associated
  with course activities)                            39%        13%           20%           23%           19%           24%
Attend campus events                                 35%        17%           19%           7%            17%           17%

Participants also reviewed a list of other online resources and indicated any they had used during the term. They primarily accessed self-service resources such as those offered by the campus library (59% to 65% across five terms) or involving personal data such as degree plan information (52% to 76%). Only 15% of participants sought assistance from the campus writing center or tutoring services for the first fall term, then usage dropped for subsequent terms (2% to 9%). Other resources had consistently low usage such as career advising (6% to 7%), service-learning (2% to 6%), and mentoring programs (0% to 2%). Participants were more likely to access online services during their first fall (27%) and spring (27%) terms compared to subsequent terms (16% to 18%).

Geographic Distance and Campus Visits

Based on permanent addresses listed with the institution, 48% of participants lived within 50 miles of the campus. The proportion increased to 61% when the radius expanded to 100 miles. Participants living within the 100-mile radius were more likely to be in blended programs (70% versus 54% for fully online) [χ²(1, n = 670) = 19.817, p < 0.001] and to have first-generation status (68% versus 57% for non-first-generation) [χ²(1, n = 674) = 8.449, p = 0.004]. Both differences had small effect sizes (Cramer’s V = .17 and .11, respectively). No other demographic differences were observed.

Across five academic terms, 28% to 39% of responding participants indicated they visited the physical campus. Undergraduates visited the campus in greater proportions (52% to 68% across five terms) than graduate students (16% to 26%), with similar patterns emerging for students in blended programs (56% to 69%) compared to students in fully online programs (14% to 25%). More than half of participants who lived within 100 miles (61%) reported visiting the campus at least once. In addition, participants with no previous online experience visited the physical campus in their first fall term more than those with online experience (41% versus 22%) [χ²(1, n = 143) = 5.865, p = 0.015]. However, the difference had a small effect size (Cramer’s V = .20) and did not appear in any other term.

Participants who visited campus were asked what they did during the visit. They described engaging in campus activities, visiting campus locations like the bookstore or library, or meeting with faculty or staff. Participants who did not visit the physical campus explained they wanted to avoid traffic and other travel issues, or that they had no desire to visit. It should be noted that some students reported geographic distance as a conflict in online courses. For example, one graduate participant in a blended program described group projects being assigned in online courses and having other members arrange in-person meetings:

In my experience, it’s often other students who are able to meet on campus (I live two hours away), and I get left out. Because of this, my grades in such classes suffer because these group projects are often a large portion of the overall grade for the class.

The comment illustrated a possible tension between blended and fully online students that could have implications for instruction and sense of belonging.

RQ4: Academic Status and GPA

By the end of data collection, 64% of participants had completed their degrees, which was similar to the overall graduation rates reported by the institution. Another 20% of participants dropped out of the institution for unknown reasons, and 6% were still enrolled in courses. The final 10% experienced academic probation, suspension, or termination during the project and did not return to the institution. Participants with previous online experience were more likely to graduate than those without (70% versus 56%) [χ²(3, n = 675) = 15.644, p = 0.001], but the difference had a small effect size (Cramer’s V = .15). No other demographic differences were identified.

For the regression analysis, participants were compared separately by classification level (undergraduate and graduate). Prior to the analysis, international students, as well as participants with no recorded GPA, an unknown race/ethnicity, or an unknown classification, were removed. Table 6 offers a summary of demographic information by classification level.

Table 6: Demographic Characteristics of Students (N = 623)

Variables                        Undergraduate (N = 344)    Graduate (N = 279)
                                 N        %                 N        %
Cohort
    Fall 2016                    296      86.0              133      47.7
    Fall 2017                    48       14.0              146      52.3
Men                              63       18.3              73       26.2
Women                            281      81.7              206      73.8
First Generation                 199      57.8              80       28.7
Pell Eligible                    172      50.0              39       14.0
Online                           124      36.0              212      76.0
Previous Online Experience       184      53.5              176      63.5
Race/Ethnicity
    Students of Color            167      48.5              105      37.6
    White                        177      51.5              174      62.4

                                 M        SD                M        SD
GPA                              3.03     0.76              3.73     0.41

Note: The reference group was fall 2017, men, non-first generation, non-Pell eligible, and no previous online experience. For race/ethnicity, students of color served as the comparison group to avoid a deficit perspective toward historically marginalized populations.

For final GPA, a regression model was conducted first for the sample of undergraduate students. The mean GPA for undergraduate students was 3.03 (SD = 0.76). Collectively, the combination of predictors in the model explained 3% of individual differences in the overall undergraduate GPA of participants (R² = .03). This model, however, was not statistically significant (F(7, 336) = 1.47, p = .18). A summary of all regression model weights is provided in Table 7.

Table 7: Regression Model Summary for Undergraduate Students

Predictor                      B        SE      β        t        p       r²s
(Constant)                     3.23     0.20             16.44    <.01
Cohort                         <0.01    0.13    <.01     <0.01    1.00    <.01
Sex                            -0.13    0.11    -.07     -1.21    .23     .10
First Generation               -0.07    0.09    -.05     -0.80    .42     .16
Pell-Eligible                  -0.15    0.09    -.10     -1.70    .09     .48
Online Status                  0.09     0.09    .06      0.96     .34     .11
Previous OL Experience         0.06     0.09    .04      0.68     .50     .09
White                          -0.10    0.09    -.06     -1.14    .26     .30

The mean overall GPA for graduate students was 3.73 (SD = 0.41). The same variables used in model 1 (undergraduate students) were also used to predict differences in graduate GPA. Overall, this model was statistically significant (F(7, 271) = 3.61, p < .01) and explained 9% of differences in overall graduate GPA (R² = .09). Both Pell eligibility [β = -.21, p < .01] and race/ethnicity [β = -.19, p < .01] were statistically significant predictors in the model. Students who were Pell-eligible had an overall graduate GPA (M = 3.53, SD = 0.58) approximately 0.25 points lower than those who were not Pell-eligible (M = 3.76, SD = 0.36). White students also had a lower overall graduate GPA (M = 3.62, SD = 0.48) compared to students of color (M = 3.80, SD = 0.34). Each of these variables alone explained about half of the overall model effect size (r²s = .47).

Overall, participants shared similar beliefs about the demands of online learning, and very few differences in academic performance were identified. While previous online experience appeared to have a positive influence on degree completion, final GPA differences were noted only at the graduate level and only for Pell eligibility and race/ethnicity.

Discussion

This longitudinal, mixed methods descriptive study explored student perceptions and expectations compared to actual experiences. Several findings aligned well with existing literature, such as participants believing online education offers fewer interactions with peers and instructors while requiring the same or more effort for academic performance (Martin & Bolliger, 2018; Platt et al., 2014; Tichavsky et al., 2015), and familiar perceptions about the convenience of online learning (Magda & Aslanian, 2018). Findings also supported prior reports of no gender differences in GPA (Layne et al., 2013); a positive relationship between previous online learning experience and degree completion (Boghikian-Whitby & Mortagy, 2008); and a negative relationship between low-income status and GPA (Mead et al., 2020).

There were also departures from previous studies. First, this study contradicted existing conclusions that students of color are more likely to express apprehension about online courses (Ashong & Commander, 2012; Okwumabua et al., 2011). Second, findings indicated no significant relationships between orientation experiences and academic performance, as identified by Stoebe (2020), or between age and GPA, as reported by Boghikian-Whitby and Mortagy (2008). Third, unlike Magda and Aslanian (2018), this study identified undergraduates, students in blended programs, and those with first-generation status as being more likely to live near the physical campus. Finally, the findings demonstrated graduate students of color performed better academically than their White peers, which contradicts Athens (2018).

A few new contributions emerged from this study regarding expectations not matching experiences. First, while participants expressed common apprehensions about the lack of instructor presence (Heyman, 2010), some adopted a passive learning role and expected the instructor to initiate engagement. This indicates misunderstandings regarding the importance of students participating in the learning environment. Second, some blended participants shared frustration about having to take a fully online course. Ferrer et al. (2020) suggested that student attitude matters for success in online courses, so forcing unwilling students into a specific instructional modality could be detrimental to academic success and retention. Third, while many participants anticipated utilizing online campus services such as academic advising, only a fraction of participants sought these opportunities while enrolled. Those who did use online services did so primarily during the first fall term or as a self-service option. Finally, students in blended programs, who tended to hold undergraduate status and have no previous online experience, expected more synchronous activities, but those opportunities ultimately were not present. Comparatively, fully online students, who tended to have graduate classification and previous online experience, did not expect or want many synchronous activities yet reported more synchronous activities than blended students. And when blended and fully online students took online courses together, the differing expectations created engagement challenges. These examples demonstrate a disconnect between what institutions offer and what students expect and experience.

Institutional leaders can adopt several strategies to manage the mismatch between expectations and experiences. First, orientation should be mandatory for any student enrolled in online courses (Jones, 2013; Stoebe, 2020). Leaders could require completion of a mandatory, self-paced orientation placed within the institution’s learning management system (LMS) which would introduce students to the LMS and create an opportunity to share information such as differences between online and FTF learning, typical online course activities and how to use the LMS for these activities, and research regarding best online student practices. Optional synchronous and live activities could then be offered by departments and programs to better connect students to specific academic information.

Second, leaders could encourage more student use of online services by creating self-paced resources. Campus offices could create short videos or downloadable documents for students to access on-demand. Embedding these resources in the LMS could increase awareness of existing offices and provide immediate guidance for common challenges. In addition, instructors who teach online students in their first or second academic term could require use of online services as part of a class assignment to encourage familiarity and continued use.

Third, institutional leaders should clearly communicate instructional delivery formats when recruiting new students. For example, some students may wish to avoid blended programs that require some fully online courses. Ensuring prospective students are aware of instructional formats before enrolling allows programs to recruit students who understand expectations and can make informed decisions related to self-management.

Finally, instructor training should include an overview of what techniques may be appropriate for specific student needs. For instance, instructors in blended programs should communicate expectations about group meetings so no members are excluded. Instructors could also emphasize proactive learning behaviors for students new to online learning so they can implement good practices.

Limitations

Several limitations should be noted for this study. First, it is possible that the effects of the COVID-19 pandemic could have influenced academic performance and enrollment choices during the last year of the project. Second, the pandemic prompted participants enrolled in blended programs to move to fully online programs for the last year of the project. These changes may have impacted participants in unknown ways. Finally, while academic data were obtained from the institution, survey data relied on self-report information and response rates varied each term. This may have resulted in response bias and an incomplete understanding of experiences in online courses. All of these issues should be considered before findings are applied in other contexts.

Conclusion and Future Research

This study provides better understanding of who online students are, what expectations they bring with them to online learning, and how they engage with their institutions. The timing of data collection prior to COVID-19 allows readers to understand online student experiences without impact from the pandemic. As online education settles into a post-pandemic environment, these findings can help institutions increase retention of students who may move regularly between online and FTF learning (Garrett et al., 2022). Future research could focus more on students in blended programs. Understanding how they build learning capacity, engage with the campus community, and experience online instruction could inform how to best serve this growing population. In addition, future researchers might consider how student expectations and experiences have changed since the start of COVID-19, and how the increase in online learning has impacted the way students interface with their institutions.

The authors declare no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

References

Ashong, C. Y., & Commander, N. E. (2012). Ethnicity, gender, and perceptions of online learning in higher education. Journal of Online Learning and Teaching, 8(2), 98-110.

Aslanian, C. B., & Clinefelter, D. L. (2012). Online college students 2012: Comprehensive data on demands and preferences. The Learning House, Inc.

Athens, W. (2018). Perceptions of the persistent: Engagement and learning community in underrepresented populations. Online Learning, 22(2), 27-58. https://doi.org/10.24059/olj.v22i2.1368

Bawa, P. (2016). Retention in online courses: Exploring issues and solutions - A literature review. Sage Open, 6(1). https://doi.org/10.1177/2158244015621777

Bell, P. D. (2007). Predictors of college student achievement in undergraduate asynchronous web-based courses. Education, 127(4), 523-534.

Boghikian-Whitby, S., & Mortagy, Y. (2008). The effect of student background in e-learning – longitudinal study. Issues in Informing Science and Information Technology, 5, 107-126.

Boston, W., Ice, P., & Burgess, M. (2012). Assessing student retention in online learning environments: A longitudinal study. Online Journal of Distance Learning Administration, 15(11), 1-6.

Charbonneau-Gowdy, P. (2018). Beyond stalemate: Seeking solutions to challenges in online and blended learning programs. The Electronic Journal of e-Learning, 16(1), 56-66.

Cho, M. H., & Shen, D. (2013). Self-regulation in online learning. Distance Education, 34(3), 290-301. https://doi.org/10.1080/01587919.2013.835770

Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2003). Applied multiple regression/correlation analysis for the behavioral sciences (3rd ed.). Lawrence Erlbaum Associates.

Creswell, J. W., & Poth, C. N. (2018). Qualitative inquiry and research design (4th ed.). Sage.

Ferrer, J., Ringer, A., Saville, K., Parris, M. A., & Kashi, K. (2020). Students’ motivation and engagement in higher education: The importance of attitude to online learning. Higher Education, 83, 317-338. https://doi.org/10.1007/s10734-020-00657-5

Fish, L. A., & Snodgrass, C. R. (2021). Online student perceptions of effective course activities: A longitudinal study. Business Education Innovation Journal, 13(1), 68-76.

Ganesh, G., Paswan, A., & Sun, Q. (2015). Are face-to-face classes more effective than online classes? An empirical examination. Marketing Education Review, 25(2), 67-81.

Garrett, R., Simunich, B., Legon, R., & Fredericksen, E. E. (2022). CHLOE 7: Tracking online learning from mainstream acceptance to universal adoption. The Changing Landscape of Online Education. https://encoura.org/project/chloe-7/

Heyman, E. (2010). Overcoming student retention issues in higher education online programs. Online Journal of Distance Learning Administration, 8(4), 1-12.

Jones, K. (2013). Developing and implementing a mandatory online student orientation. Online Learning Journal, 17(1), 43-45.

Layne, M., Boston, W. E., & Ice, P. (2013). A longitudinal study of online learners: Shoppers, swirlers, stoppers, and succeeders as a function of demographic characteristics. Online Journal of Distance Learning Administration, 16(2), 1-12.

Lemercier, C., & Zalc, C. (2019). Quantitative methods in the humanities: An introduction. University of Virginia Press.

Magda, A. J., & Aslanian, C. B. (2018). Online college students 2018: Comprehensive data on demands and preferences. The Learning House, Inc. https://distance-educator.com/online-college-students-2018-comprehensive-data-on-demands-and-preferences/

Magda, A. J., & Smalec, J. S. (2020). Student perspectives on online programs: A survey of learners supported by Wiley Education Services. Wiley edu LLC.

Martin, F., & Bolliger, D. U. (2018). Engagement matters: Student perceptions on the importance of engagement strategies in the online learning environment. Online Learning, 22(1), 205-222. https://doi.org/10.24059/olj.v...

Mead, C., Supriya, K., Zheng, Y., Anbar, A. D., Collins, J. P., LePore, P., & Brownell, S. E. (2020). Online biology degree program broadens access for women, first-generation to college, and low-income students, but grade disparities remain. PLoS ONE, 15(12). https://doi.org/10.1371/journal.pone.0243916

Moore, M. G. (1989). Three types of interaction. American Journal of Distance Education, 3(2), 1–6.

Newell, C. C. (2007). Learner characteristics as predictors of online course completion among nontraditional technical college students (Unpublished doctoral dissertation). University of Georgia, Athens, GA.

Okwumabua, T. M., Walker, K. M., Hu, X., & Watson, A. (2011). An exploration of African American students’ attitudes toward online learning. Urban Education, 46(2), 241-250. https://doi.org/10.1177/0042085910377516

Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. The Internet and Higher Education, 32, 47-57.

Pate, M. L., & Miller, G. (2012). A longitudinal study of learner characteristics and experiences with a distance Master of Agriculture degree program. NACTA Journal, 56(1), 28-34.

Pedhazur, E. J. (1997). Multiple regression in behavioral research: Explanation and prediction (3rd ed.). Wadsworth.

Platt, C. A., Raile, A. N. W., & Yu, N. (2014). Virtually the same?: Student perceptions of the equivalence of online classes to face-to-face classes. MERLOT Journal of Online Learning and Teaching, 10(3), 489-503.

Shen, D., Cho, M. H., Tsai, C. L., & Marra, R. (2013). Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. The Internet and Higher Education, 19, 10-17.

Stoebe, A. (2020). The effect of new student orientations on the retention of online students. Online Journal of Distance Learning Administration, 23(2).

Tichavsky, L. P., Hunt, A. N., Driscoll, A., & Jicha, K. (2015). “It’s just nice having a real teacher”: Student perceptions of online versus face-to-face instruction. International Journal for the Scholarship of Teaching and Learning, 9(2), 1-8. http://digitalcommons.georgiasouthern.edu/ij-sotl/vol9/iss2/2

Wells, C. N., Pass, M. B., & Walsh, J. E. (2022). Face-to-face vs. online asynchronous teaching in a conservation biology course. Online Learning, 26(2), 241-253. https://doi.org/10.24059/olj.v26i2.2775

Xu, D., & Jaggars, S. (2013). Adaptability to online learning: Differences across types of students and academic subject areas (CCRC Working Paper No. 54). Columbia University, Teachers College, Community College Research Center. http://ccrc.tc.columbia.edu/publications/adaptability-to-online-learning.html

Yukselturk, E., & Bulut, S. (2007). Predictors for student success in an online course. Journal of Educational Technology & Society, 10(2), 71-83.

Ziglari, L. (2017). Interpreting multiple regression results: β weights and structure coefficients. General Linear Model Journal, 43(2), 13-22. http://www.glmj.org/archives/articles/Ziglari_v43n2.pdf