This breakout study reviews the findings of a 2017 study of Penn State University’s World Campus undergraduate online students. The study surveyed students on their demographics, academic history, preferences, and satisfaction, and sought to relate these variables to levels of academic success. This breakout focuses on the findings related to three of the study’s variables: academic advising frequency, interaction preference, and satisfaction of undergraduate online students.
This study is a breakout report of a survey of undergraduate students who pursue higher education online and were enrolled at a large public research university recognized for providing superior online education, Penn State University World Campus. The survey asked students to self-report demographic information, academic and employment history, and learning preferences. The study ultimately sought to profile students by their self-identified inputs and preferred environmental factors associated with success and non-success. Success in this study is defined as a grade point average (GPA) of 3.0 or above; non-success is defined as a GPA below 2.0. This breakout reviews the relationships among students’ demographics, satisfaction, and preferences on the topic of academic advising.
In her study of Pennsylvania State University World Campus undergraduate students, Kuhn (2017) found that there is a larger and more important story to be told beyond the initial research questions. The basic descriptive analysis of the sample provided a broader view of today’s online student population. From these findings, a profile of today’s online student, based on the most common responses, can be developed across demographics, preferences, and outcomes. Today’s online student is female, between 25 and 39 years of age, an American student, white, non-military, not eligible for ADA accommodations, married with zero dependents, and lives with a partner, family, and/or children. They have a GPA of 3.0 or higher and take at least 12 credits per semester while working full-time, at least 30 hours per week, in a field other than the one they are studying. They have a household income of $50,000 to $74,999 per year and use a combination of funding sources (e.g., federal loans, private loans, grants) to pay for school. They also have previous college experience, with a former GPA of at least 3.0, and typically completed fewer than 30 credits at former institutions. They pursue education completely online and prefer to do so, with zero credits completed residentially. The online student does not utilize tutoring or advisors, does not engage with faculty, and prefers a lower level of interaction with both students and faculty. Students most prefer individual work, moving at their own pace, and deadlines; they least prefer group work, presentations, and papers as assignments.
Even though the data provided the ability to construct a profile from the largest groupings of respondents, it is also worth noting the diversity within each variable of the study. Students span a wide range of characteristics, including age, region, marital status, hours worked per week, annual household income, financial aid status, credits completed at former colleges, number of credits completed online, average credits taken per semester, and satisfaction with online learning. The wide range of responses in these categories provides for a diverse sample. However, even across these differences, a sizable majority of online undergraduate students share similar preferences regarding interaction with tutors, advisors, faculty, and other students, and regarding assignments. These findings show that there is not just one type of student pursuing online education at any one institution, and that many of their successes cannot be predicted by their inputs or environment.
Challenges for Online Advising.
Kuh, Kinzie, Schuh, and associates (2005) said “advising is viewed as a way to connect students to the campus and help them feel that someone is looking out for them” (p. 214). However, can a connection and relationship form between an advisor and a student at a distance? Varney (2009) noted that “the primary challenge when advising from a distance is connecting with the student in such a way that he or she identifies the advisor as the person within the institution who cares.” Institutions must therefore create alternative means of communication to help students feel connected to the institution and advisor. Varney (2009) states that “the goal of a distance advising program should be to replicate the intimacy of a face-to-face advisor-student relationship from a distance.” Distance advisors must consequently consider the geographical barrier and determine alternative ways to build rapport.
Another challenge of distance advising is that students must learn new software required for online learning, and students have reported difficulties navigating these platforms (Clay, Rowland, & Packard, 2008). Overall computer literacy and navigation skills are imperative for both online learning and online advising.
Best Practices for Online Advising.
Nutt (2003) stated that academic advisors play a critical role in supporting and engaging students. Students rely on advisors for academic information, assistance in navigating the university, communication and understanding of policies and procedures, and problem solving (Smith & Allen, 2014). Academic advising should not be restricted to campus-based students. Varney (2009) states that “it is an advisor’s job to determine what the student’s needs are and provide support and direction accordingly. In the student’s eyes, the advisor is the resource, while the links and catalogs are the tools.” Institutions can use a variety of procedures and resources to create a supportive advising experience for web-based students, including effective communication, online student orientation, and an early warning system.
Electronic channels such as websites, intranets, and social media can be used to communicate. However, these resources cannot simply replace the advising relationship between the student and academic advisor (Gaines, 2014). Email, phone, and video advising are common practices for distance advisors.
When advising via email, Ohrablo (2016) suggests that advisors respond to emails in the same manner they would in in-person advising. For example, an advisor can replicate in-person advising by following up with an additional question such as “Do you know what classes you will be taking next semester?” or “Do you know where to find the schedule of classes?”
Phone advising is another alternative for distance advisors. It is a close substitute for face-to-face advising because an advisor can follow their typical advising agenda, explaining program requirements, institutional policies, and campus resources offered online, such as library services. However, because visual cues are absent, it is important that advisors listen for vocal inflection to detect any issues.
Lastly, advising via videoconference most closely replicates in-person advising. Various platforms even allow advisors to share their screen with students to assist with degree planning. Ohrablo (2016) suggests that videoconferencing, unlike phone calls, can eliminate distractions, since students may be driving or working while speaking on the telephone.
Campus-based students often attend an on-campus orientation to aid their transition to the academic and social aspects of higher education. Orientation experiences should not be unique to campus-based students. One “way to build meaningful connections with students early in their academic careers is to offer a comprehensive orientation that includes an explanation of systems, technology, planning sheets, and anything else students need to know in order to be successful” (Varney, 2009). Orientation for online students needs to be user-friendly and allow students to access the information whenever their schedules allow, for example via podcasts or videos. Additionally, orientation should cover information such as the nature of online courses, how to log on to the course, proctored testing, how to get help, checking university email, and directions for tuition payment. Wojciechowski and Palmer (2005) found a strong connection between attendance at an optional orientation session and success in the online classroom, which led them to recommend making the optional orientation mandatory. Given challenges in technological literacy, online orientation would best be held via the university’s online classroom platform, allowing students to become familiar with the platform before the first day of class.
Early Warning System.
Another strategy for effective distance advising is tracking at-risk online students. A system that tracks student performance during coursework and communicates low performance to advisors allows advisors to connect with students before failure. Varney (2009) suggests that early communication can increase student persistence and improve performance. Additionally, faculty can be involved in this communication, which helps provide support, particularly if tutoring services are not available.
Academic Advising Structure at Pennsylvania State University World Campus
Pennsylvania State University World Campus students are not required to meet with their advisor each semester; however, it is strongly encouraged. Advisors discuss upcoming course scheduling, the academic calendar and deadlines, and academic resources, and they send monthly email reminders of important dates. Students on academic warning or probation have a hold placed on their account and are required to meet with an advisor to discuss techniques for success. Advisors are available via email, telephone, and videoconferencing, though students typically prefer email and telephone. Caseloads are approximately 250 students per advisor and are determined by students’ majors.
Conceptual Framework – Astin’s I-E-O Model

The I-E-O model was developed by Alexander Astin as a conceptual guide for studying college student development. The model is built on the elements of inputs (I), environment (E), and outcomes (O) and the interactions among the three. Inputs are defined as the characteristics of the student at the time of initial entry; environment is defined as the various programs, policies, faculty, peers, and educational experiences to which the student is exposed; and outcomes refer to the student’s characteristics after exposure to the environment (Astin, 1993). Astin (1993) states that the “purpose of the model is to assess the impact of various environmental experiences by determining whether students grow or change differently under varying environmental conditions.” Astin and Antonio (2012) consider the environmental information the most critical in the model, as the environment includes those things the educator and institution directly control to develop student outcomes.
Figure 1. Astin’s I-E-O Model
The variables can be used in different ways, but Astin and Antonio (2012) treat outcomes as dependent variables, environments and inputs as independent variables, and inputs also as control variables. The arrows in Figure 1 represent the relationships among the variables, with relationship B (environment and outcomes) the most important for assessment and evaluation of education (Astin & Antonio, 2012). However, the relationship between environments and outcomes cannot be explained without considering student inputs, which can be related to both outcomes (relationship C) and environments (relationship A). Because inputs can be related to both, they can affect the observed relationship between environments and outcomes. This design allows educators to measure the input characteristics of each student and then adjust for input differences to obtain a less biased estimate of the comparative effects of different environments on outcomes (Astin & Antonio, 2012). Combining the three types of variables is necessary because input and outcome data are of limited value if it is not known what occurs during students’ learning over the same period of time. The environmental factors chosen for the model should be those that can be changed or controlled, because only then can those factors be improved to influence more desirable outcomes for inputs as a whole or in groups.
The current literature on online education and online students is largely limited to demographics, the history of online education, and the various pedagogies used throughout that history. However, none of these studies connects demographics to the pedagogies recommended for online education, which is why this study seeks to uncover the preferences and practices of students in the online environment.
The larger study included the following five research questions:
- What input characteristics are highly correlated with online students’ educational environments for a desired outcome of 3.0 GPA or higher?
- What input characteristics are highly correlated with online students’ educational environments for an outcome of 2.1-2.9 GPA?
- What input characteristics are highly correlated with online students’ educational environments for a negative outcome of a 2.0 GPA or lower?
- How similar or different are online students based on environmental preferences regarding their input profile and levels of success?
- To what extent is online students’ satisfaction with online courses highly correlated with their achievement?
This breakout study focuses on the results of these questions as they pertain to the environmental variable of advisor engagement. Advisor engagement is defined as the student’s identification of utilizing an advisor provided by the institution, how many times they have utilized their advisor, and whether they prefer to use the advisor more or less.
Data were collected directly from students anonymously via the online survey tool Qualtrics. The survey was distributed through faculty prompting in Spring 2017 online courses. Instructors listed as teaching Spring 2017 courses in all baccalaureate degree programs were contacted via their official university email and asked to distribute the survey to their students. The importance of this kind of research and the innovative nature of the study were stressed to both faculty and students to encourage participation. Faculty members could choose whether to propose the study to their students, and a sample message was provided for faculty to use when communicating about the study.
Using the University Catalog, 587 unique instructors of record for online Spring 2017 undergraduate courses were identified and contacted. The researcher received 46 positive responses indicating support of the study through distribution to their courses. Because the study did not require a response from faculty members, it is possible that more than 46 instructors distributed the survey to their students. The number of students in each course was not publicly available, so the proportion of eligible students who participated is unknown.
Participants provided consent at the opening of the survey. If they declined consent, they were sent to a thank you page of the survey, skipping all questions. If they did not select yes or no for consent, it was assumed they consented when providing information.
Participants experienced no coercion or undue influence in the consent process and could opt out or decline consent at any time. Participants who withdrew from the study had all of their contributed data excluded, and there was no follow-up with withdrawn participants. The survey included 36 questions and was estimated to take no more than ten minutes to complete.
Eligible participants were defined as undergraduate students over the age of 18 enrolled in undergraduate-level online courses during the Spring 2017 semester. Those who did not meet the inclusion criteria were excluded from the study. A total of 1,058 responses were recorded during the first two weeks of classes, January 9 to January 21, 2017. Of these, 339 were partial responses and were removed from the sample. Eight respondents declined consent and were removed. Three respondents indicated a birth year of 1999, which raised the possibility that they were under 18 at the time of the survey since no birth month was collected; these were also eliminated. Respondents who indicated they were in a master’s degree program (n = 20) were likewise eliminated, for a final sample of 688.
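The sample-reduction steps above can be checked arithmetically; the sketch below simply reproduces the reported exclusion counts (variable names are illustrative, not taken from the study's analysis):

```python
# Reproduces the sample-reduction arithmetic reported in the study.
total_recorded = 1058
partial = 339        # incomplete responses removed
no_consent = 8       # respondents who declined consent
possibly_minor = 3   # birth year 1999; possibly under 18
masters = 20         # master's-degree respondents

final_sample = total_recorded - partial - no_consent - possibly_minor - masters
print(final_sample)  # 688, matching the reported final sample
```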
Data were analyzed using basic statistical techniques such as mean, median, and range, and more advanced techniques such as correlation. Statistical analysis was completed in Microsoft Excel and Minitab. The analysis sought relationships at several levels: between inputs and environments, inputs and outcomes, environments and outcomes, and among inputs, environments, and outcomes.
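As a minimal sketch of the correlation technique, the Pearson product-moment coefficient is shown below. This is a standard choice; the study does not specify which correlation statistic Excel and Minitab computed, so the formulation is an assumption.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance numerator and the two standard-deviation terms.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A coefficient near +1 or -1 indicates a strong linear relationship; values near 0 are "very weak," the language used for several of the input-environment relationships reported below.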
Frequency of Advising.
Table 1. Frequency of Advising
(n = 616)
The frequency with which students utilize advising is significantly and positively correlated with four input variables. Frequency of advising is very weakly but positively correlated with hours worked per week and with marital status. Students’ age range and eligibility for ADA support are also weakly but positively correlated with the frequency with which students utilize advising.
Table 2. Correlation: Inputs-Environment, GPA of 3.0 or higher
Table 3. Correlation: Inputs-Environment, GPA of lower than 2.0
The frequency with which a student uses advisors has a strong, positive relationship with race for those with a GPA below 2.0.
Table 4. Characteristics between GPA: Environments
The preference and satisfaction of students were measured using a Net Promoter Score (NPS) approach on a scale of 0 to 10. Respondents identified their preference to interact with advisors more or less on that scale, with 0 being much less, 5 being neither more nor less, and 10 being much more. Responses were grouped using NPS conventions, where scores of 0 to 6 qualify as detractors, 7 to 8 as passives, and 9 or 10 as promoters. NPS is a measure of customer experience, and the groupings of promoter, passive, and detractor indicate the following (Net Promoter Network, n.d.):
- Promoters are loyal enthusiasts,
- Passives are satisfied but unenthusiastic, and
- Detractors are unhappy customers.
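The grouping and scoring convention above can be sketched as follows (function names are illustrative, not from the study):

```python
def nps_group(score: int) -> str:
    """Classify a 0-10 rating into the NPS categories described above."""
    if score >= 9:
        return "promoter"   # 9-10: loyal enthusiasts
    if score >= 7:
        return "passive"    # 7-8: satisfied but unenthusiastic
    return "detractor"      # 0-6: unhappy customers

def net_promoter_score(ratings) -> int:
    """NPS = % promoters minus % detractors, conventionally on a -100..100 scale."""
    groups = [nps_group(r) for r in ratings]
    n = len(groups)
    promoters = groups.count("promoter") / n
    detractors = groups.count("detractor") / n
    return round(100 * (promoters - detractors))
```

For example, a sample of ratings [10, 9, 7, 5, 0] contains 40% promoters and 40% detractors, giving an NPS of 0.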
Students were assigned an academic advisor at the institution surveyed, yet a large majority of respondents reported interacting with their assigned advisor either never or for less than one hour in total. A little more than one-half of respondents scored as detractors on the question of wanting to interact with advisors more or less (on a scale of 0 to 10, with 0 being much less, 5 being neither more nor less, and 10 being much more). One-quarter of respondents were passive regarding advisor interaction. The average score for interacting with advisors was 6.40, with a median of 6 and a mode of 5. When testing for the difference of proportions, the difference between detractors and promoters was significant, with a Z greater than 1.96.
Table 5. Advisor Interaction Preference NPS
(n = 667)
When asked about satisfaction with advising, one-half of respondents were grouped as detractors, while a quarter were grouped as passives. When testing for the difference of proportions, the difference between detractors and promoters was significant, with a Z greater than 1.96. The average score was 6.17, with a mode of 5 and a median of 6.
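The difference-of-proportions test referenced above is consistent with a standard pooled two-proportion z-statistic; the sketch below assumes that formulation, since the study does not specify the exact formula, and the counts in the usage note are illustrative rather than the study's actual figures.

```python
import math

def two_proportion_z(count1: int, n1: int, count2: int, n2: int) -> float:
    """Pooled z-statistic for the difference between two sample proportions."""
    p1, p2 = count1 / n1, count2 / n2
    # Pooled proportion under the null hypothesis of equal proportions.
    p_pool = (count1 + count2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se
```

With n = 658 respondents, comparing a detractor group of roughly half the sample against a promoter group of roughly a quarter yields |Z| well above the 1.96 threshold for significance at the .05 level.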
Table 6. Advisor Satisfaction NPS
(n = 658)
The outcome of overall satisfaction with online learning was significantly correlated with the frequency of advising at a very weak, positive level.
Table 7. Correlation: Environment-Outcomes, GPA of 3.0 or higher
According to the results, a majority of respondents (78.73%) have never engaged with an advisor or have engaged with their advisor for a total of less than one hour. Distance advisors are challenged with connecting students to the university, and these students evidently do not feel the need to connect with their advisor. However, advisors provide information integral to timely graduation: guidance specific to the major and degree completion, university policies, and resources. If an institution is investing in advising and believes this service is a priority for student development and achievement, it should evaluate its advising services, practices, and programming. Requiring students to meet with an advisor for a registration PIN or hold removal could begin to build a culture of advisor-student relationships. As discussed previously, availability and usability considerations for meetings include the weekly schedule, the knowledge offered, and the delivery mode of advising (email, phone call, videoconferencing).
Marketing of Student Services: Advising.
Overall, institutions should increase their marketing of advising services to all students in the hope that it will lead to increased utilization. However, if online students are largely successful without utilizing these services, to what extent is it wise for institutions to invest funding in them? Alternatively, the marketing message could be modified to entice more students to utilize such services, in the hope that advising may increase satisfaction and academic success even further.
Ease of Appointment.
Institutions that are invested in advising should consider tools that make scheduling an advising appointment easier. For example, institutions should invest in software that lets students easily view available appointment times and schedule an appointment. Additionally, multiple modes such as telephone, web chat, and videoconferencing should be available to students depending on their preference and accessibility. In addition to providing scheduling software, institutions’ websites should include information on how to use it; this training can include screenshots or videos explaining the software for students who view technology as a challenge. Lastly, since the online student population can include students in various time zones as well as students employed full- or part-time, institutions should consider advisors who work beyond the traditional 9 a.m. to 5 p.m. workday. Extending hours into the evening will allow distance learners and advisors to connect more often.
A majority of the limitations reside in the reliability of college students’ self-reported gains and perceptions. Pike (n.d.) found that the accuracy and appropriateness of self-report data raise questions of reliability, making it difficult to correlate such data with more reliable measures such as GPA. This stems from the nature of self-reported data and the concern for social desirability bias, in which participants respond as they believe the study would like them to or under social pressure (West, 2014). There is also concern for reference bias, which refers to participants having different standards of comparison for their responses (West, 2014).
All data collected were self-reported and were not validated against institutional or course records. Data could therefore be skewed by students’ reluctance to accurately report variables that could be perceived as negative. The assurance of anonymity is essential to obtaining the most reliable and accurate responses.
A major limitation of the study is the sample size of students below a 2.0 GPA (n = 10). This group represents 2.21% of the total sample, whereas those with a 3.0 or higher represent 78.86%. When evaluating comparisons between the two groups addressed in the research questions, it is therefore important to acknowledge the difference in group sizes. This limitation is also addressed in the ‘Future Research’ section of the conclusion.
Mixed-Mode Study of Student Preferences.
A study that includes both quantitative and qualitative data would provide a wider view of students’ preferences, choices, satisfaction, and demographics. The ability to provide the “why” behind questions such as “How often do you engage with an advisor?” or “At what level do you prefer interaction with other students?” may moderate the possibility of incorrect assumptions drawn from the findings. A student may prefer individual work, papers, or moving at their own pace not because they work long hours or have family responsibilities, but simply because they are not good at working in groups or on creative projects. Such a student may interact with advisors only when there is no alternative (e.g., obtaining a PIN for registration, completing graduation forms, or appealing a grade). The study could follow a method similar to this dissertation’s, beginning with a quantitative study and then following up with either focus groups or individual interviews.
A mixed-mode study could also examine motivation, which was not examined within this study. Findings about motivation could add context to why students prefer certain assignments and why they may or may not utilize student services such as advising or tutoring. A research question going forward may be: Are motivations for pursuing undergraduate education online correlated with the practices and preferences of online students?
Study of Institutional Reported Data.
A limitation of this study was the amount of self-reported data provided on a voluntary basis (Pike, n.d.; West, 2014). To increase the validity of these data, it would be advantageous to have some data, such as GPA, gender, age, financial aid status, and advising and tutoring frequencies, provided directly by the institution. Follow-up interviews or focus groups with groups identified from the institutional data could provide information such as satisfaction, preferences, and additional demographics not collected by the institution.
As online education continues to grow, particularly at the undergraduate level, the industry needs to be flexible, accommodating, and innovative in the way it serves its students. Students no longer prefer residential coursework in which they engage with university faculty and staff on a regular basis; they are looking for a concierge. The student wants to engage when, and only when, they need to. The opportunity for a relationship between institutional staff or faculty and the online student is limited by the student’s desire to interact, which this study found to be minimal. This leaves institutions, researchers, and practitioners to ask, “How do we serve online students?”
This study originally sought to find similarities between demographics and preferences, to group students, and to design practices around those groupings. Even though the study may have failed to create groups based on demographics and preferences, it successfully created a foundation for examining, as a whole, how to better serve and satisfy online students, and it contradicted many assumptions about online learners. As only a first step in evaluating student success in online education, this study provides baseline data on demographics and preferences that raise many questions and challenge traditional thinking.
Astin, A., & Antonio, A. (2012). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education (2nd ed.). Lanham, MD: Rowman & Littlefield.
Astin, A. (1993). What matters in college? San Francisco, CA: Jossey-Bass.
Clay, M. N., Rowland, S., & Packard, A. (2008). Improving undergraduate online retention through gated advisement and redundant communication. Journal of College Student Retention, 10(1), 93-102.
Gaines, T. (2014). Technology and academic advising: Student usage and preferences. NACADA Journal, 34(1), 43-49.
Herzog, S., & Bowman, N. A. (2011). Validity and limitations of college student self-report data (New Directions for Institutional Research, No. 150). San Francisco, CA: Jossey-Bass.
Kuh, G. D., Kinzie, J., Schuh, J. H., & associates. (2005). Student success in college: Creating conditions that matter. San Francisco, CA: Jossey-Bass.
Kuhn, M. (2017). Understanding online student success through the use of Astin's I-E-O model. Retrieved from https://etda.libraries.psu.edu/catalog/13978mlm629
Net Promoter Network. (n.d.). What is Net Promoter? Retrieved February 25, 2017, from https://www.netpromoter.com/know/
Nutt, C. L. (2003). Academic advising and student retention and persistence. Retrieved from the NACADA Clearinghouse of Academic Advising Resources website: http://www.nacada.ksu.edu/tabi...
Ohrablo, S. (2016). Advising online students: Replicating best practices of face-to-face advising. Retrieved from the NACADA Clearinghouse of Academic Advising Resources website.
Smith, C. L., & Allen, J. M. (2014). Does contact with advisors predict judgments and attitudes consistent with student success? A multi-institutional study. NACADA Journal, 34(1), 50-63.
Varney, J. (2009). Strategies for success in distance advising. Retrieved from the NACADA Clearinghouse of Academic Advising Resources website.
West, M. (2014). The limitations of self-report measures of non-cognitive skills (Brown Center Chalkboard Series, No. 92). Retrieved from http://www.brookings.edu/research/papers/2014/12/18-chalkboard-non-cognitive-west
Wojciechowski, A., & Palmer, L. B. (2005). Individual student characteristics: Can any be predictors of success in online classes? Online Journal of Distance Learning Administration, 8(2). Retrieved July 18, 2018.